Oct 09 07:46:08 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 09 07:46:08 crc restorecon[4675]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 09 07:46:08 crc restorecon[4675]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc 
restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 07:46:08 crc 
restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 
07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 07:46:08 crc 
restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 07:46:08 crc 
restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 
crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc 
restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 
09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 
crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc 
restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:08 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:08 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 09 07:46:09 crc restorecon[4675]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 07:46:09 crc restorecon[4675]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 07:46:09 crc 
restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 07:46:09 crc restorecon[4675]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 07:46:09 crc restorecon[4675]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 09 07:46:09 crc restorecon[4675]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 09 07:46:09 crc kubenswrapper[4715]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 09 07:46:09 crc kubenswrapper[4715]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 09 07:46:09 crc kubenswrapper[4715]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 09 07:46:09 crc kubenswrapper[4715]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 09 07:46:09 crc kubenswrapper[4715]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 09 07:46:09 crc kubenswrapper[4715]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.834157 4715 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839559 4715 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839579 4715 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839584 4715 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839588 4715 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839593 4715 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839597 4715 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839601 4715 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839605 4715 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839610 4715 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839614 
4715 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839618 4715 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839623 4715 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839627 4715 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839632 4715 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839637 4715 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839642 4715 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839647 4715 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839652 4715 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839658 4715 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839665 4715 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839672 4715 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839680 4715 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839687 4715 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839692 4715 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 09 
07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839697 4715 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839702 4715 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839706 4715 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839710 4715 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839714 4715 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839717 4715 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839721 4715 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839725 4715 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839729 4715 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839734 4715 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839737 4715 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839742 4715 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839748 4715 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839752 4715 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839757 4715 feature_gate.go:330] unrecognized feature gate: Example Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839761 4715 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839765 4715 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839769 4715 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839773 4715 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839777 4715 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839781 4715 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839786 4715 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839790 4715 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839793 4715 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839798 4715 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839801 4715 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839806 4715 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839811 4715 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839815 4715 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839820 4715 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839825 4715 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839830 4715 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839835 4715 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839839 4715 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839842 4715 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839846 4715 
feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839850 4715 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839853 4715 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839859 4715 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839864 4715 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839868 4715 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839871 4715 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839874 4715 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839879 4715 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839884 4715 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839888 4715 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.839891 4715 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840763 4715 flags.go:64] FLAG: --address="0.0.0.0" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840777 4715 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840784 4715 flags.go:64] FLAG: --anonymous-auth="true" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840790 4715 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840796 4715 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840800 4715 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840806 4715 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840811 4715 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840815 4715 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840819 4715 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840824 4715 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840828 4715 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840832 4715 flags.go:64] 
FLAG: --cgroup-driver="cgroupfs" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840836 4715 flags.go:64] FLAG: --cgroup-root="" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840843 4715 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840847 4715 flags.go:64] FLAG: --client-ca-file="" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840851 4715 flags.go:64] FLAG: --cloud-config="" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840855 4715 flags.go:64] FLAG: --cloud-provider="" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840859 4715 flags.go:64] FLAG: --cluster-dns="[]" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840865 4715 flags.go:64] FLAG: --cluster-domain="" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840870 4715 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840874 4715 flags.go:64] FLAG: --config-dir="" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840878 4715 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840882 4715 flags.go:64] FLAG: --container-log-max-files="5" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840889 4715 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840893 4715 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840897 4715 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840901 4715 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840906 4715 flags.go:64] FLAG: --contention-profiling="false" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840910 4715 flags.go:64] FLAG: 
--cpu-cfs-quota="true" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840914 4715 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840918 4715 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840922 4715 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840927 4715 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840932 4715 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840936 4715 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840941 4715 flags.go:64] FLAG: --enable-load-reader="false" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840946 4715 flags.go:64] FLAG: --enable-server="true" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840951 4715 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840957 4715 flags.go:64] FLAG: --event-burst="100" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840962 4715 flags.go:64] FLAG: --event-qps="50" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840968 4715 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840973 4715 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840978 4715 flags.go:64] FLAG: --eviction-hard="" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840984 4715 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840989 4715 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840993 4715 flags.go:64] FLAG: 
--eviction-pressure-transition-period="5m0s" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.840998 4715 flags.go:64] FLAG: --eviction-soft="" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841002 4715 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841006 4715 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841010 4715 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841014 4715 flags.go:64] FLAG: --experimental-mounter-path="" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841018 4715 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841022 4715 flags.go:64] FLAG: --fail-swap-on="true" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841026 4715 flags.go:64] FLAG: --feature-gates="" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841032 4715 flags.go:64] FLAG: --file-check-frequency="20s" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841092 4715 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841650 4715 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841657 4715 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841662 4715 flags.go:64] FLAG: --healthz-port="10248" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841667 4715 flags.go:64] FLAG: --help="false" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841672 4715 flags.go:64] FLAG: --hostname-override="" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841677 4715 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841681 4715 flags.go:64] FLAG: 
--http-check-frequency="20s" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841686 4715 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841695 4715 flags.go:64] FLAG: --image-credential-provider-config="" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841700 4715 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841704 4715 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841709 4715 flags.go:64] FLAG: --image-service-endpoint="" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841713 4715 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841717 4715 flags.go:64] FLAG: --kube-api-burst="100" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841722 4715 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841727 4715 flags.go:64] FLAG: --kube-api-qps="50" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841732 4715 flags.go:64] FLAG: --kube-reserved="" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841741 4715 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841745 4715 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841750 4715 flags.go:64] FLAG: --kubelet-cgroups="" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841755 4715 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841761 4715 flags.go:64] FLAG: --lock-file="" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841765 4715 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841769 4715 
flags.go:64] FLAG: --log-flush-frequency="5s" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841774 4715 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841782 4715 flags.go:64] FLAG: --log-json-split-stream="false" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841792 4715 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841797 4715 flags.go:64] FLAG: --log-text-split-stream="false" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841803 4715 flags.go:64] FLAG: --logging-format="text" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841815 4715 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841821 4715 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841826 4715 flags.go:64] FLAG: --manifest-url="" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841832 4715 flags.go:64] FLAG: --manifest-url-header="" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841840 4715 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841850 4715 flags.go:64] FLAG: --max-open-files="1000000" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841858 4715 flags.go:64] FLAG: --max-pods="110" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841863 4715 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841868 4715 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841873 4715 flags.go:64] FLAG: --memory-manager-policy="None" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841878 4715 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 
07:46:09.841884 4715 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841889 4715 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841897 4715 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841910 4715 flags.go:64] FLAG: --node-status-max-images="50" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841915 4715 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841920 4715 flags.go:64] FLAG: --oom-score-adj="-999" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841926 4715 flags.go:64] FLAG: --pod-cidr="" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841932 4715 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841943 4715 flags.go:64] FLAG: --pod-manifest-path="" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841955 4715 flags.go:64] FLAG: --pod-max-pids="-1" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841961 4715 flags.go:64] FLAG: --pods-per-core="0" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841967 4715 flags.go:64] FLAG: --port="10250" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841973 4715 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841978 4715 flags.go:64] FLAG: --provider-id="" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841984 4715 flags.go:64] FLAG: --qos-reserved="" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841990 4715 flags.go:64] FLAG: --read-only-port="10255" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 
07:46:09.841995 4715 flags.go:64] FLAG: --register-node="true" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.841999 4715 flags.go:64] FLAG: --register-schedulable="true" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.842006 4715 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.842015 4715 flags.go:64] FLAG: --registry-burst="10" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.842019 4715 flags.go:64] FLAG: --registry-qps="5" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.842027 4715 flags.go:64] FLAG: --reserved-cpus="" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.842031 4715 flags.go:64] FLAG: --reserved-memory="" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.842038 4715 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.842044 4715 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.842051 4715 flags.go:64] FLAG: --rotate-certificates="false" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.842056 4715 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.842060 4715 flags.go:64] FLAG: --runonce="false" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.842064 4715 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.842070 4715 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.842075 4715 flags.go:64] FLAG: --seccomp-default="false" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.842082 4715 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.842088 4715 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 09 07:46:09 crc kubenswrapper[4715]: 
I1009 07:46:09.842099 4715 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.842105 4715 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.842110 4715 flags.go:64] FLAG: --storage-driver-password="root" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.842115 4715 flags.go:64] FLAG: --storage-driver-secure="false" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.842119 4715 flags.go:64] FLAG: --storage-driver-table="stats" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.842123 4715 flags.go:64] FLAG: --storage-driver-user="root" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.842128 4715 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.842132 4715 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.842137 4715 flags.go:64] FLAG: --system-cgroups="" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.842145 4715 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.842153 4715 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.842157 4715 flags.go:64] FLAG: --tls-cert-file="" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.842161 4715 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.842168 4715 flags.go:64] FLAG: --tls-min-version="" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.842173 4715 flags.go:64] FLAG: --tls-private-key-file="" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.842177 4715 flags.go:64] FLAG: --topology-manager-policy="none" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.842187 4715 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 09 07:46:09 crc 
kubenswrapper[4715]: I1009 07:46:09.842192 4715 flags.go:64] FLAG: --topology-manager-scope="container" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.842198 4715 flags.go:64] FLAG: --v="2" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.842207 4715 flags.go:64] FLAG: --version="false" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.842218 4715 flags.go:64] FLAG: --vmodule="" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.842227 4715 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.842233 4715 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.843840 4715 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.843892 4715 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.843903 4715 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.843912 4715 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.843921 4715 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.843931 4715 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.843939 4715 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.843947 4715 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.843955 4715 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 09 07:46:09 crc 
kubenswrapper[4715]: W1009 07:46:09.843963 4715 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.843971 4715 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.843982 4715 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.843992 4715 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844000 4715 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844008 4715 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844017 4715 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844026 4715 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844034 4715 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844041 4715 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844049 4715 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844057 4715 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844065 4715 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844073 4715 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844081 4715 
feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844088 4715 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844097 4715 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844104 4715 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844112 4715 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844120 4715 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844128 4715 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844135 4715 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844143 4715 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844184 4715 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844196 4715 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844204 4715 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844212 4715 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844220 4715 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844229 4715 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844236 4715 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844244 4715 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844252 4715 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844260 4715 feature_gate.go:330] unrecognized feature gate: Example Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844270 4715 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844282 4715 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844292 4715 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844300 4715 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844309 4715 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844317 4715 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844325 4715 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844332 4715 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844341 4715 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844348 4715 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844357 4715 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844364 4715 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844372 4715 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844380 4715 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844388 4715 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844396 4715 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844404 4715 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844412 4715 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844445 4715 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844453 4715 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844462 4715 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844470 4715 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844478 4715 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844487 4715 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844495 4715 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844503 4715 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844511 4715 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844519 4715 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.844528 4715 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.844544 4715 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.858099 4715 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.858161 4715 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858292 4715 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858314 4715 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858325 4715 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858336 4715 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858345 4715 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858355 4715 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858363 4715 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858371 4715 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858380 4715 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858388 4715 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858397 4715 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858406 4715 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858414 4715 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858444 4715 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858453 4715 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858461 4715 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858472 4715 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858486 4715 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858496 4715 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858505 4715 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858514 4715 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858526 4715 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858536 4715 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858545 4715 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858554 4715 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858562 4715 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858571 4715 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858579 4715 feature_gate.go:330] unrecognized feature gate: Example
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858587 4715 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858595 4715 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858606 4715 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858616 4715 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858626 4715 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858635 4715 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858647 4715 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858656 4715 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858665 4715 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858673 4715 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858681 4715 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858689 4715 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858696 4715 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858704 4715 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858712 4715 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858720 4715 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858728 4715 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858736 4715 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858744 4715 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858752 4715 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858760 4715 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858767 4715 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858775 4715 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858783 4715 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858792 4715 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858800 4715 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858807 4715 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858815 4715 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858823 4715 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858831 4715 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858840 4715 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858847 4715 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858855 4715 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858866 4715 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858874 4715 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858882 4715 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858890 4715 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858898 4715 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858906 4715 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858914 4715 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858922 4715 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858930 4715 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.858939 4715 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.858952 4715 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859372 4715 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859385 4715 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859394 4715 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859404 4715 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859415 4715 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859453 4715 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859462 4715 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859471 4715 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859479 4715 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859488 4715 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859496 4715 feature_gate.go:330] unrecognized feature gate: Example
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859506 4715 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859514 4715 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859523 4715 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859531 4715 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859538 4715 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859546 4715 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859555 4715 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859563 4715 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859570 4715 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859578 4715 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859586 4715 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859595 4715 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859603 4715 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859610 4715 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859618 4715 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859626 4715 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859635 4715 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859642 4715 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859650 4715 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859658 4715 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859665 4715 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859674 4715 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859681 4715 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859690 4715 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859698 4715 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859706 4715 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859714 4715 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859722 4715 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859730 4715 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859738 4715 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859746 4715 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859756 4715 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859766 4715 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859774 4715 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859782 4715 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859791 4715 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859800 4715 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859811 4715 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859821 4715 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859830 4715 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859839 4715 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859847 4715 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859856 4715 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859867 4715 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859876 4715 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859884 4715 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859892 4715 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859902 4715 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859910 4715 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859918 4715 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859927 4715 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859935 4715 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859943 4715 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859952 4715 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859959 4715 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859968 4715 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859975 4715 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859983 4715 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.859991 4715 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 09 07:46:09 crc kubenswrapper[4715]: W1009 07:46:09.860000 4715 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.860013 4715 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.860318 4715 server.go:940] "Client rotation is on, will bootstrap in background"
Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.866840 4715 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.867001 4715 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.869279 4715 server.go:997] "Starting client certificate rotation"
Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.869325 4715 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.870470 4715 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-06 17:29:51.721782316 +0000 UTC
Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.870610 4715 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1401h43m41.851179618s for next certificate rotation
Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.918528 4715 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.924470 4715 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.947626 4715 log.go:25] "Validated CRI v1 runtime API"
Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.990805 4715 log.go:25] "Validated CRI v1 image API"
Oct 09 07:46:09 crc kubenswrapper[4715]: I1009 07:46:09.993177 4715 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.003909 4715 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-09-07-42-03-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.003985 4715 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.032336 4715 manager.go:217] Machine: {Timestamp:2025-10-09 07:46:10.02834666 +0000 UTC m=+0.721150688 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654120448 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:25873b5a-8b59-46be-9c14-6241a2c78490 BootID:88c6bc2d-8227-4dff-bf57-494ec73b39f9 Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:ae:68:76 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:ae:68:76 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:18:86:ab Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:58:08:88 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:bf:46:47 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:df:a9:96 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:22:de:58:2c:ad:98 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:0e:c4:ba:92:29:bd Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654120448 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.032604 4715 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.032826 4715 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.034623 4715 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.034977 4715 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.035034 4715 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.035367 4715 topology_manager.go:138] "Creating topology manager with none policy"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.035387 4715 container_manager_linux.go:303] "Creating device plugin manager"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.036211 4715 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.036300 4715 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.036560 4715 state_mem.go:36] "Initialized new in-memory state store"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.036688 4715 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.041697 4715 kubelet.go:418] "Attempting to sync node with API server"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.041735 4715 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.041831 4715 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.041870 4715 kubelet.go:324] "Adding apiserver pod source"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.041895 4715 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.047044 4715 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.049047 4715 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Oct 09 07:46:10 crc kubenswrapper[4715]: W1009 07:46:10.050995 4715 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused
Oct 09 07:46:10 crc kubenswrapper[4715]: E1009 07:46:10.051143 4715 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError"
Oct 09 07:46:10 crc kubenswrapper[4715]: W1009 07:46:10.051121 4715 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused
Oct 09 07:46:10 crc kubenswrapper[4715]: E1009 07:46:10.051250 4715 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.051653 4715 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Oct 09 07:46:10
crc kubenswrapper[4715]: I1009 07:46:10.053825 4715 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.053886 4715 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.053902 4715 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.053918 4715 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.053945 4715 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.053963 4715 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.053980 4715 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.054007 4715 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.054028 4715 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.054046 4715 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.054073 4715 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.054086 4715 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.057975 4715 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.058974 4715 server.go:1280] "Started kubelet" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 
07:46:10.060387 4715 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.060361 4715 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 09 07:46:10 crc systemd[1]: Started Kubernetes Kubelet. Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.061995 4715 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.062622 4715 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.064773 4715 server.go:460] "Adding debug handlers to kubelet server" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.064818 4715 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.064869 4715 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.064895 4715 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 04:34:32.790570688 +0000 UTC Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.064958 4715 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1388h48m22.725615899s for next certificate rotation Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.065093 4715 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.065116 4715 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.065283 4715 
desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 09 07:46:10 crc kubenswrapper[4715]: E1009 07:46:10.067595 4715 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 09 07:46:10 crc kubenswrapper[4715]: E1009 07:46:10.068094 4715 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="200ms" Oct 09 07:46:10 crc kubenswrapper[4715]: W1009 07:46:10.068201 4715 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Oct 09 07:46:10 crc kubenswrapper[4715]: E1009 07:46:10.068394 4715 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Oct 09 07:46:10 crc kubenswrapper[4715]: E1009 07:46:10.068223 4715 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.158:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186cc302897a70e0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-09 07:46:10.058875104 +0000 UTC m=+0.751679142,LastTimestamp:2025-10-09 07:46:10.058875104 
+0000 UTC m=+0.751679142,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.074693 4715 factory.go:55] Registering systemd factory Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.074774 4715 factory.go:221] Registration of the systemd container factory successfully Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.078290 4715 factory.go:153] Registering CRI-O factory Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.078354 4715 factory.go:221] Registration of the crio container factory successfully Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.078548 4715 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.078614 4715 factory.go:103] Registering Raw factory Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.078644 4715 manager.go:1196] Started watching for new ooms in manager Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.079906 4715 manager.go:319] Starting recovery of all containers Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.085589 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.085844 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 09 
07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.085988 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.086121 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.086271 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.086410 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.086591 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.086724 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.086851 4715 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.087000 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.087140 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.087497 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.087702 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.087845 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.087986 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.088116 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.088255 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.088450 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.088612 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.088846 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.088988 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.089130 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.089260 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.089399 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.089582 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.089715 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.089855 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.090346 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.090528 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.090717 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.090854 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.090983 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.091121 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.091280 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.092572 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.092613 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.092627 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.092645 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.092659 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.092672 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.092688 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.092704 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.092716 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.092728 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.092741 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" 
seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.092753 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.092767 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.092779 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.092793 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.092807 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.092818 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.092830 4715 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.092849 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.092864 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.092879 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.092894 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.092907 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.092921 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.092933 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.092947 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.092959 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.092972 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.092993 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093006 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" 
volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093020 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093033 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093045 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093056 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093068 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093081 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093092 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093104 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093116 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093129 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093141 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093154 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" 
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093166 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093178 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093192 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093204 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093216 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093229 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093241 4715 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093273 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093287 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093299 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093311 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093323 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093336 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093350 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093361 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093376 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093388 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093399 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093412 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093439 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093451 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093462 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093473 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093484 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093496 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093510 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093559 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093573 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093592 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093605 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093618 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093633 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093647 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093666 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093680 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093699 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093711 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093724 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093740 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093752 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093765 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093775 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093786 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093797 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093808 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093821 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093832 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093842 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093854 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093867 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093880 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093893 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093903 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093916 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093927 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093938 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093951 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093963 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093977 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.093989 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.094001 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" 
volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.094012 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.094023 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.094048 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.094060 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.094071 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.094085 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" 
seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.094097 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.094110 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.094123 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.094134 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.094147 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.094159 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 
07:46:10.094171 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.094215 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.094227 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.094238 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.094250 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.094262 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.094273 4715 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.094288 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.097566 4715 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.097640 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.097670 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.097691 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.097710 4715 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.097731 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.097753 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.097778 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.097798 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.097820 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.097839 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.097863 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.097883 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.097904 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.097923 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.097945 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.097965 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.097985 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.098006 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.098025 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.098048 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.098072 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.098094 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.098149 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.098207 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.098235 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.098260 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.098287 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.098313 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.098332 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.098352 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.098374 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.098395 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.098416 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.098491 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.098511 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.098533 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.098554 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.098574 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.098594 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.098615 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.098661 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.098683 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.098703 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.098725 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.098744 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.098763 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" 
seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.098782 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.098801 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.098820 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.098840 4715 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.098861 4715 reconstruct.go:97] "Volume reconstruction finished" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.098877 4715 reconciler.go:26] "Reconciler: start to sync state" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.107991 4715 manager.go:324] Recovery completed Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.118002 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.121639 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.121696 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.121708 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.123197 4715 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.123252 4715 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.123300 4715 state_mem.go:36] "Initialized new in-memory state store" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.130012 4715 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.135471 4715 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.135563 4715 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.135617 4715 kubelet.go:2335] "Starting kubelet main sync loop" Oct 09 07:46:10 crc kubenswrapper[4715]: E1009 07:46:10.135708 4715 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.137390 4715 policy_none.go:49] "None policy: Start" Oct 09 07:46:10 crc kubenswrapper[4715]: W1009 07:46:10.137837 4715 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Oct 09 07:46:10 crc kubenswrapper[4715]: E1009 07:46:10.137952 4715 
reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.139172 4715 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.139248 4715 state_mem.go:35] "Initializing new in-memory state store" Oct 09 07:46:10 crc kubenswrapper[4715]: E1009 07:46:10.168666 4715 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.194385 4715 manager.go:334] "Starting Device Plugin manager" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.194557 4715 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.194574 4715 server.go:79] "Starting device plugin registration server" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.195081 4715 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.195105 4715 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.195582 4715 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.195736 4715 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.195761 4715 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 09 07:46:10 crc kubenswrapper[4715]: E1009 
07:46:10.206411 4715 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.236896 4715 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.237085 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.239298 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.239353 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.239371 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.239605 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.240139 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.240212 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.240989 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.241035 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.241050 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.241263 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.241467 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.241524 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.242125 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.242159 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.242171 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.242817 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.242862 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.242883 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.243077 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.243399 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.243452 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.243489 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.243509 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.243517 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.244537 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.244579 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.244597 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.244805 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.244947 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.244996 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.245369 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.245474 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.245494 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.247226 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.247277 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.247296 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.248273 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.248325 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.248344 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.248800 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.248860 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.250075 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.250116 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.250135 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:10 crc kubenswrapper[4715]: E1009 07:46:10.269591 4715 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="400ms" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.295317 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.297358 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.297669 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.297741 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.297829 4715 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 09 07:46:10 crc kubenswrapper[4715]: E1009 07:46:10.298942 4715 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.158:6443: connect: connection refused" node="crc" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.302615 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.302673 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.302710 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.302741 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.303034 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.303087 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.303116 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.303148 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.303179 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.303211 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 09 07:46:10 
crc kubenswrapper[4715]: I1009 07:46:10.303241 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.303296 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.303344 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.303371 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.303389 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.404588 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.404660 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.404686 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.404708 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.404727 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.404750 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.404774 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.404797 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.404824 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.404845 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.404847 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.404897 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.404925 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.404932 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.404978 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.404992 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.405005 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.404986 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.404981 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.404864 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.405001 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.405092 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.405114 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.405135 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.405142 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.405156 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.405183 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.405187 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.405302 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.405694 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.499663 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.501694 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.501757 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.501770 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.501802 4715 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 09 07:46:10 crc kubenswrapper[4715]: E1009 07:46:10.502553 4715 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.158:6443: connect: connection refused" node="crc"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.592991 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.599217 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.617564 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.638989 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 09 07:46:10 crc kubenswrapper[4715]: W1009 07:46:10.639338 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-440b3e396499532c5f83ced9be35ab1c556d88c6842f05c7d6726d7cd5bab286 WatchSource:0}: Error finding container 440b3e396499532c5f83ced9be35ab1c556d88c6842f05c7d6726d7cd5bab286: Status 404 returned error can't find the container with id 440b3e396499532c5f83ced9be35ab1c556d88c6842f05c7d6726d7cd5bab286
Oct 09 07:46:10 crc kubenswrapper[4715]: W1009 07:46:10.640608 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-84814b675ebd394a3143e76306735ff41c90ea46d91a908829ab6188a8f8498b WatchSource:0}: Error finding container 84814b675ebd394a3143e76306735ff41c90ea46d91a908829ab6188a8f8498b: Status 404 returned error can't find the container with id 84814b675ebd394a3143e76306735ff41c90ea46d91a908829ab6188a8f8498b
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.643690 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 09 07:46:10 crc kubenswrapper[4715]: W1009 07:46:10.645012 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-a64a834536ba7ed05d52dba2faf8c9bc5c32fb94a56f5a236d2eb953e0364efe WatchSource:0}: Error finding container a64a834536ba7ed05d52dba2faf8c9bc5c32fb94a56f5a236d2eb953e0364efe: Status 404 returned error can't find the container with id a64a834536ba7ed05d52dba2faf8c9bc5c32fb94a56f5a236d2eb953e0364efe
Oct 09 07:46:10 crc kubenswrapper[4715]: W1009 07:46:10.655689 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-a9e5266336898bc7547d87d7991fd245d78f1f058612310c30dec9ff57844cc7 WatchSource:0}: Error finding container a9e5266336898bc7547d87d7991fd245d78f1f058612310c30dec9ff57844cc7: Status 404 returned error can't find the container with id a9e5266336898bc7547d87d7991fd245d78f1f058612310c30dec9ff57844cc7
Oct 09 07:46:10 crc kubenswrapper[4715]: E1009 07:46:10.670710 4715 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="800ms"
Oct 09 07:46:10 crc kubenswrapper[4715]: W1009 07:46:10.671149 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-29b21711242ea08d8e2127fec18457bf28155ffdd0f9ba37d7d50c05b5c880e1 WatchSource:0}: Error finding container 29b21711242ea08d8e2127fec18457bf28155ffdd0f9ba37d7d50c05b5c880e1: Status 404 returned error can't find the container with id 29b21711242ea08d8e2127fec18457bf28155ffdd0f9ba37d7d50c05b5c880e1
Oct 09 07:46:10 crc kubenswrapper[4715]: W1009 07:46:10.862557 4715 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused
Oct 09 07:46:10 crc kubenswrapper[4715]: E1009 07:46:10.862667 4715 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.903280 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.904941 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.904984 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.904996 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:46:10 crc kubenswrapper[4715]: I1009 07:46:10.905024 4715 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 09 07:46:10 crc kubenswrapper[4715]: E1009 07:46:10.905649 4715 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.158:6443: connect: connection refused" node="crc"
Oct 09 07:46:11 crc kubenswrapper[4715]: W1009 07:46:11.015170 4715 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused
Oct 09 07:46:11 crc kubenswrapper[4715]: E1009 07:46:11.015303 4715 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError"
Oct 09 07:46:11 crc kubenswrapper[4715]: I1009 07:46:11.062794 4715 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused
Oct 09 07:46:11 crc kubenswrapper[4715]: I1009 07:46:11.142583 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"440b3e396499532c5f83ced9be35ab1c556d88c6842f05c7d6726d7cd5bab286"}
Oct 09 07:46:11 crc kubenswrapper[4715]: I1009 07:46:11.144060 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"29b21711242ea08d8e2127fec18457bf28155ffdd0f9ba37d7d50c05b5c880e1"}
Oct 09 07:46:11 crc kubenswrapper[4715]: I1009 07:46:11.149574 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a9e5266336898bc7547d87d7991fd245d78f1f058612310c30dec9ff57844cc7"}
Oct 09 07:46:11 crc kubenswrapper[4715]: I1009 07:46:11.152100 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a64a834536ba7ed05d52dba2faf8c9bc5c32fb94a56f5a236d2eb953e0364efe"}
Oct 09 07:46:11 crc kubenswrapper[4715]: I1009 07:46:11.153576 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"84814b675ebd394a3143e76306735ff41c90ea46d91a908829ab6188a8f8498b"}
Oct 09 07:46:11 crc kubenswrapper[4715]: W1009 07:46:11.312068 4715 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused
Oct 09 07:46:11 crc kubenswrapper[4715]: E1009 07:46:11.312179 4715 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError"
Oct 09 07:46:11 crc kubenswrapper[4715]: E1009 07:46:11.472796 4715 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="1.6s"
Oct 09 07:46:11 crc kubenswrapper[4715]: W1009 07:46:11.512708 4715 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused
Oct 09 07:46:11 crc kubenswrapper[4715]: E1009 07:46:11.512811 4715 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError"
Oct 09 07:46:11 crc kubenswrapper[4715]: I1009 07:46:11.706664 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 09 07:46:11 crc kubenswrapper[4715]: I1009 07:46:11.708386 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:46:11 crc kubenswrapper[4715]: I1009 07:46:11.708452 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:46:11 crc kubenswrapper[4715]: I1009 07:46:11.708463 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:46:11 crc kubenswrapper[4715]: I1009 07:46:11.708495 4715 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 09 07:46:11 crc kubenswrapper[4715]: E1009 07:46:11.709084 4715 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.158:6443: connect: connection refused" node="crc"
Oct 09 07:46:12 crc kubenswrapper[4715]: I1009 07:46:12.063013 4715 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused
Oct 09 07:46:12 crc kubenswrapper[4715]: I1009 07:46:12.159092 4715 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="4cb5215e2ab354a950cbd77ed11f48001aee890b171fd4f3ee9823f5fa4dcf37" exitCode=0
Oct 09 07:46:12 crc kubenswrapper[4715]: I1009 07:46:12.159231 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"4cb5215e2ab354a950cbd77ed11f48001aee890b171fd4f3ee9823f5fa4dcf37"}
Oct 09 07:46:12 crc kubenswrapper[4715]: I1009 07:46:12.159344 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 09 07:46:12 crc kubenswrapper[4715]: I1009 07:46:12.160712 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:46:12 crc kubenswrapper[4715]: I1009 07:46:12.160755 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:46:12 crc kubenswrapper[4715]: I1009 07:46:12.160768 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:46:12 crc kubenswrapper[4715]: I1009 07:46:12.164258 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"23d17b0d82be9febaeb884dea2cfb61c5f189c0fce2aff03c02bbf020d89828f"}
Oct 09 07:46:12 crc kubenswrapper[4715]: I1009 07:46:12.164394 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 09 07:46:12 crc kubenswrapper[4715]: I1009 07:46:12.164409 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ff8906a42b46d23c122035098bfd88203a6418fe2e0ef806e7babbc9670e2c89"}
Oct 09 07:46:12 crc kubenswrapper[4715]: I1009 07:46:12.164554 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b19465e3367078df139314e3b29a1b05d15c7ab22cb681c92e2a0394aaaaf887"}
Oct 09 07:46:12 crc kubenswrapper[4715]: I1009 07:46:12.164578 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5b8a525d8b7ec3e08d688a4f5419e937a01e5dfa1de58caa9e3fad5ee5ed593f"}
Oct 09 07:46:12 crc kubenswrapper[4715]: I1009 07:46:12.165596 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:46:12 crc kubenswrapper[4715]: I1009 07:46:12.165657 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:46:12 crc kubenswrapper[4715]: I1009 07:46:12.165685 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:46:12 crc kubenswrapper[4715]: I1009 07:46:12.166249 4715 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0" exitCode=0
Oct 09 07:46:12 crc kubenswrapper[4715]: I1009 07:46:12.166354 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0"}
Oct 09 07:46:12 crc kubenswrapper[4715]: I1009 07:46:12.166564 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 09 07:46:12 crc kubenswrapper[4715]: I1009 07:46:12.167760 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:46:12 crc kubenswrapper[4715]: I1009 07:46:12.167799 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:46:12 crc kubenswrapper[4715]: I1009 07:46:12.167818 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:46:12 crc kubenswrapper[4715]: I1009 07:46:12.169336 4715 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30" exitCode=0
Oct 09 07:46:12 crc kubenswrapper[4715]: I1009 07:46:12.169452 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30"}
Oct 09 07:46:12 crc kubenswrapper[4715]: I1009 07:46:12.169549 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 09 07:46:12 crc kubenswrapper[4715]: I1009 07:46:12.170472 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 09 07:46:12 crc kubenswrapper[4715]: I1009 07:46:12.170807 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:46:12 crc kubenswrapper[4715]: I1009 07:46:12.170856 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:46:12 crc kubenswrapper[4715]: I1009 07:46:12.170876 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:46:12 crc kubenswrapper[4715]: I1009 07:46:12.171735 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:46:12 crc kubenswrapper[4715]: I1009 07:46:12.171767 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:46:12 crc kubenswrapper[4715]: I1009 07:46:12.171779 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:46:12 crc kubenswrapper[4715]: I1009 07:46:12.171929 4715 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="f4a006c2f6cf15ff04cabddf2c3b0707b29cc3552afa5abd3f9647ef06567695" exitCode=0
Oct 09 07:46:12 crc kubenswrapper[4715]: I1009 07:46:12.172004 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 09 07:46:12 crc kubenswrapper[4715]: I1009 07:46:12.171981 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"f4a006c2f6cf15ff04cabddf2c3b0707b29cc3552afa5abd3f9647ef06567695"}
Oct 09 07:46:12 crc kubenswrapper[4715]: I1009 07:46:12.173719 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:46:12 crc kubenswrapper[4715]: I1009 07:46:12.173746 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:46:12 crc kubenswrapper[4715]: I1009 07:46:12.173759 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:46:12 crc kubenswrapper[4715]: I1009 07:46:12.684961 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 09 07:46:13 crc kubenswrapper[4715]: I1009 07:46:13.063390 4715 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused
Oct 09 07:46:13 crc kubenswrapper[4715]: E1009 07:46:13.074565 4715 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="3.2s"
Oct 09 07:46:13 crc kubenswrapper[4715]: I1009 07:46:13.179156 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c29c93e1f45a5b0592ac77d5f064cff563130da8019669a013ad65026ca46474"}
Oct 09 07:46:13 crc kubenswrapper[4715]: I1009 07:46:13.179223 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"9dabdb403ae3d1cad8d766a205299375905e6851f89a3022ec1468ba6ad7f463"}
Oct 09 07:46:13 crc kubenswrapper[4715]: I1009 07:46:13.179240 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e7b63b0a66948efeeb8afe2b17b5e2461b54aa7fcbd7eea11181fd3e077f878e"}
Oct 09 07:46:13 crc kubenswrapper[4715]: I1009 07:46:13.179279 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 09 07:46:13 crc kubenswrapper[4715]: I1009 07:46:13.180976 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:46:13 crc kubenswrapper[4715]: I1009 07:46:13.181050 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:46:13 crc kubenswrapper[4715]: I1009 07:46:13.181070 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:46:13 crc kubenswrapper[4715]: I1009 07:46:13.183681 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"14232d9805b9847774597840c84b29709285393122781fe95af059e50c285ec1"}
Oct 09 07:46:13 crc kubenswrapper[4715]: I1009 07:46:13.183749 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e94dc3b7cc39c67b95708f5a4b7d2bcf103c565c5c868684fa838816e882c720"}
Oct 09 07:46:13 crc kubenswrapper[4715]: I1009 07:46:13.183771 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"86bd2df729ce7029714c942828cff7e13c738eb5d918fc7dfdefe16e5420fc98"}
Oct 09 07:46:13 crc kubenswrapper[4715]: I1009 07:46:13.183789 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6ab9492d73e1ced7e8b9dcfbf64ede97fb7c53def5e290efe2320d37d5f8a3f1"}
Oct 09 07:46:13 crc kubenswrapper[4715]: I1009 07:46:13.186026 4715 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367" exitCode=0
Oct 09 07:46:13 crc kubenswrapper[4715]: I1009 07:46:13.186119 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367"}
Oct 09 07:46:13 crc kubenswrapper[4715]: I1009 07:46:13.186193 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 09 07:46:13 crc kubenswrapper[4715]: I1009 07:46:13.187278 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:46:13 crc kubenswrapper[4715]: I1009 07:46:13.187316 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:46:13 crc kubenswrapper[4715]: I1009 07:46:13.187334 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:46:13 crc kubenswrapper[4715]: I1009 07:46:13.188235 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"14e99604cffc35c659058d1363536aa5d067bbbb1c29b2b366c6aa8c1ed6bb72"}
Oct 09 07:46:13 crc kubenswrapper[4715]: I1009 07:46:13.188283 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 09 07:46:13 crc kubenswrapper[4715]: I1009 07:46:13.188314 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 09 07:46:13 crc kubenswrapper[4715]: I1009 07:46:13.189126 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:46:13 crc kubenswrapper[4715]: I1009 07:46:13.189156 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:46:13 crc kubenswrapper[4715]: I1009 07:46:13.189170 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:46:13 crc kubenswrapper[4715]: I1009 07:46:13.189766 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:46:13 crc kubenswrapper[4715]: I1009 07:46:13.189823 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:46:13 crc kubenswrapper[4715]: I1009 07:46:13.189837 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:46:13 crc kubenswrapper[4715]: I1009 07:46:13.309675 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 09 07:46:13 crc kubenswrapper[4715]: I1009 07:46:13.311121 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:46:13 crc kubenswrapper[4715]: I1009 07:46:13.311158 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:46:13 crc kubenswrapper[4715]: I1009 07:46:13.311170 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:46:13 crc kubenswrapper[4715]: I1009 07:46:13.311199 4715 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 09 07:46:13 crc kubenswrapper[4715]: E1009 07:46:13.311677 4715 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.158:6443: connect: connection refused" node="crc"
Oct 09 07:46:13 crc kubenswrapper[4715]: I1009 07:46:13.401467 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 09 07:46:13 crc kubenswrapper[4715]: W1009 07:46:13.403221 4715 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused
Oct 09 07:46:13 crc kubenswrapper[4715]: E1009 07:46:13.403330 4715 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError"
Oct 09 07:46:13 crc kubenswrapper[4715]: W1009 07:46:13.582555 4715 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused
Oct 09 07:46:13 crc kubenswrapper[4715]: E1009 07:46:13.582701 4715 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError"
Oct 09 07:46:14 crc kubenswrapper[4715]: I1009 07:46:14.195226 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ef865bbc5af229ba36fa76c7f51cf4b2579e28a7d2509e7390b581ec35cb72a1"}
Oct 09 07:46:14 crc kubenswrapper[4715]: I1009 07:46:14.195473 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 09 07:46:14 crc kubenswrapper[4715]: I1009 07:46:14.196532 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:46:14 crc kubenswrapper[4715]: I1009 07:46:14.196584 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:46:14 crc kubenswrapper[4715]: I1009 07:46:14.196605
4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:14 crc kubenswrapper[4715]: I1009 07:46:14.199555 4715 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97" exitCode=0 Oct 09 07:46:14 crc kubenswrapper[4715]: I1009 07:46:14.199664 4715 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 07:46:14 crc kubenswrapper[4715]: I1009 07:46:14.199701 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 07:46:14 crc kubenswrapper[4715]: I1009 07:46:14.200324 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 07:46:14 crc kubenswrapper[4715]: I1009 07:46:14.200870 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97"} Oct 09 07:46:14 crc kubenswrapper[4715]: I1009 07:46:14.200983 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 07:46:14 crc kubenswrapper[4715]: I1009 07:46:14.201661 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 07:46:14 crc kubenswrapper[4715]: I1009 07:46:14.203046 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:14 crc kubenswrapper[4715]: I1009 07:46:14.203080 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:14 crc kubenswrapper[4715]: I1009 07:46:14.203090 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:14 crc 
kubenswrapper[4715]: I1009 07:46:14.203309 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:14 crc kubenswrapper[4715]: I1009 07:46:14.203360 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:14 crc kubenswrapper[4715]: I1009 07:46:14.203383 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:14 crc kubenswrapper[4715]: I1009 07:46:14.203602 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:14 crc kubenswrapper[4715]: I1009 07:46:14.203615 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:14 crc kubenswrapper[4715]: I1009 07:46:14.203623 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:14 crc kubenswrapper[4715]: I1009 07:46:14.204101 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:14 crc kubenswrapper[4715]: I1009 07:46:14.204118 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:14 crc kubenswrapper[4715]: I1009 07:46:14.204130 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:14 crc kubenswrapper[4715]: I1009 07:46:14.728783 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 07:46:14 crc kubenswrapper[4715]: I1009 07:46:14.738741 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 07:46:15 crc kubenswrapper[4715]: I1009 
07:46:15.207291 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7424a86e3801e7aea51cf175c8cbb65ae15a4df07426022cf9e4ba6b82c13924"} Oct 09 07:46:15 crc kubenswrapper[4715]: I1009 07:46:15.207363 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"fc320b6b98a82e720d488ce9958599e2f732919ac43ccb3834e5dd90042077ff"} Oct 09 07:46:15 crc kubenswrapper[4715]: I1009 07:46:15.207370 4715 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 07:46:15 crc kubenswrapper[4715]: I1009 07:46:15.207383 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f0bc91552a8f6c9f83684aa851ef1b07fa4562c736427c3264762f4486b65c92"} Oct 09 07:46:15 crc kubenswrapper[4715]: I1009 07:46:15.207440 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 07:46:15 crc kubenswrapper[4715]: I1009 07:46:15.208143 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 07:46:15 crc kubenswrapper[4715]: I1009 07:46:15.208862 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:15 crc kubenswrapper[4715]: I1009 07:46:15.208896 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:15 crc kubenswrapper[4715]: I1009 07:46:15.208910 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:15 crc kubenswrapper[4715]: I1009 07:46:15.209673 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 09 07:46:15 crc kubenswrapper[4715]: I1009 07:46:15.209715 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:15 crc kubenswrapper[4715]: I1009 07:46:15.209732 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:15 crc kubenswrapper[4715]: I1009 07:46:15.512636 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 07:46:15 crc kubenswrapper[4715]: I1009 07:46:15.686185 4715 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 09 07:46:15 crc kubenswrapper[4715]: I1009 07:46:15.686296 4715 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 09 07:46:15 crc kubenswrapper[4715]: I1009 07:46:15.822875 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 07:46:16 crc kubenswrapper[4715]: I1009 07:46:16.216627 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"149ab2506eb7fd28879c9734c5189259cde574afb0a4f7708b0b84c5a514c996"} Oct 09 07:46:16 crc kubenswrapper[4715]: I1009 07:46:16.216735 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6a96e0c2dc207504189aac5f2822e4fc8fdc58a19388a3d081553ecec07f03bd"} Oct 09 07:46:16 crc kubenswrapper[4715]: I1009 07:46:16.216760 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 07:46:16 crc kubenswrapper[4715]: I1009 07:46:16.216794 4715 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 07:46:16 crc kubenswrapper[4715]: I1009 07:46:16.216760 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 07:46:16 crc kubenswrapper[4715]: I1009 07:46:16.216864 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 07:46:16 crc kubenswrapper[4715]: I1009 07:46:16.218375 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:16 crc kubenswrapper[4715]: I1009 07:46:16.218402 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:16 crc kubenswrapper[4715]: I1009 07:46:16.218452 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:16 crc kubenswrapper[4715]: I1009 07:46:16.218465 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:16 crc kubenswrapper[4715]: I1009 07:46:16.218473 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:16 crc kubenswrapper[4715]: I1009 07:46:16.218484 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:16 crc kubenswrapper[4715]: I1009 07:46:16.218482 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 
07:46:16 crc kubenswrapper[4715]: I1009 07:46:16.218545 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:16 crc kubenswrapper[4715]: I1009 07:46:16.218562 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:16 crc kubenswrapper[4715]: I1009 07:46:16.512236 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 07:46:16 crc kubenswrapper[4715]: I1009 07:46:16.514226 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:16 crc kubenswrapper[4715]: I1009 07:46:16.514285 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:16 crc kubenswrapper[4715]: I1009 07:46:16.514308 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:16 crc kubenswrapper[4715]: I1009 07:46:16.514351 4715 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 09 07:46:17 crc kubenswrapper[4715]: I1009 07:46:17.198956 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 07:46:17 crc kubenswrapper[4715]: I1009 07:46:17.219405 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 07:46:17 crc kubenswrapper[4715]: I1009 07:46:17.219405 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 07:46:17 crc kubenswrapper[4715]: I1009 07:46:17.219604 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 07:46:17 crc kubenswrapper[4715]: I1009 07:46:17.220793 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 09 07:46:17 crc kubenswrapper[4715]: I1009 07:46:17.220843 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:17 crc kubenswrapper[4715]: I1009 07:46:17.220857 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:17 crc kubenswrapper[4715]: I1009 07:46:17.220942 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:17 crc kubenswrapper[4715]: I1009 07:46:17.220963 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:17 crc kubenswrapper[4715]: I1009 07:46:17.220970 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:17 crc kubenswrapper[4715]: I1009 07:46:17.221008 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:17 crc kubenswrapper[4715]: I1009 07:46:17.221023 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:17 crc kubenswrapper[4715]: I1009 07:46:17.221140 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:18 crc kubenswrapper[4715]: I1009 07:46:18.154997 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 07:46:18 crc kubenswrapper[4715]: I1009 07:46:18.222844 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 07:46:18 crc kubenswrapper[4715]: I1009 07:46:18.223982 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:18 crc kubenswrapper[4715]: I1009 07:46:18.224023 
4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:18 crc kubenswrapper[4715]: I1009 07:46:18.224035 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:19 crc kubenswrapper[4715]: I1009 07:46:19.549933 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 09 07:46:19 crc kubenswrapper[4715]: I1009 07:46:19.550361 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 07:46:19 crc kubenswrapper[4715]: I1009 07:46:19.552215 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:19 crc kubenswrapper[4715]: I1009 07:46:19.552265 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:19 crc kubenswrapper[4715]: I1009 07:46:19.552277 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:20 crc kubenswrapper[4715]: E1009 07:46:20.206647 4715 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 09 07:46:20 crc kubenswrapper[4715]: I1009 07:46:20.854050 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 09 07:46:20 crc kubenswrapper[4715]: I1009 07:46:20.854257 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 07:46:20 crc kubenswrapper[4715]: I1009 07:46:20.855558 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:20 crc kubenswrapper[4715]: I1009 07:46:20.855614 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 09 07:46:20 crc kubenswrapper[4715]: I1009 07:46:20.855637 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:23 crc kubenswrapper[4715]: I1009 07:46:23.453044 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 09 07:46:23 crc kubenswrapper[4715]: I1009 07:46:23.453362 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 07:46:23 crc kubenswrapper[4715]: I1009 07:46:23.454998 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:23 crc kubenswrapper[4715]: I1009 07:46:23.455066 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:23 crc kubenswrapper[4715]: I1009 07:46:23.455084 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:24 crc kubenswrapper[4715]: W1009 07:46:24.007927 4715 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 09 07:46:24 crc kubenswrapper[4715]: I1009 07:46:24.008051 4715 trace.go:236] Trace[440044530]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Oct-2025 07:46:14.006) (total time: 10001ms): Oct 09 07:46:24 crc kubenswrapper[4715]: Trace[440044530]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (07:46:24.007) Oct 09 07:46:24 crc kubenswrapper[4715]: Trace[440044530]: [10.001781191s] [10.001781191s] END Oct 09 07:46:24 crc kubenswrapper[4715]: E1009 07:46:24.008087 4715 
reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 09 07:46:24 crc kubenswrapper[4715]: I1009 07:46:24.063753 4715 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Oct 09 07:46:24 crc kubenswrapper[4715]: W1009 07:46:24.204017 4715 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 09 07:46:24 crc kubenswrapper[4715]: I1009 07:46:24.204139 4715 trace.go:236] Trace[270629564]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Oct-2025 07:46:14.202) (total time: 10001ms): Oct 09 07:46:24 crc kubenswrapper[4715]: Trace[270629564]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (07:46:24.204) Oct 09 07:46:24 crc kubenswrapper[4715]: Trace[270629564]: [10.001868323s] [10.001868323s] END Oct 09 07:46:24 crc kubenswrapper[4715]: E1009 07:46:24.204180 4715 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 09 07:46:24 crc kubenswrapper[4715]: I1009 07:46:24.779393 4715 patch_prober.go:28] interesting pod/kube-apiserver-crc 
container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 09 07:46:24 crc kubenswrapper[4715]: I1009 07:46:24.779474 4715 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 09 07:46:24 crc kubenswrapper[4715]: I1009 07:46:24.790923 4715 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 09 07:46:24 crc kubenswrapper[4715]: I1009 07:46:24.791005 4715 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 09 07:46:25 crc kubenswrapper[4715]: I1009 07:46:25.186096 4715 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 09 07:46:25 crc kubenswrapper[4715]: I1009 07:46:25.186195 4715 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 09 07:46:25 crc kubenswrapper[4715]: I1009 07:46:25.242758 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 09 07:46:25 crc kubenswrapper[4715]: I1009 07:46:25.245032 4715 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ef865bbc5af229ba36fa76c7f51cf4b2579e28a7d2509e7390b581ec35cb72a1" exitCode=255 Oct 09 07:46:25 crc kubenswrapper[4715]: I1009 07:46:25.245082 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ef865bbc5af229ba36fa76c7f51cf4b2579e28a7d2509e7390b581ec35cb72a1"} Oct 09 07:46:25 crc kubenswrapper[4715]: I1009 07:46:25.245255 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 07:46:25 crc kubenswrapper[4715]: I1009 07:46:25.246158 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:25 crc kubenswrapper[4715]: I1009 07:46:25.246197 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:25 crc kubenswrapper[4715]: I1009 07:46:25.246208 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:25 crc kubenswrapper[4715]: I1009 07:46:25.246723 4715 scope.go:117] "RemoveContainer" containerID="ef865bbc5af229ba36fa76c7f51cf4b2579e28a7d2509e7390b581ec35cb72a1" Oct 09 07:46:25 crc kubenswrapper[4715]: I1009 07:46:25.685702 4715 patch_prober.go:28] interesting pod/kube-controller-manager-crc 
container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 09 07:46:25 crc kubenswrapper[4715]: I1009 07:46:25.686120 4715 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 09 07:46:26 crc kubenswrapper[4715]: I1009 07:46:26.249604 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 09 07:46:26 crc kubenswrapper[4715]: I1009 07:46:26.253352 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028"} Oct 09 07:46:26 crc kubenswrapper[4715]: I1009 07:46:26.253592 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 07:46:26 crc kubenswrapper[4715]: I1009 07:46:26.255253 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:26 crc kubenswrapper[4715]: I1009 07:46:26.255474 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:26 crc kubenswrapper[4715]: I1009 07:46:26.255633 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:27 crc 
kubenswrapper[4715]: I1009 07:46:27.204827 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 07:46:27 crc kubenswrapper[4715]: I1009 07:46:27.205053 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 07:46:27 crc kubenswrapper[4715]: I1009 07:46:27.206537 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:27 crc kubenswrapper[4715]: I1009 07:46:27.206599 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:27 crc kubenswrapper[4715]: I1009 07:46:27.206614 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:27 crc kubenswrapper[4715]: I1009 07:46:27.258265 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 09 07:46:27 crc kubenswrapper[4715]: I1009 07:46:27.258938 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 09 07:46:27 crc kubenswrapper[4715]: I1009 07:46:27.261003 4715 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028" exitCode=255 Oct 09 07:46:27 crc kubenswrapper[4715]: I1009 07:46:27.261051 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028"} Oct 09 07:46:27 crc kubenswrapper[4715]: I1009 07:46:27.261113 4715 scope.go:117] 
"RemoveContainer" containerID="ef865bbc5af229ba36fa76c7f51cf4b2579e28a7d2509e7390b581ec35cb72a1" Oct 09 07:46:27 crc kubenswrapper[4715]: I1009 07:46:27.261308 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 07:46:27 crc kubenswrapper[4715]: I1009 07:46:27.262212 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:27 crc kubenswrapper[4715]: I1009 07:46:27.262264 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:27 crc kubenswrapper[4715]: I1009 07:46:27.262283 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:27 crc kubenswrapper[4715]: I1009 07:46:27.263093 4715 scope.go:117] "RemoveContainer" containerID="f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028" Oct 09 07:46:27 crc kubenswrapper[4715]: E1009 07:46:27.263397 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 09 07:46:28 crc kubenswrapper[4715]: I1009 07:46:28.160385 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 07:46:28 crc kubenswrapper[4715]: I1009 07:46:28.265748 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 09 07:46:28 crc kubenswrapper[4715]: I1009 07:46:28.268956 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Oct 09 07:46:28 crc kubenswrapper[4715]: I1009 07:46:28.270323 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:28 crc kubenswrapper[4715]: I1009 07:46:28.270366 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:28 crc kubenswrapper[4715]: I1009 07:46:28.270383 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:28 crc kubenswrapper[4715]: I1009 07:46:28.271152 4715 scope.go:117] "RemoveContainer" containerID="f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028" Oct 09 07:46:28 crc kubenswrapper[4715]: E1009 07:46:28.271468 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 09 07:46:28 crc kubenswrapper[4715]: I1009 07:46:28.276391 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 07:46:29 crc kubenswrapper[4715]: I1009 07:46:29.271510 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 07:46:29 crc kubenswrapper[4715]: I1009 07:46:29.272611 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:29 crc kubenswrapper[4715]: I1009 07:46:29.272661 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:29 crc kubenswrapper[4715]: I1009 07:46:29.272676 4715 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:29 crc kubenswrapper[4715]: I1009 07:46:29.273448 4715 scope.go:117] "RemoveContainer" containerID="f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028" Oct 09 07:46:29 crc kubenswrapper[4715]: E1009 07:46:29.273664 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 09 07:46:29 crc kubenswrapper[4715]: E1009 07:46:29.780718 4715 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Oct 09 07:46:29 crc kubenswrapper[4715]: I1009 07:46:29.782567 4715 trace.go:236] Trace[675364092]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Oct-2025 07:46:17.463) (total time: 12318ms): Oct 09 07:46:29 crc kubenswrapper[4715]: Trace[675364092]: ---"Objects listed" error: 12318ms (07:46:29.782) Oct 09 07:46:29 crc kubenswrapper[4715]: Trace[675364092]: [12.318621961s] [12.318621961s] END Oct 09 07:46:29 crc kubenswrapper[4715]: I1009 07:46:29.782832 4715 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 09 07:46:29 crc kubenswrapper[4715]: I1009 07:46:29.782687 4715 trace.go:236] Trace[1037807396]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Oct-2025 07:46:18.581) (total time: 11200ms): Oct 09 07:46:29 crc kubenswrapper[4715]: Trace[1037807396]: ---"Objects listed" error: 11200ms (07:46:29.782) Oct 09 07:46:29 crc kubenswrapper[4715]: Trace[1037807396]: 
[11.200731708s] [11.200731708s] END Oct 09 07:46:29 crc kubenswrapper[4715]: I1009 07:46:29.782947 4715 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 09 07:46:29 crc kubenswrapper[4715]: E1009 07:46:29.783681 4715 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Oct 09 07:46:29 crc kubenswrapper[4715]: I1009 07:46:29.784794 4715 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 09 07:46:30 crc kubenswrapper[4715]: E1009 07:46:30.207518 4715 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 09 07:46:30 crc kubenswrapper[4715]: I1009 07:46:30.227533 4715 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 09 07:46:30 crc kubenswrapper[4715]: I1009 07:46:30.412720 4715 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.053520 4715 apiserver.go:52] "Watching apiserver" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.061313 4715 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.061836 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.062381 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.062536 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.063047 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 07:46:31 crc kubenswrapper[4715]: E1009 07:46:31.063098 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.063135 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.063181 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.063193 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:46:31 crc kubenswrapper[4715]: E1009 07:46:31.063256 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:46:31 crc kubenswrapper[4715]: E1009 07:46:31.063279 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.065911 4715 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.066965 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.066985 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.066969 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.066967 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.066970 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.067314 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.068657 4715 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-operator"/"iptables-alerter-script" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.068882 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.069194 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.092922 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.092966 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.092986 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.093002 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.093018 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.093035 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.093049 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.093068 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.093094 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.093111 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 09 
07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.093128 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.093142 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.093157 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.093171 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.093189 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.093203 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.093233 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.093276 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.093294 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.093309 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.093326 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 
07:46:31.093342 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.093359 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.093376 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.093394 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.093410 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.093447 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: 
\"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.093470 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.093489 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.093509 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.093191 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.093805 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.093191 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.093827 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.093217 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.093374 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.093845 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.093397 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.093413 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.093520 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.093609 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.093640 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.093765 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.093786 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.094129 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.094131 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.094132 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.094167 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.094314 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.094379 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.093860 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.094465 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.094496 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.094525 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.094554 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.094581 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.094606 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.094631 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.094655 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.094679 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.094704 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod 
\"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.094812 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.094840 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.094865 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.094888 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.094912 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.094942 4715 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.094971 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.094996 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095025 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095058 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095086 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod 
\"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095115 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095142 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095168 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095192 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095215 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095239 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095262 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095283 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095306 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095331 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095357 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 
07:46:31.095415 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095472 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095497 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095522 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095547 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095569 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod 
\"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095612 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095636 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095659 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095685 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095707 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095730 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095755 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095776 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095797 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095819 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095875 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: 
\"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095897 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095935 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095963 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095985 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.096005 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.096025 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.096051 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.096073 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.096097 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.096120 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.096143 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 09 07:46:31 crc 
kubenswrapper[4715]: I1009 07:46:31.094553 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.094802 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.094949 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.094980 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.094994 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095032 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095135 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095183 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.098933 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095196 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095213 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095242 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095260 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095352 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095383 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095464 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095624 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095664 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095699 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095708 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095800 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095833 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095868 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.095913 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.096111 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.096124 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.096141 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: E1009 07:46:31.096201 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:46:31.59614969 +0000 UTC m=+22.288953698 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.099165 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.099199 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.099241 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.099265 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.099288 4715 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.099313 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.099320 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.099336 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.099360 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.099382 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.099404 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.099453 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.099508 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" 
(UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.099530 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.099555 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.099577 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.099601 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.099625 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 09 07:46:31 
crc kubenswrapper[4715]: I1009 07:46:31.099684 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.099706 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.099728 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.099789 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.099817 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.099841 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.099863 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.099885 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.099909 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.099932 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.099956 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 
07:46:31.099994 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100017 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100040 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100062 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100086 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100112 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100134 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100160 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100180 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100203 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100225 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 
07:46:31.100250 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100273 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100294 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100318 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100340 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100363 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod 
\"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100388 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100412 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100458 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100483 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100507 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 
07:46:31.100533 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100567 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100590 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100613 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100636 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100659 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100682 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100706 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100732 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100754 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100775 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 
07:46:31.100799 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100829 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100852 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100874 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100900 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100922 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100967 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100993 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101018 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101073 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101097 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 09 07:46:31 crc 
kubenswrapper[4715]: I1009 07:46:31.101122 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101146 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101170 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101201 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101227 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101251 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: 
\"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101274 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101300 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101322 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101391 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101422 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 09 
07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101465 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101488 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101513 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101540 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101562 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101590 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101616 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101640 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101665 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101686 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101711 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" 
(UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101734 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101756 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101782 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101808 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101829 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101852 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101877 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101903 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.099451 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101951 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101984 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.102017 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.102042 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.102071 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod 
\"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.102097 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.102124 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.102148 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.102174 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.102200 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.102224 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.102254 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.102281 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.102306 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:46:31 crc 
kubenswrapper[4715]: I1009 07:46:31.102362 4715 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.102381 4715 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.102396 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.099582 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.096325 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.096357 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.096380 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.096398 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.096588 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.096423 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.097940 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.098457 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.098486 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.098745 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.099097 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.099595 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.099709 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.099775 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.099817 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100024 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100054 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100372 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100496 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100499 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100625 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100644 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100772 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100849 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100877 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100884 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.100333 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101131 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101157 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101190 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101409 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101494 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101547 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101598 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.096305 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101736 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101747 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101780 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101867 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.101939 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.102063 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.102221 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.102358 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.102364 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.102582 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.106127 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.106518 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.106573 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.106865 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.107105 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.107118 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.107300 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.107866 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.108148 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.108471 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.108660 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.108873 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.109382 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.109624 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.109790 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.109851 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.109946 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.112821 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.112879 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.113237 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.113269 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.113392 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.113398 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.113607 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.113688 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.114086 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.114183 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.114266 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.114306 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.114341 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.114368 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.118078 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.118331 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.123411 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.123624 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.123663 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.123667 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.123830 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.124072 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.120040 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.120402 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.120538 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.120744 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.120796 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.121115 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.122094 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.122369 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.122526 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.122402 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.124109 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.124282 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.124330 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.124487 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.124655 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.124802 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.124849 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.124853 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.124861 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.124944 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.125076 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.125487 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.125525 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.125539 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.125615 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.125654 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.125668 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.125748 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.125819 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.125869 4715 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.119490 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: E1009 07:46:31.125950 4715 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 07:46:31 crc kubenswrapper[4715]: E1009 07:46:31.125966 4715 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.126016 4715 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: E1009 07:46:31.126040 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 07:46:31.626024324 +0000 UTC m=+22.318828332 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.126056 4715 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.126078 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.126108 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.126135 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.126133 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.126135 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.126175 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.126176 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.126398 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.126465 4715 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.126527 4715 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.126561 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.126588 4715 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.126608 4715 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.126627 4715 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.126646 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.126663 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: 
\"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.126681 4715 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.126698 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.126739 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.126755 4715 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.126769 4715 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.126783 4715 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.126780 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" 
(OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.126798 4715 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.126812 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.127023 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.127156 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.127180 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). 
InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.127202 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.127224 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.127298 4715 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.127328 4715 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.127349 4715 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.127370 4715 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.127357 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.127397 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.127445 4715 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.127473 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.127493 4715 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.127510 4715 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.127527 4715 reconciler_common.go:293] "Volume detached for 
volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.127545 4715 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.127565 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.127585 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.127605 4715 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.127620 4715 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.127637 4715 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.127655 4715 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.127671 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.127687 4715 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: E1009 07:46:31.127810 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 07:46:31.627780833 +0000 UTC m=+22.320584891 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.127872 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.127921 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.127978 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.128036 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.128088 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.128143 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.128261 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.128329 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.128443 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.128450 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.128663 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.128706 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.129030 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.129056 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.129601 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.130560 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.136670 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.137901 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.137963 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.142532 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.143175 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.146614 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.153737 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.156742 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.159715 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 09 07:46:31 crc kubenswrapper[4715]: E1009 07:46:31.159925 4715 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 07:46:31 crc kubenswrapper[4715]: E1009 07:46:31.159940 4715 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 07:46:31 crc kubenswrapper[4715]: E1009 07:46:31.159953 4715 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.159997 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 07:46:31 crc kubenswrapper[4715]: E1009 07:46:31.160005 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-09 07:46:31.659989021 +0000 UTC m=+22.352793029 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.172918 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.173533 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.173806 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 07:46:31 crc kubenswrapper[4715]: E1009 07:46:31.173895 4715 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 07:46:31 crc kubenswrapper[4715]: E1009 07:46:31.173921 4715 
projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 07:46:31 crc kubenswrapper[4715]: E1009 07:46:31.173935 4715 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 07:46:31 crc kubenswrapper[4715]: E1009 07:46:31.173993 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-09 07:46:31.673976042 +0000 UTC m=+22.366780060 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.191993 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.193343 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.197872 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.206489 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.213789 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.225195 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.228515 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.228654 4715 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.228737 4715 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.228784 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.228891 4715 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.229310 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.229542 4715 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.229569 4715 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.229588 4715 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.229600 4715 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.229612 4715 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.229625 4715 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.229637 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.229656 4715 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.229667 4715 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.229678 4715 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.229690 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.229702 4715 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.229714 4715 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.229726 4715 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.229740 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.229752 4715 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 
crc kubenswrapper[4715]: I1009 07:46:31.229765 4715 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.229776 4715 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.229788 4715 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.229799 4715 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.229813 4715 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.229846 4715 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.229873 4715 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.229886 4715 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.229898 4715 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.229910 4715 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.229920 4715 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.229945 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.229956 4715 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.229967 4715 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.229981 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc 
kubenswrapper[4715]: I1009 07:46:31.229993 4715 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230005 4715 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230016 4715 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230027 4715 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230038 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230049 4715 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230061 4715 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230072 4715 reconciler_common.go:293] "Volume detached for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230083 4715 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230094 4715 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230106 4715 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230117 4715 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230128 4715 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230140 4715 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230152 4715 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node 
\"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230165 4715 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230176 4715 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230187 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230199 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230211 4715 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230223 4715 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230233 4715 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230244 4715 reconciler_common.go:293] 
"Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230255 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230266 4715 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230277 4715 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230288 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230299 4715 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230311 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230322 4715 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230333 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230344 4715 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230355 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230368 4715 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230380 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230393 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230404 4715 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node 
\"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230419 4715 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230452 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230463 4715 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230475 4715 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230488 4715 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230499 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230510 4715 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230527 4715 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230539 4715 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230550 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230562 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230574 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230587 4715 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230597 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230608 4715 reconciler_common.go:293] "Volume detached for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230619 4715 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230630 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230640 4715 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230655 4715 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230666 4715 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230677 4715 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230688 4715 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" 
DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230699 4715 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230710 4715 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230722 4715 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230734 4715 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230745 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230756 4715 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230767 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 09 
07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230779 4715 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230789 4715 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230801 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230812 4715 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230823 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230835 4715 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230845 4715 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230856 4715 reconciler_common.go:293] "Volume detached for volume 
\"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230867 4715 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230879 4715 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230892 4715 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230903 4715 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230914 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230925 4715 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230938 4715 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230948 4715 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230959 4715 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230972 4715 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230983 4715 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.230995 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.231006 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.231016 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on 
node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.231027 4715 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.231038 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.231049 4715 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.231061 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.231072 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.231082 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.231102 4715 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath 
\"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.231114 4715 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.231125 4715 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.231137 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.231148 4715 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.231160 4715 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.231172 4715 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.231186 4715 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 
07:46:31.231197 4715 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.231215 4715 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.231227 4715 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.231240 4715 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.231251 4715 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.231262 4715 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.231273 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.231283 4715 reconciler_common.go:293] "Volume 
detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.231294 4715 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.231305 4715 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.231316 4715 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.231327 4715 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.231339 4715 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.231350 4715 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.231361 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on 
node \"crc\" DevicePath \"\"" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.238084 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.267852 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-5tfxq"] Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.268200 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-5tfxq" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.270616 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.270645 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.270851 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.295820 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.325877 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers 
with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.331813 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdktp\" (UniqueName: \"kubernetes.io/projected/a186a549-1c86-4777-97e8-04df48fad842-kube-api-access-mdktp\") pod \"node-resolver-5tfxq\" (UID: \"a186a549-1c86-4777-97e8-04df48fad842\") " pod="openshift-dns/node-resolver-5tfxq" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.331857 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/a186a549-1c86-4777-97e8-04df48fad842-hosts-file\") pod \"node-resolver-5tfxq\" (UID: \"a186a549-1c86-4777-97e8-04df48fad842\") " pod="openshift-dns/node-resolver-5tfxq" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.342155 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.352623 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.364874 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.372917 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5tfxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a186a549-1c86-4777-97e8-04df48fad842\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdktp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5tfxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.381067 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.389064 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.400242 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 09 07:46:31 crc kubenswrapper[4715]: W1009 07:46:31.413359 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-8a136036edc02eb4c8e1efb1d1d823b0e23d96d2864c178c4d81394ab2535bd2 WatchSource:0}: Error finding container 8a136036edc02eb4c8e1efb1d1d823b0e23d96d2864c178c4d81394ab2535bd2: Status 404 returned error can't find the container with id 8a136036edc02eb4c8e1efb1d1d823b0e23d96d2864c178c4d81394ab2535bd2 Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.421845 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.432173 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdktp\" (UniqueName: \"kubernetes.io/projected/a186a549-1c86-4777-97e8-04df48fad842-kube-api-access-mdktp\") pod \"node-resolver-5tfxq\" (UID: \"a186a549-1c86-4777-97e8-04df48fad842\") " pod="openshift-dns/node-resolver-5tfxq" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.432222 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a186a549-1c86-4777-97e8-04df48fad842-hosts-file\") pod \"node-resolver-5tfxq\" (UID: \"a186a549-1c86-4777-97e8-04df48fad842\") " pod="openshift-dns/node-resolver-5tfxq" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.432298 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a186a549-1c86-4777-97e8-04df48fad842-hosts-file\") pod \"node-resolver-5tfxq\" (UID: \"a186a549-1c86-4777-97e8-04df48fad842\") " pod="openshift-dns/node-resolver-5tfxq" Oct 09 07:46:31 crc kubenswrapper[4715]: W1009 07:46:31.432667 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-3922d23aa9f361338e4ebe207d47a776f6791c11f49c208f96548a5befeb8eb3 WatchSource:0}: Error finding container 3922d23aa9f361338e4ebe207d47a776f6791c11f49c208f96548a5befeb8eb3: Status 404 returned error can't find the container with id 3922d23aa9f361338e4ebe207d47a776f6791c11f49c208f96548a5befeb8eb3 Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.449899 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdktp\" (UniqueName: \"kubernetes.io/projected/a186a549-1c86-4777-97e8-04df48fad842-kube-api-access-mdktp\") 
pod \"node-resolver-5tfxq\" (UID: \"a186a549-1c86-4777-97e8-04df48fad842\") " pod="openshift-dns/node-resolver-5tfxq" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.580620 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5tfxq" Oct 09 07:46:31 crc kubenswrapper[4715]: W1009 07:46:31.591357 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda186a549_1c86_4777_97e8_04df48fad842.slice/crio-347ef04a3cac47d113afab677a56ce4fd8a82b23343140ed9eba5da027c75e29 WatchSource:0}: Error finding container 347ef04a3cac47d113afab677a56ce4fd8a82b23343140ed9eba5da027c75e29: Status 404 returned error can't find the container with id 347ef04a3cac47d113afab677a56ce4fd8a82b23343140ed9eba5da027c75e29 Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.633739 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.633811 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.633837 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:46:31 crc kubenswrapper[4715]: E1009 07:46:31.633977 4715 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 07:46:31 crc kubenswrapper[4715]: E1009 07:46:31.634017 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 07:46:32.634002997 +0000 UTC m=+23.326807005 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 07:46:31 crc kubenswrapper[4715]: E1009 07:46:31.634060 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:46:32.634054299 +0000 UTC m=+23.326858307 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:46:31 crc kubenswrapper[4715]: E1009 07:46:31.634108 4715 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 07:46:31 crc kubenswrapper[4715]: E1009 07:46:31.634127 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 07:46:32.634122141 +0000 UTC m=+23.326926149 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.637857 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-k7vwx"] Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.638315 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.639527 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-6vp75"] Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.639996 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.640344 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.640498 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.644062 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.644141 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.644207 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.644368 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.645037 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.645502 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.645601 4715 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-multus/multus-additional-cni-plugins-8gf4x"] Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.646223 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-z9ztn"] Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.649235 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.651291 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.652687 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.653138 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.655970 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.663316 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.672210 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.673839 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.673104 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.673253 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.673312 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.673555 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.673574 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.673624 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.693840 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5tfxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a186a549-1c86-4777-97e8-04df48fad842\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdktp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5tfxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.706216 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.717866 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.728145 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.735064 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-systemd-units\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.735103 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-log-socket\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.735126 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/acafd807-8875-4b4f-aba9-4f807ca336e7-proxy-tls\") pod \"machine-config-daemon-k7vwx\" (UID: \"acafd807-8875-4b4f-aba9-4f807ca336e7\") " 
pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.735156 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4tdb\" (UniqueName: \"kubernetes.io/projected/1d6cb14a-7329-4a80-aff2-acd9142558d3-kube-api-access-k4tdb\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.735234 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-hostroot\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.735280 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-host-var-lib-cni-multus\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.735348 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-node-log\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.735376 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-cnibin\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " 
pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.735395 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/76f34f31-285e-4f90-954d-888a59ad6080-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8gf4x\" (UID: \"76f34f31-285e-4f90-954d-888a59ad6080\") " pod="openshift-multus/multus-additional-cni-plugins-8gf4x" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.735411 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2szk\" (UniqueName: \"kubernetes.io/projected/76f34f31-285e-4f90-954d-888a59ad6080-kube-api-access-v2szk\") pod \"multus-additional-cni-plugins-8gf4x\" (UID: \"76f34f31-285e-4f90-954d-888a59ad6080\") " pod="openshift-multus/multus-additional-cni-plugins-8gf4x" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.735458 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.735484 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-host-slash\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.735499 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-run-ovn\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.735515 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-host-cni-bin\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.735536 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-etc-openvswitch\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.735552 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-system-cni-dir\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.735585 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1d6cb14a-7329-4a80-aff2-acd9142558d3-ovnkube-script-lib\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.735601 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-host-var-lib-kubelet\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.735642 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.735678 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-os-release\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.735692 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-multus-daemon-config\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.735712 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-multus-conf-dir\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.735746 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-host-kubelet\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.735767 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-run-systemd\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.735782 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1d6cb14a-7329-4a80-aff2-acd9142558d3-env-overrides\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: E1009 07:46:31.735794 4715 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 07:46:31 crc kubenswrapper[4715]: E1009 07:46:31.735817 4715 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 07:46:31 crc kubenswrapper[4715]: E1009 07:46:31.735833 4715 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.735802 4715 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1d6cb14a-7329-4a80-aff2-acd9142558d3-ovn-node-metrics-cert\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.735854 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-host-var-lib-cni-bin\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.735871 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-host-run-multus-certs\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: E1009 07:46:31.735914 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-09 07:46:32.7358876 +0000 UTC m=+23.428691608 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.736065 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-etc-kubernetes\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.736105 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-host-run-netns\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.736124 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-host-cni-netd\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.736145 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/acafd807-8875-4b4f-aba9-4f807ca336e7-rootfs\") pod \"machine-config-daemon-k7vwx\" (UID: \"acafd807-8875-4b4f-aba9-4f807ca336e7\") " 
pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.736162 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w6mp\" (UniqueName: \"kubernetes.io/projected/acafd807-8875-4b4f-aba9-4f807ca336e7-kube-api-access-4w6mp\") pod \"machine-config-daemon-k7vwx\" (UID: \"acafd807-8875-4b4f-aba9-4f807ca336e7\") " pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.736180 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-host-run-ovn-kubernetes\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.736195 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/76f34f31-285e-4f90-954d-888a59ad6080-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8gf4x\" (UID: \"76f34f31-285e-4f90-954d-888a59ad6080\") " pod="openshift-multus/multus-additional-cni-plugins-8gf4x" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.736231 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-run-openvswitch\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.736251 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/1d6cb14a-7329-4a80-aff2-acd9142558d3-ovnkube-config\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.736278 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-multus-socket-dir-parent\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.736303 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/76f34f31-285e-4f90-954d-888a59ad6080-system-cni-dir\") pod \"multus-additional-cni-plugins-8gf4x\" (UID: \"76f34f31-285e-4f90-954d-888a59ad6080\") " pod="openshift-multus/multus-additional-cni-plugins-8gf4x" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.736342 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/acafd807-8875-4b4f-aba9-4f807ca336e7-mcd-auth-proxy-config\") pod \"machine-config-daemon-k7vwx\" (UID: \"acafd807-8875-4b4f-aba9-4f807ca336e7\") " pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.736363 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-host-run-k8s-cni-cncf-io\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.736381 4715 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-host-run-netns\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.736410 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/76f34f31-285e-4f90-954d-888a59ad6080-os-release\") pod \"multus-additional-cni-plugins-8gf4x\" (UID: \"76f34f31-285e-4f90-954d-888a59ad6080\") " pod="openshift-multus/multus-additional-cni-plugins-8gf4x" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.736468 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.736502 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-var-lib-openvswitch\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.736519 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-multus-cni-dir\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.736541 4715 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-cni-binary-copy\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.736568 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz46q\" (UniqueName: \"kubernetes.io/projected/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-kube-api-access-zz46q\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.736591 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/76f34f31-285e-4f90-954d-888a59ad6080-cnibin\") pod \"multus-additional-cni-plugins-8gf4x\" (UID: \"76f34f31-285e-4f90-954d-888a59ad6080\") " pod="openshift-multus/multus-additional-cni-plugins-8gf4x" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.736612 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/76f34f31-285e-4f90-954d-888a59ad6080-cni-binary-copy\") pod \"multus-additional-cni-plugins-8gf4x\" (UID: \"76f34f31-285e-4f90-954d-888a59ad6080\") " pod="openshift-multus/multus-additional-cni-plugins-8gf4x" Oct 09 07:46:31 crc kubenswrapper[4715]: E1009 07:46:31.736701 4715 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 07:46:31 crc kubenswrapper[4715]: E1009 07:46:31.736711 4715 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 07:46:31 crc kubenswrapper[4715]: E1009 07:46:31.736720 4715 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 07:46:31 crc kubenswrapper[4715]: E1009 07:46:31.736743 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-09 07:46:32.736735614 +0000 UTC m=+23.429539622 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.738758 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.749302 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.758599 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafd807-8875-4b4f-aba9-4f807ca336e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7vwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.770885 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.782315 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76f34f31-285e-4f90-954d-888a59ad6080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8gf4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.801713 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6cb14a-7329-4a80-aff2-acd9142558d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z9ztn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.811416 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.822249 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6vp75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz46q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6vp75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.837969 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/acafd807-8875-4b4f-aba9-4f807ca336e7-mcd-auth-proxy-config\") pod \"machine-config-daemon-k7vwx\" (UID: \"acafd807-8875-4b4f-aba9-4f807ca336e7\") " pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.838021 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-host-run-k8s-cni-cncf-io\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.838059 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-host-run-netns\") pod \"multus-6vp75\" (UID: 
\"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.838085 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/76f34f31-285e-4f90-954d-888a59ad6080-cnibin\") pod \"multus-additional-cni-plugins-8gf4x\" (UID: \"76f34f31-285e-4f90-954d-888a59ad6080\") " pod="openshift-multus/multus-additional-cni-plugins-8gf4x" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.838106 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/76f34f31-285e-4f90-954d-888a59ad6080-os-release\") pod \"multus-additional-cni-plugins-8gf4x\" (UID: \"76f34f31-285e-4f90-954d-888a59ad6080\") " pod="openshift-multus/multus-additional-cni-plugins-8gf4x" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.838132 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-var-lib-openvswitch\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.838150 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-multus-cni-dir\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.838155 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-host-run-k8s-cni-cncf-io\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " 
pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.838254 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.838172 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-cni-binary-copy\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.838367 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz46q\" (UniqueName: \"kubernetes.io/projected/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-kube-api-access-zz46q\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.838366 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-var-lib-openvswitch\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc 
kubenswrapper[4715]: I1009 07:46:31.838226 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-host-run-netns\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.838255 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/76f34f31-285e-4f90-954d-888a59ad6080-cnibin\") pod \"multus-additional-cni-plugins-8gf4x\" (UID: \"76f34f31-285e-4f90-954d-888a59ad6080\") " pod="openshift-multus/multus-additional-cni-plugins-8gf4x" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.838400 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/76f34f31-285e-4f90-954d-888a59ad6080-os-release\") pod \"multus-additional-cni-plugins-8gf4x\" (UID: \"76f34f31-285e-4f90-954d-888a59ad6080\") " pod="openshift-multus/multus-additional-cni-plugins-8gf4x" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.838574 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-multus-cni-dir\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.838853 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/acafd807-8875-4b4f-aba9-4f807ca336e7-mcd-auth-proxy-config\") pod \"machine-config-daemon-k7vwx\" (UID: \"acafd807-8875-4b4f-aba9-4f807ca336e7\") " pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.838897 4715 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-cni-binary-copy\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.839018 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/76f34f31-285e-4f90-954d-888a59ad6080-cni-binary-copy\") pod \"multus-additional-cni-plugins-8gf4x\" (UID: \"76f34f31-285e-4f90-954d-888a59ad6080\") " pod="openshift-multus/multus-additional-cni-plugins-8gf4x" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.839065 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/76f34f31-285e-4f90-954d-888a59ad6080-cni-binary-copy\") pod \"multus-additional-cni-plugins-8gf4x\" (UID: \"76f34f31-285e-4f90-954d-888a59ad6080\") " pod="openshift-multus/multus-additional-cni-plugins-8gf4x" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.839115 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-systemd-units\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.839191 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-systemd-units\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.839142 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" 
(UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-log-socket\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.839296 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-log-socket\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.839300 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/acafd807-8875-4b4f-aba9-4f807ca336e7-proxy-tls\") pod \"machine-config-daemon-k7vwx\" (UID: \"acafd807-8875-4b4f-aba9-4f807ca336e7\") " pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.839341 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4tdb\" (UniqueName: \"kubernetes.io/projected/1d6cb14a-7329-4a80-aff2-acd9142558d3-kube-api-access-k4tdb\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.839361 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-hostroot\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.839379 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-cnibin\") pod \"multus-6vp75\" 
(UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.839399 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-host-var-lib-cni-multus\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.839435 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-node-log\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.839454 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/76f34f31-285e-4f90-954d-888a59ad6080-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8gf4x\" (UID: \"76f34f31-285e-4f90-954d-888a59ad6080\") " pod="openshift-multus/multus-additional-cni-plugins-8gf4x" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.839458 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-hostroot\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.839473 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2szk\" (UniqueName: \"kubernetes.io/projected/76f34f31-285e-4f90-954d-888a59ad6080-kube-api-access-v2szk\") pod \"multus-additional-cni-plugins-8gf4x\" (UID: \"76f34f31-285e-4f90-954d-888a59ad6080\") " 
pod="openshift-multus/multus-additional-cni-plugins-8gf4x" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.839497 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.839539 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-host-slash\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.839560 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-run-ovn\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.839584 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-node-log\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.839601 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-host-cni-bin\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 
07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.839641 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-etc-openvswitch\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.839645 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-cnibin\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.839665 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-system-cni-dir\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.839680 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-run-ovn\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.839692 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-host-cni-bin\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.839679 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-host-slash\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.839687 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1d6cb14a-7329-4a80-aff2-acd9142558d3-ovnkube-script-lib\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.839724 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-host-var-lib-cni-multus\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.839732 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.839753 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-host-var-lib-kubelet\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.839785 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-host-var-lib-kubelet\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.839756 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-system-cni-dir\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.839812 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-os-release\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.839940 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-multus-daemon-config\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.839892 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-os-release\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.839955 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-etc-openvswitch\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.839966 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-host-var-lib-cni-bin\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.839991 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-host-var-lib-cni-bin\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.840217 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-multus-conf-dir\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.840269 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-multus-conf-dir\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.840306 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-host-kubelet\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.840391 4715 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-host-kubelet\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.840403 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1d6cb14a-7329-4a80-aff2-acd9142558d3-ovnkube-script-lib\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.840460 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-run-systemd\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.840506 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1d6cb14a-7329-4a80-aff2-acd9142558d3-env-overrides\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.840537 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1d6cb14a-7329-4a80-aff2-acd9142558d3-ovn-node-metrics-cert\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.840463 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-run-systemd\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.840653 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-multus-daemon-config\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.840733 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-host-run-multus-certs\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.840785 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/76f34f31-285e-4f90-954d-888a59ad6080-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8gf4x\" (UID: \"76f34f31-285e-4f90-954d-888a59ad6080\") " pod="openshift-multus/multus-additional-cni-plugins-8gf4x" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.841176 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1d6cb14a-7329-4a80-aff2-acd9142558d3-env-overrides\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.841212 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-host-run-multus-certs\") pod 
\"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.841257 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-etc-kubernetes\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.841304 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-host-run-netns\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.841329 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-host-cni-netd\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.841404 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/acafd807-8875-4b4f-aba9-4f807ca336e7-rootfs\") pod \"machine-config-daemon-k7vwx\" (UID: \"acafd807-8875-4b4f-aba9-4f807ca336e7\") " pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.841452 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w6mp\" (UniqueName: \"kubernetes.io/projected/acafd807-8875-4b4f-aba9-4f807ca336e7-kube-api-access-4w6mp\") pod \"machine-config-daemon-k7vwx\" (UID: \"acafd807-8875-4b4f-aba9-4f807ca336e7\") " 
pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.841475 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-host-run-ovn-kubernetes\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.841502 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/76f34f31-285e-4f90-954d-888a59ad6080-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8gf4x\" (UID: \"76f34f31-285e-4f90-954d-888a59ad6080\") " pod="openshift-multus/multus-additional-cni-plugins-8gf4x" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.841527 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-run-openvswitch\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.841550 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1d6cb14a-7329-4a80-aff2-acd9142558d3-ovnkube-config\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.841573 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-multus-socket-dir-parent\") pod \"multus-6vp75\" (UID: 
\"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.841581 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/acafd807-8875-4b4f-aba9-4f807ca336e7-rootfs\") pod \"machine-config-daemon-k7vwx\" (UID: \"acafd807-8875-4b4f-aba9-4f807ca336e7\") " pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.841596 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/76f34f31-285e-4f90-954d-888a59ad6080-system-cni-dir\") pod \"multus-additional-cni-plugins-8gf4x\" (UID: \"76f34f31-285e-4f90-954d-888a59ad6080\") " pod="openshift-multus/multus-additional-cni-plugins-8gf4x" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.841620 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-etc-kubernetes\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.841684 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-host-cni-netd\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.841686 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-multus-socket-dir-parent\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 
crc kubenswrapper[4715]: I1009 07:46:31.841685 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-run-openvswitch\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.842023 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/76f34f31-285e-4f90-954d-888a59ad6080-system-cni-dir\") pod \"multus-additional-cni-plugins-8gf4x\" (UID: \"76f34f31-285e-4f90-954d-888a59ad6080\") " pod="openshift-multus/multus-additional-cni-plugins-8gf4x" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.842081 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-host-run-ovn-kubernetes\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.842119 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-host-run-netns\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.842699 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1d6cb14a-7329-4a80-aff2-acd9142558d3-ovnkube-config\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.842802 4715 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/76f34f31-285e-4f90-954d-888a59ad6080-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8gf4x\" (UID: \"76f34f31-285e-4f90-954d-888a59ad6080\") " pod="openshift-multus/multus-additional-cni-plugins-8gf4x" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.844763 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/acafd807-8875-4b4f-aba9-4f807ca336e7-proxy-tls\") pod \"machine-config-daemon-k7vwx\" (UID: \"acafd807-8875-4b4f-aba9-4f807ca336e7\") " pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.844778 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1d6cb14a-7329-4a80-aff2-acd9142558d3-ovn-node-metrics-cert\") pod \"ovnkube-node-z9ztn\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.858074 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5tfxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a186a549-1c86-4777-97e8-04df48fad842\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdktp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5tfxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.860392 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz46q\" (UniqueName: \"kubernetes.io/projected/6e61f2cb-cd6d-46d6-bbb6-dd99919b893d-kube-api-access-zz46q\") pod \"multus-6vp75\" (UID: \"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\") " pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.871127 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4tdb\" (UniqueName: \"kubernetes.io/projected/1d6cb14a-7329-4a80-aff2-acd9142558d3-kube-api-access-k4tdb\") pod \"ovnkube-node-z9ztn\" (UID: 
\"1d6cb14a-7329-4a80-aff2-acd9142558d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.873759 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w6mp\" (UniqueName: \"kubernetes.io/projected/acafd807-8875-4b4f-aba9-4f807ca336e7-kube-api-access-4w6mp\") pod \"machine-config-daemon-k7vwx\" (UID: \"acafd807-8875-4b4f-aba9-4f807ca336e7\") " pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.873886 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.881456 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2szk\" (UniqueName: \"kubernetes.io/projected/76f34f31-285e-4f90-954d-888a59ad6080-kube-api-access-v2szk\") pod \"multus-additional-cni-plugins-8gf4x\" (UID: \"76f34f31-285e-4f90-954d-888a59ad6080\") " pod="openshift-multus/multus-additional-cni-plugins-8gf4x" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.887953 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.905515 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.918472 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafd807-8875-4b4f-aba9-4f807ca336e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7vwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.976946 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.987015 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-6vp75" Oct 09 07:46:31 crc kubenswrapper[4715]: W1009 07:46:31.987362 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacafd807_8875_4b4f_aba9_4f807ca336e7.slice/crio-50d144b2af0c7b928ab8a8d8bedb1319940fc1048d3cd2f7543d0d5aa338b200 WatchSource:0}: Error finding container 50d144b2af0c7b928ab8a8d8bedb1319940fc1048d3cd2f7543d0d5aa338b200: Status 404 returned error can't find the container with id 50d144b2af0c7b928ab8a8d8bedb1319940fc1048d3cd2f7543d0d5aa338b200 Oct 09 07:46:31 crc kubenswrapper[4715]: I1009 07:46:31.995252 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" Oct 09 07:46:31 crc kubenswrapper[4715]: W1009 07:46:31.997833 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e61f2cb_cd6d_46d6_bbb6_dd99919b893d.slice/crio-9c1562ed472e0eeb191c064a5cd42965f0ea2bac207c48f16c5d6226d09a3351 WatchSource:0}: Error finding container 9c1562ed472e0eeb191c064a5cd42965f0ea2bac207c48f16c5d6226d09a3351: Status 404 returned error can't find the container with id 9c1562ed472e0eeb191c064a5cd42965f0ea2bac207c48f16c5d6226d09a3351 Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.001622 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:32 crc kubenswrapper[4715]: W1009 07:46:32.016048 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76f34f31_285e_4f90_954d_888a59ad6080.slice/crio-ace2714b8abacbe31505492b363eec95a42fef97e50f8cbdd3d609fd7090cab9 WatchSource:0}: Error finding container ace2714b8abacbe31505492b363eec95a42fef97e50f8cbdd3d609fd7090cab9: Status 404 returned error can't find the container with id ace2714b8abacbe31505492b363eec95a42fef97e50f8cbdd3d609fd7090cab9 Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.148524 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.149635 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.151340 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.152222 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.153633 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.154523 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.155308 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.157179 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.158036 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.159294 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.160087 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.161538 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.162256 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.162974 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.164599 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.165552 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.167022 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.167641 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.168608 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.170039 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.170575 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.171590 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.172043 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.173109 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.174436 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.175949 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.178112 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.178784 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.180074 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.180720 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.181768 4715 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.181894 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.184544 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.185249 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.186277 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.187932 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.188601 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.189542 4715 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.190161 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.191502 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.192030 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.193175 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.193852 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.194925 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.195369 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.196265 4715 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.196812 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.197866 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.198307 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.199135 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.199603 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.200571 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.201206 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.201668 4715 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.282275 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6vp75" event={"ID":"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d","Type":"ContainerStarted","Data":"d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9"} Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.282340 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6vp75" event={"ID":"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d","Type":"ContainerStarted","Data":"9c1562ed472e0eeb191c064a5cd42965f0ea2bac207c48f16c5d6226d09a3351"} Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.284112 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"3922d23aa9f361338e4ebe207d47a776f6791c11f49c208f96548a5befeb8eb3"} Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.285508 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"6d1166d9eb763c499c126069c02d693a608549e5cbb8d4862551b7555100324b"} Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.285554 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"2bd73e702d332f62730ccb28f30003e68d37587bba7c760fb594b0bc63c648f0"} Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.287941 4715 generic.go:334] "Generic (PLEG): container finished" podID="1d6cb14a-7329-4a80-aff2-acd9142558d3" 
containerID="ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3" exitCode=0 Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.288066 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" event={"ID":"1d6cb14a-7329-4a80-aff2-acd9142558d3","Type":"ContainerDied","Data":"ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3"} Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.288115 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" event={"ID":"1d6cb14a-7329-4a80-aff2-acd9142558d3","Type":"ContainerStarted","Data":"21634b6b9a5f51a41516485e45af2f2a5df4f2c3bcbcbe10016df9e5bad53916"} Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.290464 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5tfxq" event={"ID":"a186a549-1c86-4777-97e8-04df48fad842","Type":"ContainerStarted","Data":"1312ab6651462ae52831c89894987a598b1623159dddca34a4848dfbc86191ce"} Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.290512 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5tfxq" event={"ID":"a186a549-1c86-4777-97e8-04df48fad842","Type":"ContainerStarted","Data":"347ef04a3cac47d113afab677a56ce4fd8a82b23343140ed9eba5da027c75e29"} Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.293293 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"770b320ad49f63618e01bc73df4df10cb694b01d658727bb395ff59e6a609442"} Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.293351 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"958c52c695933700cd3b19f8c6539c5566827f57a22ed1fea9b6326e2261f673"} Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.293366 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"8a136036edc02eb4c8e1efb1d1d823b0e23d96d2864c178c4d81394ab2535bd2"} Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.296562 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:32Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.297212 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" event={"ID":"76f34f31-285e-4f90-954d-888a59ad6080","Type":"ContainerStarted","Data":"ace2714b8abacbe31505492b363eec95a42fef97e50f8cbdd3d609fd7090cab9"} Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.305135 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" event={"ID":"acafd807-8875-4b4f-aba9-4f807ca336e7","Type":"ContainerStarted","Data":"4eab9be18db2c21136a797167f3282bba0639147e04085d9c930fe113cd5bc94"} Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.305202 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" event={"ID":"acafd807-8875-4b4f-aba9-4f807ca336e7","Type":"ContainerStarted","Data":"50d144b2af0c7b928ab8a8d8bedb1319940fc1048d3cd2f7543d0d5aa338b200"} Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 
07:46:32.313705 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:32Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.330857 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6vp75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz46q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6vp75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:32Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.343528 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5tfxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a186a549-1c86-4777-97e8-04df48fad842\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdktp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5tfxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:32Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.374068 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:32Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.392829 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafd807-8875-4b4f-aba9-4f807ca336e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7vwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:32Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.421893 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:32Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.435311 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:32Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.451101 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:32Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.469643 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76f34f31-285e-4f90-954d-888a59ad6080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8gf4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:32Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.494469 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6cb14a-7329-4a80-aff2-acd9142558d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z9ztn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:32Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.507743 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:32Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.523546 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:32Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.534991 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafd807-8875-4b4f-aba9-4f807ca336e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7vwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:32Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.553337 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1166d9eb763c499c126069c02d693a608549e5cbb8d4862551b7555100324b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\
",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:32Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.578846 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6cb14a-7329-4a80-aff2-acd9142558d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z9ztn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:32Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.596598 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://770b320ad49f63618e01bc73df4df10cb694b01d658727bb395ff59e6a609442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://958c52c695933700cd3b19f8c6539c5566827f57a22ed1fea9b6326e2261f673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:32Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.617349 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76f34f31-285e-4f90-954d-888a59ad6080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8gf4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:32Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.632156 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:32Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.645844 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:32Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.649431 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.649568 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:46:32 crc 
kubenswrapper[4715]: I1009 07:46:32.649591 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:46:32 crc kubenswrapper[4715]: E1009 07:46:32.649664 4715 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 07:46:32 crc kubenswrapper[4715]: E1009 07:46:32.649708 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 07:46:34.649695509 +0000 UTC m=+25.342499517 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 07:46:32 crc kubenswrapper[4715]: E1009 07:46:32.649997 4715 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 07:46:32 crc kubenswrapper[4715]: E1009 07:46:32.650026 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 07:46:34.650018758 +0000 UTC m=+25.342822766 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 07:46:32 crc kubenswrapper[4715]: E1009 07:46:32.650134 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:46:34.650105691 +0000 UTC m=+25.342909699 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.664192 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6vp75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz46q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6vp75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:32Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.678155 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5tfxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a186a549-1c86-4777-97e8-04df48fad842\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1312ab6651462ae52831c89894987a598b1623159dddca34a4848dfbc86191ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdktp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5tfxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:32Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.690746 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.702266 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.703450 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.710289 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1166d9eb763c499c126069c02d693a608549e5cbb8d4862551b7555100324b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:32Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.730355 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:32Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.750774 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.750827 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:46:32 crc kubenswrapper[4715]: E1009 07:46:32.750942 4715 
projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 07:46:32 crc kubenswrapper[4715]: E1009 07:46:32.750958 4715 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 07:46:32 crc kubenswrapper[4715]: E1009 07:46:32.750970 4715 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 07:46:32 crc kubenswrapper[4715]: E1009 07:46:32.751010 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-09 07:46:34.750997066 +0000 UTC m=+25.443801074 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 07:46:32 crc kubenswrapper[4715]: E1009 07:46:32.751319 4715 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 07:46:32 crc kubenswrapper[4715]: E1009 07:46:32.751351 4715 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 07:46:32 crc kubenswrapper[4715]: E1009 07:46:32.751360 4715 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 07:46:32 crc kubenswrapper[4715]: E1009 07:46:32.751384 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-09 07:46:34.751377347 +0000 UTC m=+25.444181355 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.751840 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:32Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.772253 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafd807-8875-4b4f-aba9-4f807ca336e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7vwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:32Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.792552 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://770b320ad49f63618e01bc73df4df10cb694b01d658727bb395ff59e6a609442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"e
nv-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://958c52c695933700cd3b19f8c6539c5566827f57a22ed1fea9b6326e2261f673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:32Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.816039 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76f34f31-285e-4f90-954d-888a59ad6080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8gf4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:32Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.836832 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6cb14a-7329-4a80-aff2-acd9142558d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z9ztn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:32Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.853176 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:32Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.868310 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:32Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.885004 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6vp75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz46q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6vp75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:32Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.900408 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5tfxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a186a549-1c86-4777-97e8-04df48fad842\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1312ab6651462ae52831c89894987a598b1623159dddca34a4848dfbc86191ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdktp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5tfxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:32Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.922200 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://770b320ad49f63618e01bc73df4df10cb694b01d658727bb395ff59e6a609442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://958c52c695933700cd3b19f8c6539c5566827f57a22ed1fea9b6326e2261f673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:32Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.937874 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76f34f31-285e-4f90-954d-888a59ad6080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8gf4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:32Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.956190 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6cb14a-7329-4a80-aff2-acd9142558d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z9ztn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:32Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.970301 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a1a53d8-70da-4f6d-b92f-801a563952ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19465e3367078df139314e3b29a1b05d15c7ab22cb681c92e2a0394aaaaf887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b8a525d8b7ec3e08d688a4f5419e937a01e5dfa1de58caa9e3fad5ee5ed593f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8906a42b46d23c122035098bfd88203a6418fe2e0ef806e7babbc9670e2c89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d17b0d82be9febaeb884dea2cfb61c5f189c0fce2aff03c02bbf020d89828f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:32Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.984639 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:32Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:32 crc kubenswrapper[4715]: I1009 07:46:32.996904 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6vp75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz46q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6vp75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:32Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.010929 4715 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:33Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.025776 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5tfxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a186a549-1c86-4777-97e8-04df48fad842\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1312ab6651462ae52831c89894987a598b1623159dddca34a4848dfbc86191ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdktp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5tfxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:33Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.040619 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1166d9eb763c499c126069c02d693a608549e5cbb8d4862551b7555100324b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:33Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.054106 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:33Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.066563 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:33Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.078460 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafd807-8875-4b4f-aba9-4f807ca336e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7vwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:33Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.136394 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.136450 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.136541 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:46:33 crc kubenswrapper[4715]: E1009 07:46:33.136543 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:46:33 crc kubenswrapper[4715]: E1009 07:46:33.136677 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:46:33 crc kubenswrapper[4715]: E1009 07:46:33.136761 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.312485 4715 generic.go:334] "Generic (PLEG): container finished" podID="76f34f31-285e-4f90-954d-888a59ad6080" containerID="d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2" exitCode=0 Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.312546 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" event={"ID":"76f34f31-285e-4f90-954d-888a59ad6080","Type":"ContainerDied","Data":"d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2"} Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.316885 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" event={"ID":"acafd807-8875-4b4f-aba9-4f807ca336e7","Type":"ContainerStarted","Data":"1277c6a868bcd62e2cfc7dda77ccba4f206f4216eec40ceb53ed8c09aebd5eaa"} Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.327383 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" event={"ID":"1d6cb14a-7329-4a80-aff2-acd9142558d3","Type":"ContainerStarted","Data":"80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff"} Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.327462 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" event={"ID":"1d6cb14a-7329-4a80-aff2-acd9142558d3","Type":"ContainerStarted","Data":"b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4"} Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.327477 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" event={"ID":"1d6cb14a-7329-4a80-aff2-acd9142558d3","Type":"ContainerStarted","Data":"76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b"} Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.327490 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" event={"ID":"1d6cb14a-7329-4a80-aff2-acd9142558d3","Type":"ContainerStarted","Data":"1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb"} Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.327501 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" event={"ID":"1d6cb14a-7329-4a80-aff2-acd9142558d3","Type":"ContainerStarted","Data":"e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab"} Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.333549 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1166d9eb763c499c126069c02d693a608549e5cbb8d4862551b7555100324b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:33Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.353579 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:33Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.379787 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:33Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.408666 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafd807-8875-4b4f-aba9-4f807ca336e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7vwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:33Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.431390 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://770b320ad49f63618e01bc73df4df10cb694b01d658727bb395ff59e6a609442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://958c52c695933700cd3b19f8c6539c5566827f57a22ed1fea9b6326e2261f673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:33Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.464812 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76f34f31-285e-4f90-954d-888a59ad6080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8gf4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:33Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.486920 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.490580 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6cb14a-7329-4a80-aff2-acd9142558d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z9ztn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:33Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.491524 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-pqt86"] Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.491992 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-pqt86" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.493773 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.493837 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.493942 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.502840 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.505960 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a1a53d8-70da-4f6d-b92f-801a563952ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19465e3367078df139314e3b29a1b05d15c7ab22cb681c92e2a0394aaaaf887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b8a525d8b7ec3e08d688a4f5419e937a01e5dfa1de58caa9e3fad5ee5ed593f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8906a42b46d23c122035098bfd88203a6418fe2e0ef806e7babbc9670e2c89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d17b0d82be9febaeb884dea2cfb61c5f189c0fce2aff03c02bbf020d89828f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:33Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.506450 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.519818 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.539197 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6vp75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz46q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6vp75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:33Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.553884 4715 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:33Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.583406 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:33Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.623886 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5tfxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a186a549-1c86-4777-97e8-04df48fad842\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1312ab6651462ae52831c89894987a598b1623159dddca34a4848dfbc86191ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdktp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5tfxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:33Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.660445 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c54c0f2-0671-4f29-a4b8-7ea32758200c-host\") pod \"node-ca-pqt86\" (UID: \"8c54c0f2-0671-4f29-a4b8-7ea32758200c\") " pod="openshift-image-registry/node-ca-pqt86" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.660490 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8c54c0f2-0671-4f29-a4b8-7ea32758200c-serviceca\") pod \"node-ca-pqt86\" (UID: \"8c54c0f2-0671-4f29-a4b8-7ea32758200c\") " pod="openshift-image-registry/node-ca-pqt86" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.660550 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkfzn\" (UniqueName: \"kubernetes.io/projected/8c54c0f2-0671-4f29-a4b8-7ea32758200c-kube-api-access-zkfzn\") pod \"node-ca-pqt86\" (UID: \"8c54c0f2-0671-4f29-a4b8-7ea32758200c\") " pod="openshift-image-registry/node-ca-pqt86" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.666886 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a1a53d8-70da-4f6d-b92f-801a563952ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19465e3367078df139314e3b29a1b05d15c7ab22cb681c92e2a0394aaaaf887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b8a525d8b7ec3e08d688a4f5419e937a01e5dfa1de58caa9e3fad5ee5ed593f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8906a42b46d23c122035098bfd88203a6418fe2e0ef806e7babbc9670e2c89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d17b0d82be9febaeb884dea2cfb61c5f189c0fce2aff03c02bbf020d89828f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:33Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.708639 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://770b320ad49f63618e01bc73df4df10cb694b01d658727bb395ff59e6a609442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://958c52c695933700cd3b19f8c6539c5566827f57a22ed1fea9b6326e2261f673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:33Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.745702 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76f34f31-285e-4f90-954d-888a59ad6080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8gf4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:33Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.762085 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c54c0f2-0671-4f29-a4b8-7ea32758200c-host\") pod \"node-ca-pqt86\" (UID: \"8c54c0f2-0671-4f29-a4b8-7ea32758200c\") " pod="openshift-image-registry/node-ca-pqt86" Oct 09 
07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.762129 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8c54c0f2-0671-4f29-a4b8-7ea32758200c-serviceca\") pod \"node-ca-pqt86\" (UID: \"8c54c0f2-0671-4f29-a4b8-7ea32758200c\") " pod="openshift-image-registry/node-ca-pqt86" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.762183 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkfzn\" (UniqueName: \"kubernetes.io/projected/8c54c0f2-0671-4f29-a4b8-7ea32758200c-kube-api-access-zkfzn\") pod \"node-ca-pqt86\" (UID: \"8c54c0f2-0671-4f29-a4b8-7ea32758200c\") " pod="openshift-image-registry/node-ca-pqt86" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.762221 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c54c0f2-0671-4f29-a4b8-7ea32758200c-host\") pod \"node-ca-pqt86\" (UID: \"8c54c0f2-0671-4f29-a4b8-7ea32758200c\") " pod="openshift-image-registry/node-ca-pqt86" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.763874 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8c54c0f2-0671-4f29-a4b8-7ea32758200c-serviceca\") pod \"node-ca-pqt86\" (UID: \"8c54c0f2-0671-4f29-a4b8-7ea32758200c\") " pod="openshift-image-registry/node-ca-pqt86" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.794940 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6cb14a-7329-4a80-aff2-acd9142558d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z9ztn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:33Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.818931 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkfzn\" (UniqueName: \"kubernetes.io/projected/8c54c0f2-0671-4f29-a4b8-7ea32758200c-kube-api-access-zkfzn\") pod \"node-ca-pqt86\" (UID: \"8c54c0f2-0671-4f29-a4b8-7ea32758200c\") " pod="openshift-image-registry/node-ca-pqt86" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.850772 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:33Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.884718 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:33Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.924678 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6vp75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz46q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6vp75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:33Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.961853 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-pqt86" Oct 09 07:46:33 crc kubenswrapper[4715]: I1009 07:46:33.966793 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqt86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c54c0f2-0671-4f29-a4b8-7ea32758200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkfzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqt86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:33Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:34 crc kubenswrapper[4715]: I1009 07:46:34.002764 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5tfxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a186a549-1c86-4777-97e8-04df48fad842\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1312ab6651462ae52831c89894987a598b1623159dddca34a4848dfbc86191ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdktp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5tfxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:34Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:34 crc kubenswrapper[4715]: I1009 07:46:34.052599 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8095fd96-32bb-459e-b524-6cf679b95b21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc320b6b98a82e720d488ce9958599e2f732919ac43ccb3834e5dd90042077ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7424a86e3801e7aea51cf175c8cbb65ae15a4df07426022cf9e4ba6b82c13924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://149ab2506eb7fd28879c9734c5189259cde574afb0a4f7708b0b84c5a514c996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a96e0c2dc207504189aac5f2822e4fc8fdc58a19388a3d081553ecec07f03bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0bc91552a8f6c9f83684aa851ef1b07fa4562c736427c3264762f4486b65c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:34Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:34 crc kubenswrapper[4715]: I1009 07:46:34.088228 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1166d9eb763c499c126069c02d693a608549e5cbb8d4862551b7555100324b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:34Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:34 crc kubenswrapper[4715]: I1009 07:46:34.126403 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:34Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:34 crc kubenswrapper[4715]: I1009 07:46:34.163452 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:34Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:34 crc kubenswrapper[4715]: I1009 07:46:34.206709 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafd807-8875-4b4f-aba9-4f807ca336e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1277c6a868bcd62e2cfc7dda77ccba4f206f4216eec40ceb53ed8c09aebd5eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eab9be18db2c21136a797167f3282bba0639147
e04085d9c930fe113cd5bc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7vwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:34Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:34 crc kubenswrapper[4715]: I1009 07:46:34.332610 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" event={"ID":"76f34f31-285e-4f90-954d-888a59ad6080","Type":"ContainerStarted","Data":"30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49"} Oct 09 07:46:34 crc kubenswrapper[4715]: I1009 07:46:34.334479 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"8875bf33dca9b2d1d7bf66aaeb2fa239b455ea46d1e6790a9f6e1c5c2da2ec6e"} Oct 09 07:46:34 crc kubenswrapper[4715]: I1009 07:46:34.338989 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" event={"ID":"1d6cb14a-7329-4a80-aff2-acd9142558d3","Type":"ContainerStarted","Data":"b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5"} Oct 09 07:46:34 crc kubenswrapper[4715]: I1009 07:46:34.341051 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pqt86" event={"ID":"8c54c0f2-0671-4f29-a4b8-7ea32758200c","Type":"ContainerStarted","Data":"a835e316a2f8a0cc8bf44d5edd66b376fd20a6f7bf6a467a611e04e5fcc9993f"} Oct 09 07:46:34 crc kubenswrapper[4715]: I1009 07:46:34.341131 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pqt86" event={"ID":"8c54c0f2-0671-4f29-a4b8-7ea32758200c","Type":"ContainerStarted","Data":"9162a349b0fd935433c766ea9cc5eff6ad601a2f1b6eef8cf60acc19112c7634"} Oct 09 07:46:34 crc kubenswrapper[4715]: I1009 07:46:34.346625 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqt86" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c54c0f2-0671-4f29-a4b8-7ea32758200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkfzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqt86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:34Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:34 crc kubenswrapper[4715]: I1009 07:46:34.362545 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:34Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:34 crc kubenswrapper[4715]: I1009 07:46:34.376226 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:34Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:34 crc kubenswrapper[4715]: I1009 07:46:34.392121 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6vp75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz46q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6vp75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:34Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:34 crc kubenswrapper[4715]: I1009 07:46:34.405525 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5tfxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a186a549-1c86-4777-97e8-04df48fad842\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1312ab6651462ae52831c89894987a598b1623159dddca34a4848dfbc86191ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdktp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5tfxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:34Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:34 crc kubenswrapper[4715]: I1009 07:46:34.447000 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1166d9eb763c499c126069c02d693a608549e5cbb8d4862551b7555100324b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:34Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:34 crc kubenswrapper[4715]: I1009 07:46:34.484531 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:34Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:34 crc kubenswrapper[4715]: I1009 07:46:34.525677 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:34Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:34 crc kubenswrapper[4715]: I1009 07:46:34.563673 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafd807-8875-4b4f-aba9-4f807ca336e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1277c6a868bcd62e2cfc7dda77ccba4f206f4216eec40ceb53ed8c09aebd5eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eab9be18db2c21136a797167f3282bba0639147
e04085d9c930fe113cd5bc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7vwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:34Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:34 crc kubenswrapper[4715]: I1009 07:46:34.612635 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8095fd96-32bb-459e-b524-6cf679b95b21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc320b6b98a82e720d488ce9958599e2f732919ac43ccb3834e5dd90042077ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7424a86e3801e7aea51cf175c8cbb65ae15a4df07426022cf9e4ba6b82c13924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://149ab2506eb7fd28879c9734c5189259cde574afb0a4f7708b0b84c5a514c996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a96e0c2dc207504189aac5f2822e4fc8fdc58a19388a3d081553ecec07f03bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0bc91552a8f6c9f83684aa851ef1b07fa4562c736427c3264762f4486b65c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:34Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:34 crc kubenswrapper[4715]: I1009 07:46:34.647911 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76f34f31-285e-4f90-954d-888a59ad6080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"st
arted\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8gf4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:34Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:34 crc kubenswrapper[4715]: I1009 07:46:34.671894 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 
07:46:34 crc kubenswrapper[4715]: I1009 07:46:34.672014 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:46:34 crc kubenswrapper[4715]: I1009 07:46:34.672039 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:46:34 crc kubenswrapper[4715]: E1009 07:46:34.672075 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:46:38.672046411 +0000 UTC m=+29.364850419 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:46:34 crc kubenswrapper[4715]: E1009 07:46:34.672149 4715 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 07:46:34 crc kubenswrapper[4715]: E1009 07:46:34.672203 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 07:46:38.672192275 +0000 UTC m=+29.364996283 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 07:46:34 crc kubenswrapper[4715]: E1009 07:46:34.672279 4715 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 07:46:34 crc kubenswrapper[4715]: E1009 07:46:34.672388 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-09 07:46:38.67236304 +0000 UTC m=+29.365167048 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 07:46:34 crc kubenswrapper[4715]: I1009 07:46:34.693173 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6cb14a-7329-4a80-aff2-acd9142558d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z9ztn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:34Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:34 crc kubenswrapper[4715]: I1009 07:46:34.727052 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a1a53d8-70da-4f6d-b92f-801a563952ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19465e3367078df139314e3b29a1b05d15c7ab22cb681c92e2a0394aaaaf887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b8a525d8b7ec3e08d688a4f5419e937a01e5dfa1de58caa9e3fad5ee5ed593f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8906a42b46d23c122035098bfd88203a6418fe2e0ef806e7babbc9670e2c89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d17b0d82be9febaeb884dea2cfb61c5f189c0fce2aff03c02bbf020d89828f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:34Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:34 crc kubenswrapper[4715]: I1009 07:46:34.765930 4715 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://770b320ad49f63618e01bc73df4df10cb694b01d658727bb395ff59e6a609442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://958c52c695933700cd3b19f8c6539c5566827f57a22ed1fea9b6326e2261f673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:34Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:34 crc kubenswrapper[4715]: I1009 07:46:34.773435 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:46:34 crc kubenswrapper[4715]: I1009 07:46:34.773522 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:46:34 crc kubenswrapper[4715]: E1009 07:46:34.773638 4715 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 07:46:34 crc kubenswrapper[4715]: E1009 07:46:34.773686 4715 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 07:46:34 crc kubenswrapper[4715]: E1009 07:46:34.773694 4715 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 07:46:34 crc kubenswrapper[4715]: E1009 07:46:34.773707 4715 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 07:46:34 crc kubenswrapper[4715]: E1009 07:46:34.773718 4715 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 07:46:34 crc kubenswrapper[4715]: E1009 07:46:34.773723 4715 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 07:46:34 crc kubenswrapper[4715]: E1009 07:46:34.773803 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-09 07:46:38.77377621 +0000 UTC m=+29.466580218 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 07:46:34 crc kubenswrapper[4715]: E1009 07:46:34.773840 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-09 07:46:38.773828561 +0000 UTC m=+29.466632769 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 07:46:34 crc kubenswrapper[4715]: I1009 07:46:34.805571 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6vp75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz46q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6vp75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:34Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:34 crc kubenswrapper[4715]: I1009 07:46:34.843521 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqt86" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c54c0f2-0671-4f29-a4b8-7ea32758200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a835e316a2f8a0cc8bf44d5edd66b376fd20a6f7bf6a467a611e04e5fcc9993f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkfzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqt86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:34Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:34 crc kubenswrapper[4715]: I1009 07:46:34.887841 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:34Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:34 crc kubenswrapper[4715]: I1009 07:46:34.926143 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8875bf33dca9b2d1d7bf66aaeb2fa239b455ea46d1e6790a9f6e1c5c2da2ec6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T07:46:34Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:34 crc kubenswrapper[4715]: I1009 07:46:34.965477 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5tfxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a186a549-1c86-4777-97e8-04df48fad842\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1312ab6651462ae52831c89894987a598b1623159dddca34a4848dfbc86191ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-mdktp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5tfxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:34Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:35 crc kubenswrapper[4715]: I1009 07:46:35.017753 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1166d9eb763c499c126069c02d693a608549e5cbb8d4862551b7555100324b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be742
1a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:35 crc kubenswrapper[4715]: I1009 07:46:35.051023 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:35 crc kubenswrapper[4715]: I1009 07:46:35.083660 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:35 crc kubenswrapper[4715]: I1009 07:46:35.123080 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafd807-8875-4b4f-aba9-4f807ca336e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1277c6a868bcd62e2cfc7dda77ccba4f206f4216eec40ceb53ed8c09aebd5eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eab9be18db2c21136a797167f3282bba0639147
e04085d9c930fe113cd5bc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7vwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:35 crc kubenswrapper[4715]: I1009 07:46:35.136768 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:46:35 crc kubenswrapper[4715]: I1009 07:46:35.136823 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:46:35 crc kubenswrapper[4715]: I1009 07:46:35.136768 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:46:35 crc kubenswrapper[4715]: E1009 07:46:35.136896 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:46:35 crc kubenswrapper[4715]: E1009 07:46:35.136972 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:46:35 crc kubenswrapper[4715]: E1009 07:46:35.137016 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:46:35 crc kubenswrapper[4715]: I1009 07:46:35.172512 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8095fd96-32bb-459e-b524-6cf679b95b21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc320b6b98a82e720d488ce9958599e2f732919ac43ccb3834e5dd90042077ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7424a86e3801e7aea51cf175c8cbb65ae15a4df07426022cf9e4ba6b82c13924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://149ab2506eb7fd28879c9734c5189259cde574afb0a4f7708b0b84c5a514c996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a96e0c2dc207504189aac5f2822e4fc8fdc58a19388a3d081553ecec07f03bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e77903
6cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0bc91552a8f6c9f83684aa851ef1b07fa4562c736427c3264762f4486b65c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e4911
7b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\
\\"2025-10-09T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:35 crc kubenswrapper[4715]: I1009 07:46:35.186014 4715 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 07:46:35 crc kubenswrapper[4715]: I1009 07:46:35.204990 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://770b320ad49f63618e01bc73df4df10cb694b01d658727bb395ff59e6a609442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://958c52c695933700cd3b19f8c6539c5566827f57a22ed1fea9b6326e2261f673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:35 crc kubenswrapper[4715]: I1009 07:46:35.223394 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 09 07:46:35 crc kubenswrapper[4715]: I1009 07:46:35.223719 4715 scope.go:117] "RemoveContainer" containerID="f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028" Oct 09 07:46:35 crc kubenswrapper[4715]: E1009 07:46:35.223985 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 09 07:46:35 crc kubenswrapper[4715]: I1009 07:46:35.265099 4715 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76f34f31-285e-4f90-954d-888a59ad6080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8gf4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:35 crc kubenswrapper[4715]: I1009 07:46:35.309359 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6cb14a-7329-4a80-aff2-acd9142558d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z9ztn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:35 crc kubenswrapper[4715]: I1009 07:46:35.346220 4715 generic.go:334] "Generic (PLEG): container finished" podID="76f34f31-285e-4f90-954d-888a59ad6080" containerID="30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49" exitCode=0 Oct 09 07:46:35 crc kubenswrapper[4715]: I1009 07:46:35.346268 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" event={"ID":"76f34f31-285e-4f90-954d-888a59ad6080","Type":"ContainerDied","Data":"30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49"} Oct 09 07:46:35 crc kubenswrapper[4715]: I1009 07:46:35.347381 4715 scope.go:117] "RemoveContainer" containerID="f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028" Oct 09 07:46:35 crc kubenswrapper[4715]: E1009 07:46:35.347553 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 09 07:46:35 crc kubenswrapper[4715]: I1009 07:46:35.347923 4715 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a1a53d8-70da-4f6d-b92f-801a563952ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19465e3367078df139314e3b29a1b05d15c7ab22cb681c92e2a0394aaaaf887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b8a525d8b7ec3e08d688a4f5419e937a01e5dfa1de58caa9e3fad5ee5ed593f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8906a42b46d23c122035098bfd88203a6418fe2e0ef806e7babbc9670e2c89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d17b0d82be9febaeb884dea2cfb61c5f189c0fce2aff03c02bbf020d89828f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:35 crc kubenswrapper[4715]: I1009 07:46:35.384870 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5tfxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a186a549-1c86-4777-97e8-04df48fad842\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1312ab6651462ae52831c89894987a598b1623159dddca34a4848dfbc86191ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdktp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5tfxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:35 crc kubenswrapper[4715]: I1009 07:46:35.423933 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafd807-8875-4b4f-aba9-4f807ca336e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1277c6a868bcd62e2cfc7dda77ccba4f206f4216eec40ceb53ed8c09aebd5eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eab9be18db2c21136a797167f3282bba0639147e04085d9c930fe113cd5bc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7vwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:35 crc kubenswrapper[4715]: I1009 07:46:35.474848 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8095fd96-32bb-459e-b524-6cf679b95b21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc320b6b98a82e720d488ce9958599e2f732919ac43ccb3834e5dd90042077ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mo
untPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7424a86e3801e7aea51cf175c8cbb65ae15a4df07426022cf9e4ba6b82c13924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://149ab2506eb7fd28879c9734c5189259cde574afb0a4f7708b0b84c5a514c996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a96e0c2dc207504189aac5f2822e4fc8fdc58a19388a3d081553ecec07f03bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26
702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0bc91552a8f6c9f83684aa851ef1b07fa4562c736427c3264762f4486b65c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"exitCode\\\":0,\\\"f
inishedAt\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:35 crc kubenswrapper[4715]: I1009 07:46:35.508318 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f4f451-5ba1-439c-9987-d2d8d37129e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab9492d73e1ced7e8b9dcfbf64ede97fb7c53def5e290efe2320d37d5f8a3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94dc3b7cc39c67b95708f5a4b7d2bcf103c565c5c868684fa838816e882c720\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://86bd2df729ce7029714c942828cff7e13c738eb5d918fc7dfdefe16e5420fc98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T07:46:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 07:46:26.195650 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 07:46:26.195886 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 07:46:26.197650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1771579011/tls.crt::/tmp/serving-cert-1771579011/tls.key\\\\\\\"\\\\nI1009 07:46:26.707018 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 07:46:26.710937 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 07:46:26.710964 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 07:46:26.710986 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 07:46:26.710992 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 07:46:26.721297 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 07:46:26.721350 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721363 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721375 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 07:46:26.721386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1009 07:46:26.721377 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 07:46:26.721396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 07:46:26.721462 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 07:46:26.723740 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14232d9805b9847774597840c84b29709285393122781fe95af059e50c285ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:35 crc kubenswrapper[4715]: I1009 07:46:35.513303 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 07:46:35 crc kubenswrapper[4715]: I1009 07:46:35.547762 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1166d9eb763c499c126069c02d693a608549e5cbb8d4862551b7555100324b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e0
1a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:35 crc kubenswrapper[4715]: I1009 07:46:35.585523 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:35 crc kubenswrapper[4715]: I1009 07:46:35.627454 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:35 crc kubenswrapper[4715]: I1009 07:46:35.665692 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a1a53d8-70da-4f6d-b92f-801a563952ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19465e3367078df139314e3b29a1b05d15c7ab22cb681c92e2a0394aaaaf887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b8a525d8b7ec3e08d688a4f5419e937a01e5dfa1de58caa9e3fad5ee5ed593f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8906a42b46d23c122035098bfd88203a6418fe2e0ef806e7babbc9670e2c89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d17b0d82be9febaeb884dea2cfb61c5f189c0fce2aff03c02bbf020d89828f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:35 crc kubenswrapper[4715]: I1009 07:46:35.706331 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://770b320ad49f63618e01bc73df4df10cb694b01d658727bb395ff59e6a609442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://958c52c695933700cd3b19f8c6539c5566827f57a22ed1fea9b6326e2261f673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:35 crc kubenswrapper[4715]: I1009 07:46:35.747060 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76f34f31-285e-4f90-954d-888a59ad6080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8gf4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-09T07:46:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:35 crc kubenswrapper[4715]: I1009 07:46:35.800535 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6cb14a-7329-4a80-aff2-acd9142558d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z9ztn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:35 crc kubenswrapper[4715]: I1009 07:46:35.827123 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:35 crc kubenswrapper[4715]: I1009 07:46:35.865936 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8875bf33dca9b2d1d7bf66aaeb2fa239b455ea46d1e6790a9f6e1c5c2da2ec6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T07:46:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:35 crc kubenswrapper[4715]: I1009 07:46:35.909905 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6vp75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz46q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6vp75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T07:46:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:35 crc kubenswrapper[4715]: I1009 07:46:35.945623 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqt86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c54c0f2-0671-4f29-a4b8-7ea32758200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a835e316a2f8a0cc8bf44d5edd66b376fd20a6f7bf6a467a611e04e5fcc9993f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkfzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqt86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.184126 4715 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.187038 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.187085 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.187095 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.187218 4715 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.196356 4715 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.196779 4715 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.198113 4715 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.198165 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.198178 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.198197 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.198212 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:36Z","lastTransitionTime":"2025-10-09T07:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:36 crc kubenswrapper[4715]: E1009 07:46:36.214297 4715 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"88c6bc2d-8227-4dff-bf57-494ec73b39f9\\\",\\\"systemUUID\\\":\\\"25873b5a-8b59-46be-9c14-6241a2c78490\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:36Z is after 2025-08-24T17:21:41Z"
Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.218917 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.218949 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.218958 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.218973 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.218984 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:36Z","lastTransitionTime":"2025-10-09T07:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 09 07:46:36 crc kubenswrapper[4715]: E1009 07:46:36.232521 4715 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"88c6bc2d-8227-4dff-bf57-494ec73b39f9\\\",\\\"systemUUID\\\":\\\"25873b5a-8b59-46be-9c14-6241a2c78490\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:36Z is after 2025-08-24T17:21:41Z"
Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.237942 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.237989 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.238001 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.238021 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.238034 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:36Z","lastTransitionTime":"2025-10-09T07:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 09 07:46:36 crc kubenswrapper[4715]: E1009 07:46:36.252157 4715 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"88c6bc2d-8227-4dff-bf57-494ec73b39f9\\\",\\\"systemUUID\\\":\\\"25873b5a-8b59-46be-9c14-6241a2c78490\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:36Z is after 2025-08-24T17:21:41Z"
Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.256563 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.256618 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.256631 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.256651 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.256662 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:36Z","lastTransitionTime":"2025-10-09T07:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 09 07:46:36 crc kubenswrapper[4715]: E1009 07:46:36.269400 4715 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"88c6bc2d-8227-4dff-bf57-494ec73b39f9\\\",\\\"systemUUID\\\":\\\"25873b5a-8b59-46be-9c14-6241a2c78490\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:36Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.273970 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.274022 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.274036 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.274058 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.274073 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:36Z","lastTransitionTime":"2025-10-09T07:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:36 crc kubenswrapper[4715]: E1009 07:46:36.288648 4715 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"88c6bc2d-8227-4dff-bf57-494ec73b39f9\\\",\\\"systemUUID\\\":\\\"25873b5a-8b59-46be-9c14-6241a2c78490\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:36Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:36 crc kubenswrapper[4715]: E1009 07:46:36.288786 4715 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.291112 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.291158 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.291173 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.291200 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.291216 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:36Z","lastTransitionTime":"2025-10-09T07:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.352315 4715 generic.go:334] "Generic (PLEG): container finished" podID="76f34f31-285e-4f90-954d-888a59ad6080" containerID="3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3" exitCode=0 Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.352386 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" event={"ID":"76f34f31-285e-4f90-954d-888a59ad6080","Type":"ContainerDied","Data":"3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3"} Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.358980 4715 scope.go:117] "RemoveContainer" containerID="f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028" Oct 09 07:46:36 crc kubenswrapper[4715]: E1009 07:46:36.359138 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.359185 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" event={"ID":"1d6cb14a-7329-4a80-aff2-acd9142558d3","Type":"ContainerStarted","Data":"85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2"} Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.365832 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5tfxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a186a549-1c86-4777-97e8-04df48fad842\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1312ab6651462ae52831c89894987a598b1623159dddca34a4848dfbc86191ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdktp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5tfxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:36Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.381144 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:36Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.394085 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.394132 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.394141 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.394159 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.394170 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:36Z","lastTransitionTime":"2025-10-09T07:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.394873 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafd807-8875-4b4f-aba9-4f807ca336e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1277c6a868bcd62e2cfc7dda77ccba4f206f4216eec40ceb53ed8c09aebd5eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eab9be18db2c21136a797167f3282bba0639147e04085d9c930fe113cd5bc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7vwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:36Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.417835 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8095fd96-32bb-459e-b524-6cf679b95b21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc320b6b98a82e720d488ce9958599e2f732919ac43ccb3834e5dd90042077ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7424a86e3801e7aea51cf175c8cbb65ae15a4df07426022cf9e4ba6b82c13924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://149ab2506eb7fd28879c9734c5189259cde574afb0a4f7708b0b84c5a514c996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a96e0c2dc207504189aac5f2822e4fc8fdc58a19388a3d081553ecec07f03bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0bc91552a8f6c9f83684aa851ef1b07fa4562c736427c3264762f4486b65c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:36Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.433303 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f4f451-5ba1-439c-9987-d2d8d37129e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab9492d73e1ced7e8b9dcfbf64ede97fb7c53def5e290efe2320d37d5f8a3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94dc3b7cc39c67b95708f5a4b7d2bcf103c565c5c868684fa838816e882c720\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://86bd2df729ce7029714c942828cff7e13c738eb5d918fc7dfdefe16e5420fc98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T07:46:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 07:46:26.195650 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 07:46:26.195886 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 07:46:26.197650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1771579011/tls.crt::/tmp/serving-cert-1771579011/tls.key\\\\\\\"\\\\nI1009 07:46:26.707018 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 07:46:26.710937 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 07:46:26.710964 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 07:46:26.710986 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 07:46:26.710992 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 07:46:26.721297 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 07:46:26.721350 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721363 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721375 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 07:46:26.721386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1009 07:46:26.721377 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 07:46:26.721396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 07:46:26.721462 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 07:46:26.723740 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14232d9805b9847774597840c84b29709285393122781fe95af059e50c285ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:36Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.447559 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1166d9eb763c499c126069c02d693a608549e5cbb8d4862551b7555100324b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:36Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.462260 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:36Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.474397 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a1a53d8-70da-4f6d-b92f-801a563952ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19465e3367078df139314e3b29a1b05d15c7ab22cb681c92e2a0394aaaaf887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b8a525d8b7ec3e08d688a4f5419e937a01e5dfa1de58caa9e3fad5ee5ed593f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8906a42b46d23c122035098bfd88203a6418fe2e0ef806e7babbc9670e2c89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d17b0d82be9febaeb884dea2cfb61c5f189c0fce2aff03c02bbf020d89828f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:36Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.488393 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://770b320ad49f63618e01bc73df4df10cb694b01d658727bb395ff59e6a609442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://958c52c695933700cd3b19f8c6539c5566827f57a22ed1fea9b6326e2261f673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:36Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.497623 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.497674 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.497687 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.497711 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.497723 4715 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:36Z","lastTransitionTime":"2025-10-09T07:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.505450 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76f34f31-285e-4f90-954d-888a59ad6080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8gf4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:36Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 
07:46:36.525564 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6cb14a-7329-4a80-aff2-acd9142558d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z9ztn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:36Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.541916 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:36Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.553728 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8875bf33dca9b2d1d7bf66aaeb2fa239b455ea46d1e6790a9f6e1c5c2da2ec6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T07:46:36Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.566008 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6vp75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz46q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6vp75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T07:46:36Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.582899 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqt86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c54c0f2-0671-4f29-a4b8-7ea32758200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a835e316a2f8a0cc8bf44d5edd66b376fd20a6f7bf6a467a611e04e5fcc9993f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkfzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqt86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:36Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.600272 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.600307 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.600318 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.600334 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.600345 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:36Z","lastTransitionTime":"2025-10-09T07:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.702398 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.702456 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.702469 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.702486 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.702498 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:36Z","lastTransitionTime":"2025-10-09T07:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.805106 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.805149 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.805164 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.805185 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.805200 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:36Z","lastTransitionTime":"2025-10-09T07:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.907934 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.907980 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.907990 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.908004 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:36 crc kubenswrapper[4715]: I1009 07:46:36.908017 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:36Z","lastTransitionTime":"2025-10-09T07:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.011246 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.011310 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.011321 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.011340 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.011353 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:37Z","lastTransitionTime":"2025-10-09T07:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.114172 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.114263 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.114281 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.114312 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.114333 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:37Z","lastTransitionTime":"2025-10-09T07:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.136616 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.136721 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.136624 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:46:37 crc kubenswrapper[4715]: E1009 07:46:37.136860 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:46:37 crc kubenswrapper[4715]: E1009 07:46:37.137086 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:46:37 crc kubenswrapper[4715]: E1009 07:46:37.137190 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.216543 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.216596 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.216606 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.216622 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.216636 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:37Z","lastTransitionTime":"2025-10-09T07:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.319300 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.319387 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.319402 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.319449 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.319463 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:37Z","lastTransitionTime":"2025-10-09T07:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.365473 4715 generic.go:334] "Generic (PLEG): container finished" podID="76f34f31-285e-4f90-954d-888a59ad6080" containerID="d905dda0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c" exitCode=0 Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.365538 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" event={"ID":"76f34f31-285e-4f90-954d-888a59ad6080","Type":"ContainerDied","Data":"d905dda0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c"} Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.381690 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a1a53d8-70da-4f6d-b92f-801a563952ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19465e3367078df139314e3b29a1b05d15c7ab22cb681c92e2a0394aaaaf887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b8a525d8b7ec3e08d688a4f5419e937a01e5dfa1de58caa9e3fad5ee5ed593f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8906a42b46d23c122035098bfd88203a6418fe2e0ef806e7babbc9670e2c89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d17b0d82be9febaeb884dea2cfb61c5f189c0fce2aff03c02bbf020d89828f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:37Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.398275 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://770b320ad49f63618e01bc73df4df10cb694b01d658727bb395ff59e6a609442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://958c52c695933700cd3b19f8c6539c5566827f57a22ed1fea9b6326e2261f673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:37Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.423076 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.423124 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.423134 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.423151 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.423161 4715 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:37Z","lastTransitionTime":"2025-10-09T07:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.423181 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76f34f31-285e-4f90-954d-888a59ad6080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d905dda0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d905dda0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-8gf4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:37Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.450675 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6cb14a-7329-4a80-aff2-acd9142558d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z9ztn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:37Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.473357 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:37Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.490474 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8875bf33dca9b2d1d7bf66aaeb2fa239b455ea46d1e6790a9f6e1c5c2da2ec6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T07:46:37Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.506063 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6vp75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz46q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6vp75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T07:46:37Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.515387 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqt86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c54c0f2-0671-4f29-a4b8-7ea32758200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a835e316a2f8a0cc8bf44d5edd66b376fd20a6f7bf6a467a611e04e5fcc9993f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkfzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqt86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:37Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.525500 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.525538 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.525575 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.525596 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.525610 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:37Z","lastTransitionTime":"2025-10-09T07:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.527413 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5tfxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a186a549-1c86-4777-97e8-04df48fad842\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1312ab6651462ae52831c89894987a598b1623159dddca34a4848dfbc86191ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdktp\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5tfxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:37Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.539708 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafd807-8875-4b4f-aba9-4f807ca336e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1277c6a868bcd62e2cfc7dda77ccba4f206f4216eec40ceb53ed8c09aebd5eaa\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eab9be18db2c21136a797167f3282bba0639147e04085d9c930fe113cd5bc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-k7vwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:37Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.557013 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8095fd96-32bb-459e-b524-6cf679b95b21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc320b6b98a82e720d488ce9958599e2f732919ac43ccb3834e5dd90042077ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7424a86e3801e7aea51cf175c8cbb65ae15a4df07426022cf9e4ba6b82c13924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://149ab2506eb7fd28879c9734c5189259cde574afb0a4f7708b0b84c5a514c996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/s
tatic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a96e0c2dc207504189aac5f2822e4fc8fdc58a19388a3d081553ecec07f03bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0bc91552a8f6c9f83684aa851ef1b07fa4562c736427c3264762f4486b65c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:37Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.573995 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f4f451-5ba1-439c-9987-d2d8d37129e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab9492d73e1ced7e8b9dcfbf64ede97fb7c53def5e290efe2320d37d5f8a3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94dc3b7cc39c67b95708f5a4b7d2bcf103c565c5c868684fa838816e882c720\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86bd2df729ce7029714c942828cff7e13c738eb5d918fc7dfdefe16e5420fc98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T07:46:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 07:46:26.195650 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 07:46:26.195886 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 07:46:26.197650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1771579011/tls.crt::/tmp/serving-cert-1771579011/tls.key\\\\\\\"\\\\nI1009 07:46:26.707018 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 07:46:26.710937 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 07:46:26.710964 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 07:46:26.710986 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 07:46:26.710992 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 07:46:26.721297 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 07:46:26.721350 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721363 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721375 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 07:46:26.721386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1009 07:46:26.721377 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 07:46:26.721396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 07:46:26.721462 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 07:46:26.723740 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14232d9805b9847774597840c84b29709285393122781fe95af059e50c285ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e67
1fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:37Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.589269 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1166d9eb763c499c126069c02d693a608549e5cbb8d4862551b7555100324b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:37Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.601160 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:37Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.612395 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:37Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.628961 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.629014 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.629024 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.629041 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.629050 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:37Z","lastTransitionTime":"2025-10-09T07:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.731652 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.731698 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.731710 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.731729 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.731744 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:37Z","lastTransitionTime":"2025-10-09T07:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.835619 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.835664 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.835676 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.835696 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.835709 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:37Z","lastTransitionTime":"2025-10-09T07:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.938671 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.938729 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.938738 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.938758 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:37 crc kubenswrapper[4715]: I1009 07:46:37.938769 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:37Z","lastTransitionTime":"2025-10-09T07:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.041647 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.041697 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.041712 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.041731 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.041740 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:38Z","lastTransitionTime":"2025-10-09T07:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.144054 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.144127 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.144153 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.144181 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.144207 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:38Z","lastTransitionTime":"2025-10-09T07:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.247773 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.247809 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.247819 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.247840 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.247852 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:38Z","lastTransitionTime":"2025-10-09T07:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.350211 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.350249 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.350258 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.350274 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.350287 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:38Z","lastTransitionTime":"2025-10-09T07:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.372937 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" event={"ID":"76f34f31-285e-4f90-954d-888a59ad6080","Type":"ContainerStarted","Data":"7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0"} Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.379374 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" event={"ID":"1d6cb14a-7329-4a80-aff2-acd9142558d3","Type":"ContainerStarted","Data":"7898580fa0e3f40e26a6aa8da1a4997577bb4e2e5627df3689ddfd90b720890c"} Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.380075 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.380140 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.393709 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5tfxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a186a549-1c86-4777-97e8-04df48fad842\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1312ab6651462ae52831c89894987a598b1623159dddca34a4848dfbc86191ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdktp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5tfxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:38Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.416582 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8095fd96-32bb-459e-b524-6cf679b95b21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc320b6b98a82e720d488ce9958599e2f732919ac43ccb3834e5dd90042077ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7424a86e3801e7aea51cf175c8cbb65ae15a4df07426022cf9e4ba6b82c13924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://149ab2506eb7fd28879c9734c5189259cde574afb0a4f7708b0b84c5a514c996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a96e0c2dc207504189aac5f2822e4fc8fdc58a19388a3d081553ecec07f03bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0bc91552a8f6c9f83684aa851ef1b07fa4562c736427c3264762f4486b65c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:38Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.434617 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f4f451-5ba1-439c-9987-d2d8d37129e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab9492d73e1ced7e8b9dcfbf64ede97fb7c53def5e290efe2320d37d5f8a3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94dc3b7cc39c67b95708f5a4b7d2bcf103c565c5c868684fa838816e882c720\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86bd2df729ce7029714c942828cff7e13c738eb5d918fc7dfdefe16e5420fc98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T07:46:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 07:46:26.195650 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 07:46:26.195886 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 07:46:26.197650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1771579011/tls.crt::/tmp/serving-cert-1771579011/tls.key\\\\\\\"\\\\nI1009 07:46:26.707018 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 07:46:26.710937 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 07:46:26.710964 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 07:46:26.710986 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 07:46:26.710992 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 07:46:26.721297 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 07:46:26.721350 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721363 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721375 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 07:46:26.721386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1009 07:46:26.721377 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 07:46:26.721396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 07:46:26.721462 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 07:46:26.723740 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14232d9805b9847774597840c84b29709285393122781fe95af059e50c285ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:38Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.453795 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.453849 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.453862 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.453887 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:38 crc 
kubenswrapper[4715]: I1009 07:46:38.453904 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:38Z","lastTransitionTime":"2025-10-09T07:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.456409 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1166d9eb763c499c126069c02d693a608549e5cbb8d4862551b7555100324b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:38Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.471857 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:38Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.484174 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.486210 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 
07:46:38.488530 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:38Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.502433 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafd807-8875-4b4f-aba9-4f807ca336e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1277c6a868bcd62e2cfc7dda77ccba4f206f4216eec40ceb53ed8c09aebd5eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eab9be18db2c21136a797167f3282bba0639147
e04085d9c930fe113cd5bc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7vwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:38Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.520507 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a1a53d8-70da-4f6d-b92f-801a563952ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19465e3367078df139314e3b29a1b05d15c7ab22cb681c92e2a0394aaaaf887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b8a525d8b7ec3e08d688a4f5419e937a01e5dfa1de58caa9e3fad5ee5ed593f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8906a42b46d23c122035098bfd88203a6418fe2e0ef806e7babbc9670e2c89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d17b0d82be9febaeb884dea2cfb61c5f189c0fce2aff03c02bbf020d89828f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:38Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.534122 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://770b320ad49f63618e01bc73df4df10cb694b01d658727bb395ff59e6a609442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://958c52c695933700cd3b19f8c6539c5566827f57a22ed1fea9b6326e2261f673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:38Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.558264 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.558695 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.558708 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.558728 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.558742 4715 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:38Z","lastTransitionTime":"2025-10-09T07:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.559665 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76f34f31-285e-4f90-954d-888a59ad6080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d905dda0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d905dda0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2s
zk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8gf4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:38Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.580866 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6cb14a-7329-4a80-aff2-acd9142558d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\
\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z9ztn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:38Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.595526 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:38Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.609155 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8875bf33dca9b2d1d7bf66aaeb2fa239b455ea46d1e6790a9f6e1c5c2da2ec6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T07:46:38Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.625562 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6vp75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz46q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6vp75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T07:46:38Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.639088 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqt86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c54c0f2-0671-4f29-a4b8-7ea32758200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a835e316a2f8a0cc8bf44d5edd66b376fd20a6f7bf6a467a611e04e5fcc9993f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkfzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqt86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:38Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.652853 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:38Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.662147 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.662257 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.662276 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 
07:46:38.662303 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.662320 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:38Z","lastTransitionTime":"2025-10-09T07:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.665943 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8875bf33dca9b2d1d7bf66aaeb2fa239b455ea46d1e6790a9f6e1c5c2da2ec6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:38Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.685557 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6vp75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz46q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6vp75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:38Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.699033 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqt86" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c54c0f2-0671-4f29-a4b8-7ea32758200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a835e316a2f8a0cc8bf44d5edd66b376fd20a6f7bf6a467a611e04e5fcc9993f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkfzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqt86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:38Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.711966 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5tfxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a186a549-1c86-4777-97e8-04df48fad842\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1312ab6651462ae52831c89894987a598b1623159dddca34a4848dfbc86191ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdktp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5tfxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:38Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.716799 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.716966 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:46:38 crc kubenswrapper[4715]: E1009 07:46:38.717050 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:46:46.71701784 +0000 UTC m=+37.409821848 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:46:38 crc kubenswrapper[4715]: E1009 07:46:38.717129 4715 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 07:46:38 crc kubenswrapper[4715]: E1009 07:46:38.717230 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 07:46:46.717200055 +0000 UTC m=+37.410004243 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.717309 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:46:38 crc kubenswrapper[4715]: E1009 07:46:38.717504 4715 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 07:46:38 crc kubenswrapper[4715]: E1009 07:46:38.717598 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 07:46:46.717578086 +0000 UTC m=+37.410382124 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.732924 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafd807-8875-4b4f-aba9-4f807ca336e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1277c6a868bcd62e2cfc7dda77ccba4f206f4216eec40ceb53ed8c09aebd5eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eab9be18db2c21136a797167f3282bba0639147e04085d9c930fe113cd5bc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7vwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-09T07:46:38Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.756437 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8095fd96-32bb-459e-b524-6cf679b95b21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc320b6b98a82e720d488ce9958599e2f732919ac43ccb3834e5dd90042077ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},
{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7424a86e3801e7aea51cf175c8cbb65ae15a4df07426022cf9e4ba6b82c13924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://149ab2506eb7fd28879c9734c5189259cde574afb0a4f7708b0b84c5a514c996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a96e0c2dc207504189aac5f2822e4fc8fdc58a19388a3d081553ecec07f03bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0bc91552a8f6c9f83684aa851ef1b07fa4562c736427c3264762f4486b65c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"star
tedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:38Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.765271 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.765321 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.765342 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.765368 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.765384 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:38Z","lastTransitionTime":"2025-10-09T07:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.777052 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f4f451-5ba1-439c-9987-d2d8d37129e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab9492d73e1ced7e8b9dcfbf64ede97fb7c53def5e290efe2320d37d5f8a3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94dc3b7cc39c67b95708f5a4b7d2bcf103c565c5c868684fa838816e882c720\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://86bd2df729ce7029714c942828cff7e13c738eb5d918fc7dfdefe16e5420fc98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T07:46:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 07:46:26.195650 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 07:46:26.195886 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 07:46:26.197650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1771579011/tls.crt::/tmp/serving-cert-1771579011/tls.key\\\\\\\"\\\\nI1009 07:46:26.707018 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 07:46:26.710937 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 07:46:26.710964 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 07:46:26.710986 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 07:46:26.710992 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 07:46:26.721297 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 07:46:26.721350 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721363 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721375 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 07:46:26.721386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1009 07:46:26.721377 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 07:46:26.721396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 07:46:26.721462 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 07:46:26.723740 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14232d9805b9847774597840c84b29709285393122781fe95af059e50c285ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:38Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.792210 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1166d9eb763c499c126069c02d693a608549e5cbb8d4862551b7555100324b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:38Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.805854 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:38Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.818902 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:38Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.818957 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:46:38 crc kubenswrapper[4715]: E1009 07:46:38.819117 4715 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 07:46:38 crc kubenswrapper[4715]: E1009 07:46:38.819150 4715 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 07:46:38 crc kubenswrapper[4715]: E1009 07:46:38.819166 4715 projected.go:194] Error preparing data for projected volume 
kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 07:46:38 crc kubenswrapper[4715]: E1009 07:46:38.819231 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-09 07:46:46.819207612 +0000 UTC m=+37.512011830 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.819260 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:46:38 crc kubenswrapper[4715]: E1009 07:46:38.819508 4715 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 07:46:38 crc kubenswrapper[4715]: E1009 07:46:38.819534 4715 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 07:46:38 crc kubenswrapper[4715]: E1009 
07:46:38.819548 4715 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 07:46:38 crc kubenswrapper[4715]: E1009 07:46:38.819595 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-09 07:46:46.819578992 +0000 UTC m=+37.512383000 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.832924 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a1a53d8-70da-4f6d-b92f-801a563952ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19465e3367078df139314e3b29a1b05d15c7ab22cb681c92e2a0394aaaaf887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b8a525d8b7ec3e08d688a4f5419e937a01e5dfa1de58caa9e3fad5ee5ed593f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8906a42b46d23c122035098bfd88203a6418fe2e0ef806e7babbc9670e2c89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d17b0d82be9febaeb884dea2cfb61c5f189c0fce2aff03c02bbf020d89828f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:38Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.849683 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://770b320ad49f63618e01bc73df4df10cb694b01d658727bb395ff59e6a609442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://958c52c695933700cd3b19f8c6539c5566827f57a22ed1fea9b6326e2261f673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:38Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.863891 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76f34f31-285e-4f90-954d-888a59ad6080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d905dda0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d905dda0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc
84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8gf4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:38Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.867882 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.867916 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.867925 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.867942 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.867953 4715 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:38Z","lastTransitionTime":"2025-10-09T07:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.892304 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6cb14a-7329-4a80-aff2-acd9142558d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7898580fa0e3f40e26a6aa8da1a4997577bb4e2e5627df3689ddfd90b720890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z9ztn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:38Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.970459 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.970506 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.970519 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.970537 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:38 crc kubenswrapper[4715]: I1009 07:46:38.970552 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:38Z","lastTransitionTime":"2025-10-09T07:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
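The repeated status-patch failures above all share one cause: the `pod.network-node-identity.openshift.io` webhook serves a certificate whose `notAfter` (2025-08-24T17:21:41Z) predates the node's current clock (2025-10-09T07:46:38Z), so every TLS handshake fails x509 verification. A minimal sketch of that expiry comparison (the function name and this standalone check are illustrative, not the actual Go `crypto/x509` code path):

```python
from datetime import datetime, timezone

def cert_expired(not_after: str, now: str) -> bool:
    """Mirror the 'certificate has expired or is not yet valid' check from the
    log: the cert is invalid once the current time passes notAfter."""
    fmt = "%Y-%m-%dT%H:%M:%SZ"
    expiry = datetime.strptime(not_after, fmt).replace(tzinfo=timezone.utc)
    current = datetime.strptime(now, fmt).replace(tzinfo=timezone.utc)
    return current > expiry

# Timestamps taken verbatim from the webhook failures logged above.
print(cert_expired("2025-08-24T17:21:41Z", "2025-10-09T07:46:38Z"))  # True
```

Because the failure is purely clock-versus-validity-window, the same patch attempts will keep failing until the webhook's serving certificate is rotated (or, on a CRC instance resumed after a long suspend, until cluster cert recovery regenerates it).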
Has your network provider started?"} Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.072780 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.072838 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.072851 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.072874 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.073189 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:39Z","lastTransitionTime":"2025-10-09T07:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.135936 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.135936 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.135960 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:46:39 crc kubenswrapper[4715]: E1009 07:46:39.136205 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:46:39 crc kubenswrapper[4715]: E1009 07:46:39.136336 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:46:39 crc kubenswrapper[4715]: E1009 07:46:39.136593 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.175751 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.175818 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.175838 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.175864 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.175887 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:39Z","lastTransitionTime":"2025-10-09T07:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.278688 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.278741 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.278757 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.278782 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.278799 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:39Z","lastTransitionTime":"2025-10-09T07:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.381575 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.381663 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.381690 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.381728 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.381752 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:39Z","lastTransitionTime":"2025-10-09T07:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.388388 4715 generic.go:334] "Generic (PLEG): container finished" podID="76f34f31-285e-4f90-954d-888a59ad6080" containerID="7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0" exitCode=0 Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.388478 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" event={"ID":"76f34f31-285e-4f90-954d-888a59ad6080","Type":"ContainerDied","Data":"7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0"} Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.388576 4715 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.412441 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a1a53d8-70da-4f6d-b92f-801a563952ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19465e3367078df139314e3b29a1b05d15c7ab22cb681c92e2a0394aaaaf887\\\",\\\"image\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b8a525d8b7ec3e08d688a4f5419e937a01e5dfa1de58caa9e3fad5ee5ed593f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8906a42b46d23c122035098bfd88203a6418fe2e0ef806e7babbc9670e2c89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d17b0d82be9febaeb884dea2cfb61c5f189c0fce2aff03c02bbf020d89828f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:39Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.429315 4715 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://770b320ad49f63618e01bc73df4df10cb694b01d658727bb395ff59e6a609442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://958c52c695933700cd3b19f8c6539c5566827f57a22ed1fea9b6326e2261f673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\
\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:39Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.457013 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76f34f31-285e-4f90-954d-888a59ad6080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d905dda0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d905dda0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8gf4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:39Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.478201 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6cb14a-7329-4a80-aff2-acd9142558d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7898580fa0e3f40e26a6aa8da1a4997577bb4e2e5627df3689ddfd90b720890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z9ztn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:39Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.484862 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.484906 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.484921 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.484940 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.484954 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:39Z","lastTransitionTime":"2025-10-09T07:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.494260 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:39Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.512149 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8875bf33dca9b2d1d7bf66aaeb2fa239b455ea46d1e6790a9f6e1c5c2da2ec6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T07:46:39Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.531713 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6vp75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz46q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6vp75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T07:46:39Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.544717 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqt86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c54c0f2-0671-4f29-a4b8-7ea32758200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a835e316a2f8a0cc8bf44d5edd66b376fd20a6f7bf6a467a611e04e5fcc9993f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkfzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqt86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:39Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.557681 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5tfxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a186a549-1c86-4777-97e8-04df48fad842\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1312ab6651462ae52831c89894987a598b1623159dddca34a4848dfbc86191ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdktp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5tfxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:39Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.571333 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafd807-8875-4b4f-aba9-4f807ca336e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1277c6a868bcd62e2cfc7dda77ccba4f206f4216eec40ceb53ed8c09aebd5eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eab9be18db2c21136a797167f3282bba0639147e04085d9c930fe113cd5bc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7vwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:39Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.587402 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.587472 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.587485 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.587506 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.587519 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:39Z","lastTransitionTime":"2025-10-09T07:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.594524 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8095fd96-32bb-459e-b524-6cf679b95b21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc320b6b98a82e720d488ce9958599e2f732919ac43ccb3834e5dd90042077ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7424a86e3801e7aea51cf175c8cbb65ae15a4df07426022cf9e4ba6b82c13924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://149ab2506eb7fd28879c9734c5189259cde574afb0a4f7708b0b84c5a514c996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a96e0c2dc207504189aac5f2822e4fc8fdc58a19388a3d081553ecec07f03bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0bc91552a8f6c9f83684aa851ef1b07fa4562c736427c3264762f4486b65c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-09T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:39Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.618355 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f4f451-5ba1-439c-9987-d2d8d37129e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab9492d73e1ced7e8b9dcfbf64ede97fb7c53def5e290efe2320d37d5f8a3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94dc3b7cc39c67b95708f5a4b7d2bcf103c565c5c868684fa838816e882c720\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://86bd2df729ce7029714c942828cff7e13c738eb5d918fc7dfdefe16e5420fc98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T07:46:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 07:46:26.195650 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 07:46:26.195886 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 07:46:26.197650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1771579011/tls.crt::/tmp/serving-cert-1771579011/tls.key\\\\\\\"\\\\nI1009 07:46:26.707018 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 07:46:26.710937 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 07:46:26.710964 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 07:46:26.710986 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 07:46:26.710992 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 07:46:26.721297 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 07:46:26.721350 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721363 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721375 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 07:46:26.721386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1009 07:46:26.721377 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 07:46:26.721396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 07:46:26.721462 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 07:46:26.723740 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14232d9805b9847774597840c84b29709285393122781fe95af059e50c285ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:39Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.650913 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1166d9eb763c499c126069c02d693a608549e5cbb8d4862551b7555100324b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:39Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.666844 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:39Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.680731 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:39Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.689604 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.689647 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.689655 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.689672 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.689684 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:39Z","lastTransitionTime":"2025-10-09T07:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.792222 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.792270 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.792282 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.792301 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.792313 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:39Z","lastTransitionTime":"2025-10-09T07:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.895680 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.895724 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.895733 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.895749 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.895760 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:39Z","lastTransitionTime":"2025-10-09T07:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.999176 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.999233 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.999250 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.999273 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:39 crc kubenswrapper[4715]: I1009 07:46:39.999293 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:39Z","lastTransitionTime":"2025-10-09T07:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.102115 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.102172 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.102190 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.102216 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.102233 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:40Z","lastTransitionTime":"2025-10-09T07:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.156477 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5tfxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a186a549-1c86-4777-97e8-04df48fad842\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1312ab6651462ae52831c89894987a598b1623159dddca34a4848dfbc86191ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdktp\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5tfxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.191465 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafd807-8875-4b4f-aba9-4f807ca336e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1277c6a868bcd62e2cfc7dda77ccba4f206f4216eec40ceb53ed8c09aebd5eaa\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eab9be18db2c21136a797167f3282bba0639147e04085d9c930fe113cd5bc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-k7vwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.205353 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.205384 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.205395 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.205437 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.205449 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:40Z","lastTransitionTime":"2025-10-09T07:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.230752 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8095fd96-32bb-459e-b524-6cf679b95b21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc320b6b98a82e720d488ce9958599e2f732919ac43ccb3834e5dd90042077ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7424a86e3801e7aea51cf175c8cbb65ae15a4df07426022cf9e4ba6b82c13924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://149ab2506eb7fd28879c9734c5189259cde574afb0a4f7708b0b84c5a514c996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a96e0c2dc207504189aac5f2822e4fc8fdc58a19388a3d081553ecec07f03bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0bc91552a8f6c9f83684aa851ef1b07fa4562c736427c3264762f4486b65c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-09T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.247881 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f4f451-5ba1-439c-9987-d2d8d37129e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab9492d73e1ced7e8b9dcfbf64ede97fb7c53def5e290efe2320d37d5f8a3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94dc3b7cc39c67b95708f5a4b7d2bcf103c565c5c868684fa838816e882c720\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://86bd2df729ce7029714c942828cff7e13c738eb5d918fc7dfdefe16e5420fc98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T07:46:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 07:46:26.195650 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 07:46:26.195886 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 07:46:26.197650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1771579011/tls.crt::/tmp/serving-cert-1771579011/tls.key\\\\\\\"\\\\nI1009 07:46:26.707018 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 07:46:26.710937 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 07:46:26.710964 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 07:46:26.710986 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 07:46:26.710992 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 07:46:26.721297 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 07:46:26.721350 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721363 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721375 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 07:46:26.721386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1009 07:46:26.721377 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 07:46:26.721396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 07:46:26.721462 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 07:46:26.723740 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14232d9805b9847774597840c84b29709285393122781fe95af059e50c285ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.263691 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1166d9eb763c499c126069c02d693a608549e5cbb8d4862551b7555100324b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.279582 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.296337 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.307471 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.307523 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.307536 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.307555 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.307567 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:40Z","lastTransitionTime":"2025-10-09T07:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.310801 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a1a53d8-70da-4f6d-b92f-801a563952ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19465e3367078df139314e3b29a1b05d15c7ab22cb681c92e2a0394aaaaf887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b8a525d8b7ec3e08d688a4f5419e937a01e5dfa1de58caa9e3fad5ee5ed593f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8906a42b46d23c122035098bfd88203a6418fe2e0ef806e7babbc9670e2c89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d17b0d82be9febaeb884dea2cfb61c5f189c0fce2aff03c02bbf020d89828f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"i
mageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.324696 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://770b320ad49f63618e01bc73df4df10cb694b01d658727bb395ff59e6a609442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://958c52c695933700cd3b19f8c6539c5566827f57a22ed1fea9b6326e2261f673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.340698 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76f34f31-285e-4f90-954d-888a59ad6080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d905dda0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d905dda0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8gf4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.361814 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6cb14a-7329-4a80-aff2-acd9142558d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7898580fa0e3f40e26a6aa8da1a4997577bb4e2e5627df3689ddfd90b720890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z9ztn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.376834 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.395813 4715 generic.go:334] "Generic (PLEG): container finished" podID="76f34f31-285e-4f90-954d-888a59ad6080" containerID="e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c" exitCode=0 Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.395902 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" event={"ID":"76f34f31-285e-4f90-954d-888a59ad6080","Type":"ContainerDied","Data":"e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c"} Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.395985 4715 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.397239 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8875bf33dca9b2d1d7bf66aaeb2fa239b455ea46d1e6790a9f6e1c5c2da2ec6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T07:46:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.409408 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.409610 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.409765 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.409931 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.410047 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:40Z","lastTransitionTime":"2025-10-09T07:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.413244 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6vp75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz46q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6vp75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:40Z 
is after 2025-08-24T17:21:41Z" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.428937 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqt86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c54c0f2-0671-4f29-a4b8-7ea32758200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a835e316a2f8a0cc8bf44d5edd66b376fd20a6f7bf6a467a611e04e5fcc9993f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkfzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqt86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.442648 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8875bf33dca9b2d1d7bf66aaeb2fa239b455ea46d1e6790a9f6e1c5c2da2ec6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f41
6f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.457491 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6vp75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz46q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6vp75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.469552 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqt86" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c54c0f2-0671-4f29-a4b8-7ea32758200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a835e316a2f8a0cc8bf44d5edd66b376fd20a6f7bf6a467a611e04e5fcc9993f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkfzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqt86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.496387 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.508689 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5tfxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a186a549-1c86-4777-97e8-04df48fad842\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1312ab6651462ae52831c89894987a598b1623159dddca34a4848dfbc86191ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdktp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5tfxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.512902 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.512940 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.512948 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.512963 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.512973 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:40Z","lastTransitionTime":"2025-10-09T07:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.530596 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8095fd96-32bb-459e-b524-6cf679b95b21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc320b6b98a82e720d488ce9958599e2f732919ac43ccb3834e5dd90042077ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7424a86e3801e7aea51cf175c8cbb65ae15a4df07426022cf9e4ba6b82c13924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://149ab2506eb7fd28879c9734c5189259cde574afb0a4f7708b0b84c5a514c996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a96e0c2dc207504189aac5f2822e4fc8fdc58a19388a3d081553ecec07f03bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0bc91552a8f6c9f83684aa851ef1b07fa4562c736427c3264762f4486b65c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-09T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.542968 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f4f451-5ba1-439c-9987-d2d8d37129e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab9492d73e1ced7e8b9dcfbf64ede97fb7c53def5e290efe2320d37d5f8a3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94dc3b7cc39c67b95708f5a4b7d2bcf103c565c5c868684fa838816e882c720\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://86bd2df729ce7029714c942828cff7e13c738eb5d918fc7dfdefe16e5420fc98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T07:46:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 07:46:26.195650 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 07:46:26.195886 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 07:46:26.197650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1771579011/tls.crt::/tmp/serving-cert-1771579011/tls.key\\\\\\\"\\\\nI1009 07:46:26.707018 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 07:46:26.710937 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 07:46:26.710964 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 07:46:26.710986 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 07:46:26.710992 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 07:46:26.721297 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 07:46:26.721350 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721363 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721375 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 07:46:26.721386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1009 07:46:26.721377 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 07:46:26.721396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 07:46:26.721462 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 07:46:26.723740 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14232d9805b9847774597840c84b29709285393122781fe95af059e50c285ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.557155 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1166d9eb763c499c126069c02d693a608549e5cbb8d4862551b7555100324b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.569179 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.581127 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.590570 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafd807-8875-4b4f-aba9-4f807ca336e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1277c6a868bcd62e2cfc7dda77ccba4f206f4216eec40ceb53ed8c09aebd5eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eab9be18db2c21136a797167f3282bba0639147
e04085d9c930fe113cd5bc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7vwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.601601 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://770b320ad49f63618e01bc73df4df10cb694b01d658727bb395ff59e6a609442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://958c52c695933700cd3b19f8c6539c5566827f57a22ed1fea9b6326e2261f673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.613799 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76f34f31-285e-4f90-954d-888a59ad6080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d905dda0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d905dda0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8gf4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.615458 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.615495 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.615532 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.615552 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.615563 4715 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:40Z","lastTransitionTime":"2025-10-09T07:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.636715 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6cb14a-7329-4a80-aff2-acd9142558d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7898580fa0e3f40e26a6aa8da1a4997577bb4e2e5627df3689ddfd90b720890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z9ztn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.650599 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a1a53d8-70da-4f6d-b92f-801a563952ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19465e3367078df139314e3b29a1b05d15c7ab22cb681c92e2a0394aaaaf887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b8a525d8b7ec3e08d688a4f5419e937a01e5dfa1de58caa9e3fad5ee5ed593f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8906a42b46d23c122035098bfd88203a6418fe2e0ef806e7babbc9670e2c89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d17b0d82be9febaeb884dea2cfb61c5f189c0fce2aff03c02bbf020d89828f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.717927 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.717961 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.717970 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.717985 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.717994 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:40Z","lastTransitionTime":"2025-10-09T07:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.820903 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.820952 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.820961 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.820978 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.820987 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:40Z","lastTransitionTime":"2025-10-09T07:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.924818 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.924871 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.924888 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.924914 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:40 crc kubenswrapper[4715]: I1009 07:46:40.924936 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:40Z","lastTransitionTime":"2025-10-09T07:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.027507 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.027559 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.027573 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.027594 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.027611 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:41Z","lastTransitionTime":"2025-10-09T07:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.129572 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.130006 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.130083 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.130162 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.130237 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:41Z","lastTransitionTime":"2025-10-09T07:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.135940 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:46:41 crc kubenswrapper[4715]: E1009 07:46:41.136147 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.135933 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:46:41 crc kubenswrapper[4715]: E1009 07:46:41.136396 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.135933 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:46:41 crc kubenswrapper[4715]: E1009 07:46:41.136703 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.232856 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.232905 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.232916 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.232932 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.232942 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:41Z","lastTransitionTime":"2025-10-09T07:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.336475 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.336543 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.336587 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.336621 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.336644 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:41Z","lastTransitionTime":"2025-10-09T07:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.401212 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z9ztn_1d6cb14a-7329-4a80-aff2-acd9142558d3/ovnkube-controller/0.log" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.405131 4715 generic.go:334] "Generic (PLEG): container finished" podID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerID="7898580fa0e3f40e26a6aa8da1a4997577bb4e2e5627df3689ddfd90b720890c" exitCode=1 Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.405196 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" event={"ID":"1d6cb14a-7329-4a80-aff2-acd9142558d3","Type":"ContainerDied","Data":"7898580fa0e3f40e26a6aa8da1a4997577bb4e2e5627df3689ddfd90b720890c"} Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.406178 4715 scope.go:117] "RemoveContainer" containerID="7898580fa0e3f40e26a6aa8da1a4997577bb4e2e5627df3689ddfd90b720890c" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.410558 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" event={"ID":"76f34f31-285e-4f90-954d-888a59ad6080","Type":"ContainerStarted","Data":"4996d81a0257313b571696eae1c0c7a590b2282472852505b7f60ab07ae4e7fd"} Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.422157 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5tfxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a186a549-1c86-4777-97e8-04df48fad842\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1312ab6651462ae52831c89894987a598b1623159dddca34a4848dfbc86191ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdktp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5tfxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:41Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.437840 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1166d9eb763c499c126069c02d693a608549e5cbb8d4862551b7555100324b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:41Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.440054 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.440114 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.440127 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.440147 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.440160 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:41Z","lastTransitionTime":"2025-10-09T07:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.462965 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:41Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.484625 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:41Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.503055 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafd807-8875-4b4f-aba9-4f807ca336e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1277c6a868bcd62e2cfc7dda77ccba4f206f4216eec40ceb53ed8c09aebd5eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eab9be18db2c21136a797167f3282bba0639147
e04085d9c930fe113cd5bc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7vwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:41Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.528464 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8095fd96-32bb-459e-b524-6cf679b95b21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc320b6b98a82e720d488ce9958599e2f732919ac43ccb3834e5dd90042077ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7424a86e3801e7aea51cf175c8cbb65ae15a4df07426022cf9e4ba6b82c13924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://149ab2506eb7fd28879c9734c5189259cde574afb0a4f7708b0b84c5a514c996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a96e0c2dc207504189aac5f2822e4fc8fdc58a19388a3d081553ecec07f03bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0bc91552a8f6c9f83684aa851ef1b07fa4562c736427c3264762f4486b65c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:41Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.542895 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.542934 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.542944 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.542961 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.542972 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:41Z","lastTransitionTime":"2025-10-09T07:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.545201 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f4f451-5ba1-439c-9987-d2d8d37129e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab9492d73e1ced7e8b9dcfbf64ede97fb7c53def5e290efe2320d37d5f8a3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94dc3b7cc39c67b95708f5a4b7d2bcf103c565c5c868684fa838816e882c720\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://86bd2df729ce7029714c942828cff7e13c738eb5d918fc7dfdefe16e5420fc98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T07:46:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 07:46:26.195650 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 07:46:26.195886 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 07:46:26.197650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1771579011/tls.crt::/tmp/serving-cert-1771579011/tls.key\\\\\\\"\\\\nI1009 07:46:26.707018 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 07:46:26.710937 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 07:46:26.710964 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 07:46:26.710986 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 07:46:26.710992 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 07:46:26.721297 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 07:46:26.721350 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721363 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721375 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 07:46:26.721386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1009 07:46:26.721377 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 07:46:26.721396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 07:46:26.721462 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 07:46:26.723740 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14232d9805b9847774597840c84b29709285393122781fe95af059e50c285ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:41Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.567776 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76f34f31-285e-4f90-954d-888a59ad6080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d905dda0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d905dda0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8gf4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:41Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.597872 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6cb14a-7329-4a80-aff2-acd9142558d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7898580fa0e3f40e26a6aa8da1a4997577bb4e2e5627df3689ddfd90b720890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7898580fa0e3f40e26a6aa8da1a4997577bb4e2e5627df3689ddfd90b720890c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"message\\\":\\\"946 5943 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1009 07:46:41.002971 5943 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1009 07:46:41.002987 5943 handler.go:208] Removed *v1.Pod event 
handler 3\\\\nI1009 07:46:41.003185 5943 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1009 07:46:41.006848 5943 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1009 07:46:41.006886 5943 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1009 07:46:41.006892 5943 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1009 07:46:41.006915 5943 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1009 07:46:41.006920 5943 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1009 07:46:41.006924 5943 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1009 07:46:41.006955 5943 handler.go:208] Removed *v1.Node event handler 7\\\\nI1009 07:46:41.006959 5943 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1009 07:46:41.006970 5943 factory.go:656] Stopping watch factory\\\\nI1009 07:46:41.006972 5943 handler.go:208] Removed *v1.Node event handler 2\\\\nI1009 07:46:41.006985 5943 ovnkube.go:599] Stopped ovnkube\\\\nI1009 
07\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z9ztn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:41Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.613946 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a1a53d8-70da-4f6d-b92f-801a563952ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19465e3367078df139314e3b29a1b05d15c7ab22cb681c92e2a0394aaaaf887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b8a525d8b7ec3e08d688a4f5419e937a01e5dfa1de58caa9e3fad5ee5ed593f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8906a42b46d23c122035098bfd88203a6418fe2e0ef806e7babbc9670e2c89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d17b0d82be9febaeb884dea2cfb61c5f189c0fce2aff03c02bbf020d89828f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:41Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.635267 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://770b320ad49f63618e01bc73df4df10cb694b01d658727bb395ff59e6a609442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://958c52c695933700cd3b19f8c6539c5566827f57a22ed1fea9b6326e2261f673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:41Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.645709 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.645755 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.645769 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.645790 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.645806 4715 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:41Z","lastTransitionTime":"2025-10-09T07:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.649478 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqt86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c54c0f2-0671-4f29-a4b8-7ea32758200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a835e316a2f8a0cc8bf44d5edd66b376fd20a6f7bf6a467a611e04e5fcc9993f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkfzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqt86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:41Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.663513 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:41Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.681620 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8875bf33dca9b2d1d7bf66aaeb2fa239b455ea46d1e6790a9f6e1c5c2da2ec6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T07:46:41Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.702637 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6vp75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz46q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6vp75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T07:46:41Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.719797 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5tfxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a186a549-1c86-4777-97e8-04df48fad842\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1312ab6651462ae52831c89894987a598b1623159dddca34a4848dfbc86191ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-mdktp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5tfxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:41Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.749459 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.749531 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.749555 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.749589 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.749612 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:41Z","lastTransitionTime":"2025-10-09T07:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.756305 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8095fd96-32bb-459e-b524-6cf679b95b21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc320b6b98a82e720d488ce9958599e2f732919ac43ccb3834e5dd90042077ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7424a86e3801e7aea51cf175c8cbb65ae15a4df07426022cf9e4ba6b82c13924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://149ab2506eb7fd28879c9734c5189259cde574afb0a4f7708b0b84c5a514c996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a96e0c2dc207504189aac5f2822e4fc8fdc58a19388a3d081553ecec07f03bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0bc91552a8f6c9f83684aa851ef1b07fa4562c736427c3264762f4486b65c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-09T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:41Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.780166 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f4f451-5ba1-439c-9987-d2d8d37129e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab9492d73e1ced7e8b9dcfbf64ede97fb7c53def5e290efe2320d37d5f8a3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94dc3b7cc39c67b95708f5a4b7d2bcf103c565c5c868684fa838816e882c720\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://86bd2df729ce7029714c942828cff7e13c738eb5d918fc7dfdefe16e5420fc98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T07:46:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 07:46:26.195650 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 07:46:26.195886 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 07:46:26.197650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1771579011/tls.crt::/tmp/serving-cert-1771579011/tls.key\\\\\\\"\\\\nI1009 07:46:26.707018 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 07:46:26.710937 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 07:46:26.710964 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 07:46:26.710986 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 07:46:26.710992 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 07:46:26.721297 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 07:46:26.721350 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721363 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721375 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 07:46:26.721386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1009 07:46:26.721377 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 07:46:26.721396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 07:46:26.721462 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 07:46:26.723740 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14232d9805b9847774597840c84b29709285393122781fe95af059e50c285ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:41Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.803209 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1166d9eb763c499c126069c02d693a608549e5cbb8d4862551b7555100324b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:41Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.823584 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:41Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.840684 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:41Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.851967 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.852011 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.852024 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.852042 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.852055 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:41Z","lastTransitionTime":"2025-10-09T07:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.854161 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafd807-8875-4b4f-aba9-4f807ca336e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1277c6a868bcd62e2cfc7dda77ccba4f206f4216eec40ceb53ed8c09aebd5eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eab9be18db2c21136a797167f3282bba0639147e04085d9c930fe113cd5bc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7vwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:41Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.867971 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a1a53d8-70da-4f6d-b92f-801a563952ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19465e3367078df139314e3b29a1b05d15c7ab22cb681c92e2a0394aaaaf887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b8a525d8b7ec3e08d688a4f5419e937a01e5dfa1de58caa9e3fad5ee5ed593f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8906a42b46d23c122035098bfd88203a6418fe2e0ef806e7babbc9670e2c89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d17b0d82be9febaeb884dea2cfb61c5f189c0fce2aff03c02bbf020d89828f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:41Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.884523 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://770b320ad49f63618e01bc73df4df10cb694b01d658727bb395ff59e6a609442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://958c52c695933700cd3b19f8c6539c5566827f57a22ed1fea9b6326e2261f673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:41Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.901446 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76f34f31-285e-4f90-954d-888a59ad6080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4996d81a0257313b571696eae1c0c7a590b2282472852505b7f60ab07ae4e7fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d905d
da0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d905dda0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8gf4x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:41Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.932370 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6cb14a-7329-4a80-aff2-acd9142558d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7898580fa0e3f40e26a6aa8da1a4997577bb4e2e5627df3689ddfd90b720890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7898580fa0e3f40e26a6aa8da1a4997577bb4e2e5627df3689ddfd90b720890c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"message\\\":\\\"946 5943 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1009 07:46:41.002971 5943 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1009 07:46:41.002987 5943 handler.go:208] Removed *v1.Pod event 
handler 3\\\\nI1009 07:46:41.003185 5943 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1009 07:46:41.006848 5943 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1009 07:46:41.006886 5943 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1009 07:46:41.006892 5943 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1009 07:46:41.006915 5943 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1009 07:46:41.006920 5943 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1009 07:46:41.006924 5943 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1009 07:46:41.006955 5943 handler.go:208] Removed *v1.Node event handler 7\\\\nI1009 07:46:41.006959 5943 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1009 07:46:41.006970 5943 factory.go:656] Stopping watch factory\\\\nI1009 07:46:41.006972 5943 handler.go:208] Removed *v1.Node event handler 2\\\\nI1009 07:46:41.006985 5943 ovnkube.go:599] Stopped ovnkube\\\\nI1009 
07\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z9ztn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:41Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.948698 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:41Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.954337 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.954377 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.954388 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 
07:46:41.954411 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.954442 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:41Z","lastTransitionTime":"2025-10-09T07:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.966653 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8875bf33dca9b2d1d7bf66aaeb2fa239b455ea46d1e6790a9f6e1c5c2da2ec6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:41Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:41 crc kubenswrapper[4715]: I1009 07:46:41.990606 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6vp75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz46q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6vp75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:41Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.000973 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqt86" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c54c0f2-0671-4f29-a4b8-7ea32758200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a835e316a2f8a0cc8bf44d5edd66b376fd20a6f7bf6a467a611e04e5fcc9993f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkfzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqt86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:41Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.057126 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.057464 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.057611 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.057818 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.057961 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:42Z","lastTransitionTime":"2025-10-09T07:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.160546 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.160922 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.161079 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.161279 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.161407 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:42Z","lastTransitionTime":"2025-10-09T07:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.263840 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.263883 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.263894 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.263910 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.263923 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:42Z","lastTransitionTime":"2025-10-09T07:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.366000 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.366065 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.366084 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.366114 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.366137 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:42Z","lastTransitionTime":"2025-10-09T07:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.416440 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z9ztn_1d6cb14a-7329-4a80-aff2-acd9142558d3/ovnkube-controller/0.log" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.419705 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" event={"ID":"1d6cb14a-7329-4a80-aff2-acd9142558d3","Type":"ContainerStarted","Data":"6ad12f64b1c0fdde9a522e1865b3e364da8fd7260057d3d3077d60cb82b9c258"} Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.420222 4715 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.434412 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:42Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.451131 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8875bf33dca9b2d1d7bf66aaeb2fa239b455ea46d1e6790a9f6e1c5c2da2ec6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T07:46:42Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.469694 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.469757 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.469773 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.469797 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.469815 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:42Z","lastTransitionTime":"2025-10-09T07:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.472616 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6vp75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz46q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6vp75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:42Z 
is after 2025-08-24T17:21:41Z" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.485673 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqt86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c54c0f2-0671-4f29-a4b8-7ea32758200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a835e316a2f8a0cc8bf44d5edd66b376fd20a6f7bf6a467a611e04e5fcc9993f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkfzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqt86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:42Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.497175 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5tfxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a186a549-1c86-4777-97e8-04df48fad842\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1312ab6651462ae52831c89894987a598b1623159dddc
a34a4848dfbc86191ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdktp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5tfxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:42Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.507549 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafd807-8875-4b4f-aba9-4f807ca336e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1277c6a868bcd62e2cfc7dda77ccba4f206f4216eec40ceb53ed8c09aebd5eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eab9be18db2c21136a797167f3282bba0639147
e04085d9c930fe113cd5bc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7vwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:42Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.528082 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8095fd96-32bb-459e-b524-6cf679b95b21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc320b6b98a82e720d488ce9958599e2f732919ac43ccb3834e5dd90042077ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7424a86e3801e7aea51cf175c8cbb65ae15a4df07426022cf9e4ba6b82c13924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://149ab2506eb7fd28879c9734c5189259cde574afb0a4f7708b0b84c5a514c996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a96e0c2dc207504189aac5f2822e4fc8fdc58a19388a3d081553ecec07f03bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0bc91552a8f6c9f83684aa851ef1b07fa4562c736427c3264762f4486b65c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:42Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.541655 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f4f451-5ba1-439c-9987-d2d8d37129e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab9492d73e1ced7e8b9dcfbf64ede97fb7c53def5e290efe2320d37d5f8a3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94dc3b7cc39c67b95708f5a4b7d2bcf103c565c5c868684fa838816e882c720\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://86bd2df729ce7029714c942828cff7e13c738eb5d918fc7dfdefe16e5420fc98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T07:46:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 07:46:26.195650 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 07:46:26.195886 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 07:46:26.197650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1771579011/tls.crt::/tmp/serving-cert-1771579011/tls.key\\\\\\\"\\\\nI1009 07:46:26.707018 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 07:46:26.710937 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 07:46:26.710964 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 07:46:26.710986 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 07:46:26.710992 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 07:46:26.721297 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 07:46:26.721350 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721363 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721375 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 07:46:26.721386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1009 07:46:26.721377 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 07:46:26.721396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 07:46:26.721462 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 07:46:26.723740 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14232d9805b9847774597840c84b29709285393122781fe95af059e50c285ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:42Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.556287 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1166d9eb763c499c126069c02d693a608549e5cbb8d4862551b7555100324b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:42Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.570410 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:42Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.572265 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.572291 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.572303 4715 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.572319 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.572329 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:42Z","lastTransitionTime":"2025-10-09T07:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.582205 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:42Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.594924 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a1a53d8-70da-4f6d-b92f-801a563952ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19465e3367078df139314e3b29a1b05d15c7ab22cb681c92e2a0394aaaaf887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b8a525d8b7ec3e08d688a4f5419e937a01e5dfa1de58caa9e3fad5ee5ed593f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8906a42b46d23c122035098bfd88203a6418fe2e0ef806e7babbc9670e2c89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d17b0d82be9febaeb884dea2cfb61c5f189c0fce2aff03c02bbf020d89828f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:42Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.608791 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://770b320ad49f63618e01bc73df4df10cb694b01d658727bb395ff59e6a609442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://958c52c695933700cd3b19f8c6539c5566827f57a22ed1fea9b6326e2261f673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:42Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.626039 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76f34f31-285e-4f90-954d-888a59ad6080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4996d81a0257313b571696eae1c0c7a590b2282472852505b7f60ab07ae4e7fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d905d
da0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d905dda0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8gf4x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:42Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.648101 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6cb14a-7329-4a80-aff2-acd9142558d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad12f64b1c0fdde9a522e1865b3e364da8fd7260057d3d3077d60cb82b9c258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7898580fa0e3f40e26a6aa8da1a4997577bb4e2e5627df3689ddfd90b720890c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"message\\\":\\\"946 5943 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1009 07:46:41.002971 5943 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1009 07:46:41.002987 5943 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1009 07:46:41.003185 5943 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1009 07:46:41.006848 5943 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1009 07:46:41.006886 5943 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1009 07:46:41.006892 5943 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1009 07:46:41.006915 5943 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1009 07:46:41.006920 5943 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1009 07:46:41.006924 5943 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1009 07:46:41.006955 5943 handler.go:208] Removed *v1.Node event handler 7\\\\nI1009 07:46:41.006959 5943 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1009 07:46:41.006970 5943 factory.go:656] Stopping watch factory\\\\nI1009 07:46:41.006972 5943 handler.go:208] Removed *v1.Node event handler 2\\\\nI1009 07:46:41.006985 5943 ovnkube.go:599] Stopped ovnkube\\\\nI1009 
07\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z9ztn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:42Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.675231 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.675277 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.675289 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.675305 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.675317 4715 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:42Z","lastTransitionTime":"2025-10-09T07:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.777682 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.777784 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.777802 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.777831 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.777849 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:42Z","lastTransitionTime":"2025-10-09T07:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.880179 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.880229 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.880241 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.880260 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.880277 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:42Z","lastTransitionTime":"2025-10-09T07:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.983925 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.984032 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.984057 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.984089 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:42 crc kubenswrapper[4715]: I1009 07:46:42.984112 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:42Z","lastTransitionTime":"2025-10-09T07:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.086845 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.086900 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.086912 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.086932 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.086945 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:43Z","lastTransitionTime":"2025-10-09T07:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.136494 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.136560 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.136650 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:46:43 crc kubenswrapper[4715]: E1009 07:46:43.136770 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:46:43 crc kubenswrapper[4715]: E1009 07:46:43.136908 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:46:43 crc kubenswrapper[4715]: E1009 07:46:43.137168 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.189270 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.189353 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.189369 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.189392 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.189410 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:43Z","lastTransitionTime":"2025-10-09T07:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.292275 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.292350 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.292385 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.292448 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.292473 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:43Z","lastTransitionTime":"2025-10-09T07:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.395677 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.395747 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.395760 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.395780 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.395793 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:43Z","lastTransitionTime":"2025-10-09T07:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.424827 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z9ztn_1d6cb14a-7329-4a80-aff2-acd9142558d3/ovnkube-controller/1.log" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.425646 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z9ztn_1d6cb14a-7329-4a80-aff2-acd9142558d3/ovnkube-controller/0.log" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.428539 4715 generic.go:334] "Generic (PLEG): container finished" podID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerID="6ad12f64b1c0fdde9a522e1865b3e364da8fd7260057d3d3077d60cb82b9c258" exitCode=1 Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.428586 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" event={"ID":"1d6cb14a-7329-4a80-aff2-acd9142558d3","Type":"ContainerDied","Data":"6ad12f64b1c0fdde9a522e1865b3e364da8fd7260057d3d3077d60cb82b9c258"} Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.428627 4715 scope.go:117] "RemoveContainer" containerID="7898580fa0e3f40e26a6aa8da1a4997577bb4e2e5627df3689ddfd90b720890c" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.429838 4715 scope.go:117] "RemoveContainer" containerID="6ad12f64b1c0fdde9a522e1865b3e364da8fd7260057d3d3077d60cb82b9c258" Oct 09 07:46:43 crc kubenswrapper[4715]: E1009 07:46:43.430131 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-z9ztn_openshift-ovn-kubernetes(1d6cb14a-7329-4a80-aff2-acd9142558d3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.447227 4715 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://770b320ad49f63618e01bc73df4df10cb694b01d658727bb395ff59e6a609442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://958c52c695933700cd3b19f8c6539c5566827f57a22ed1fea9b6326e2261f673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:43Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.466104 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76f34f31-285e-4f90-954d-888a59ad6080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4996d81a0257313b571696eae1c0c7a590b2282472852505b7f60ab07ae4e7fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d905d
da0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d905dda0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8gf4x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:43Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.496854 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6cb14a-7329-4a80-aff2-acd9142558d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad12f64b1c0fdde9a522e1865b3e364da8fd7260057d3d3077d60cb82b9c258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7898580fa0e3f40e26a6aa8da1a4997577bb4e2e5627df3689ddfd90b720890c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"message\\\":\\\"946 5943 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1009 07:46:41.002971 5943 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1009 07:46:41.002987 5943 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1009 07:46:41.003185 5943 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1009 07:46:41.006848 5943 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1009 07:46:41.006886 5943 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1009 07:46:41.006892 5943 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1009 07:46:41.006915 5943 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1009 07:46:41.006920 5943 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1009 07:46:41.006924 5943 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1009 07:46:41.006955 5943 handler.go:208] Removed *v1.Node event handler 7\\\\nI1009 07:46:41.006959 5943 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1009 07:46:41.006970 5943 factory.go:656] Stopping watch factory\\\\nI1009 07:46:41.006972 5943 handler.go:208] Removed *v1.Node event handler 2\\\\nI1009 07:46:41.006985 5943 ovnkube.go:599] Stopped ovnkube\\\\nI1009 07\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad12f64b1c0fdde9a522e1865b3e364da8fd7260057d3d3077d60cb82b9c258\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T07:46:42Z\\\",\\\"message\\\":\\\"itches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1009 07:46:42.692267 6122 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1009 07:46:42.692108 6122 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1009 07:46:42.692439 6122 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}\\\\nI1009 07:46:42.692454 6122 services_controller.go:360] Finished syncing service api on namespace openshift-apiserver for network=default : 1.506102ms\\\\nI1009 07:46:42.692466 6122 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI1009 07:46:42.692467 6122 services_controller.go:356] Processing sync for service openshift-ingress/router-internal-default for network=default\\\\nF1009 07:46:42.692492 6122 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host
-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z9ztn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:43Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.499028 4715 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.499074 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.499086 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.499103 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.499112 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:43Z","lastTransitionTime":"2025-10-09T07:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.512382 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a1a53d8-70da-4f6d-b92f-801a563952ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19465e3367078df139314e3b29a1b05d15c7ab22cb681c92e2a0394aaaaf887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b8a525d8b7
ec3e08d688a4f5419e937a01e5dfa1de58caa9e3fad5ee5ed593f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8906a42b46d23c122035098bfd88203a6418fe2e0ef806e7babbc9670e2c89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d17b0d82be9febaeb884dea2cfb61c5f189c0fce2aff03c02bbf020d89828f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:43Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.531532 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6vp75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz46q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6vp75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:43Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.542604 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqt86" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c54c0f2-0671-4f29-a4b8-7ea32758200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a835e316a2f8a0cc8bf44d5edd66b376fd20a6f7bf6a467a611e04e5fcc9993f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkfzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqt86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:43Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.554622 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:43Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.572741 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8875bf33dca9b2d1d7bf66aaeb2fa239b455ea46d1e6790a9f6e1c5c2da2ec6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T07:46:43Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.584469 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5tfxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a186a549-1c86-4777-97e8-04df48fad842\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1312ab6651462ae52831c89894987a598b1623159dddca34a4848dfbc86191ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-mdktp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5tfxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:43Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.601835 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.601871 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.601882 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.601899 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.601911 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:43Z","lastTransitionTime":"2025-10-09T07:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.614506 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f4f451-5ba1-439c-9987-d2d8d37129e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab9492d73e1ced7e8b9dcfbf64ede97fb7c53def5e290efe2320d37d5f8a3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94dc3b7cc39c67b95708f5a4b7d2bcf103c565c5c868684fa838816e882c720\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://86bd2df729ce7029714c942828cff7e13c738eb5d918fc7dfdefe16e5420fc98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T07:46:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 07:46:26.195650 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 07:46:26.195886 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 07:46:26.197650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1771579011/tls.crt::/tmp/serving-cert-1771579011/tls.key\\\\\\\"\\\\nI1009 07:46:26.707018 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 07:46:26.710937 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 07:46:26.710964 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 07:46:26.710986 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 07:46:26.710992 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 07:46:26.721297 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 07:46:26.721350 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721363 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721375 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 07:46:26.721386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1009 07:46:26.721377 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 07:46:26.721396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 07:46:26.721462 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 07:46:26.723740 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14232d9805b9847774597840c84b29709285393122781fe95af059e50c285ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:43Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.630919 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1166d9eb763c499c126069c02d693a608549e5cbb8d4862551b7555100324b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:43Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.647931 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:43Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.663262 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:43Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.680755 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafd807-8875-4b4f-aba9-4f807ca336e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1277c6a868bcd62e2cfc7dda77ccba4f206f4216eec40ceb53ed8c09aebd5eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eab9be18db2c21136a797167f3282bba0639147
e04085d9c930fe113cd5bc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7vwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:43Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.704567 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.704637 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.704656 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:43 crc 
kubenswrapper[4715]: I1009 07:46:43.704681 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.704699 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:43Z","lastTransitionTime":"2025-10-09T07:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.704722 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8095fd96-32bb-459e-b524-6cf679b95b21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc320b6b98a82e720d488ce9958599e2f732919ac43ccb3834e5dd90042077ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7424a86e3801e7aea51cf175c8cbb65ae15a4df07426022cf9e4ba6b82c13924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://149ab2506eb7fd28879c9734c5189259cde574afb0a4f7708b0b84c5a514c996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a96e0c2dc207504189aac5f2822e4fc8fdc58a19388a3d081553ecec07f03bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0bc91552a8f6c9f83684aa851ef1b07fa4562c736427c3264762f4486b65c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/
\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:43Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.807506 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.807579 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.807597 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:43 crc 
kubenswrapper[4715]: I1009 07:46:43.807626 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.807644 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:43Z","lastTransitionTime":"2025-10-09T07:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.910581 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.910629 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.910641 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.910657 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:43 crc kubenswrapper[4715]: I1009 07:46:43.910679 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:43Z","lastTransitionTime":"2025-10-09T07:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.014631 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.014684 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.014697 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.014716 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.014729 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:44Z","lastTransitionTime":"2025-10-09T07:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.117134 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.117179 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.117190 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.117205 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.117214 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:44Z","lastTransitionTime":"2025-10-09T07:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.220109 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.220448 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.220462 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.220481 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.220495 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:44Z","lastTransitionTime":"2025-10-09T07:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.323851 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.323921 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.323940 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.323967 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.323986 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:44Z","lastTransitionTime":"2025-10-09T07:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.375259 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ksbvn"] Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.375924 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ksbvn" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.379650 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.379656 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.396489 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5tfxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a186a549-1c86-4777-97e8-04df48fad842\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1312ab6651462ae52831c89894987a598b1623159dddca34a4848dfbc86191ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445
c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdktp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5tfxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:44Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.399971 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ksbvn\" (UID: \"dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ksbvn" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.400030 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ksbvn\" (UID: \"dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ksbvn" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.400072 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ksbvn\" (UID: \"dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ksbvn" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.400145 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97crn\" (UniqueName: \"kubernetes.io/projected/dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f-kube-api-access-97crn\") pod \"ovnkube-control-plane-749d76644c-ksbvn\" (UID: \"dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ksbvn" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.419583 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ksbvn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:44Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97crn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97crn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ksbvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:44Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.426309 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.426373 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.426393 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.426463 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.426488 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:44Z","lastTransitionTime":"2025-10-09T07:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.434620 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z9ztn_1d6cb14a-7329-4a80-aff2-acd9142558d3/ovnkube-controller/1.log" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.437473 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafd807-8875-4b4f-aba9-4f807ca336e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1277c6a868bcd62e2cfc7dda77ccba4f206f4216eec40ceb53ed8c09aebd5eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"
startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eab9be18db2c21136a797167f3282bba0639147e04085d9c930fe113cd5bc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7vwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:44Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 
07:46:44.470560 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8095fd96-32bb-459e-b524-6cf679b95b21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc320b6b98a82e720d488ce9958599e2f732919ac43ccb3834e5dd90042077ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://7424a86e3801e7aea51cf175c8cbb65ae15a4df07426022cf9e4ba6b82c13924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://149ab2506eb7fd28879c9734c5189259cde574afb0a4f7708b0b84c5a514c996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a96e0c2dc207504189aac5f2822e4fc8fdc58a19388a3d081553ecec07f03bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0bc91552a8f6c9f83684aa851ef1b07fa4562c736427c3264762f4486b65c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":
{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:44Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.487875 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f4f451-5ba1-439c-9987-d2d8d37129e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab9492d73e1ced7e8b9dcfbf64ede97fb7c53def5e290efe2320d37d5f8a3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94dc3b7cc39c67b95708f5a4b7d2bcf103c565c5c868684fa838816e882c720\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://86bd2df729ce7029714c942828cff7e13c738eb5d918fc7dfdefe16e5420fc98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T07:46:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 07:46:26.195650 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 07:46:26.195886 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 07:46:26.197650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1771579011/tls.crt::/tmp/serving-cert-1771579011/tls.key\\\\\\\"\\\\nI1009 07:46:26.707018 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 07:46:26.710937 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 07:46:26.710964 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 07:46:26.710986 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 07:46:26.710992 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 07:46:26.721297 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 07:46:26.721350 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721363 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721375 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 07:46:26.721386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1009 07:46:26.721377 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 07:46:26.721396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 07:46:26.721462 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 07:46:26.723740 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14232d9805b9847774597840c84b29709285393122781fe95af059e50c285ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:44Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.500859 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ksbvn\" (UID: \"dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ksbvn" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.500918 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ksbvn\" (UID: \"dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ksbvn" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.500956 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ksbvn\" (UID: \"dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ksbvn" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.501001 4715 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-97crn\" (UniqueName: \"kubernetes.io/projected/dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f-kube-api-access-97crn\") pod \"ovnkube-control-plane-749d76644c-ksbvn\" (UID: \"dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ksbvn" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.501833 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ksbvn\" (UID: \"dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ksbvn" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.501913 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ksbvn\" (UID: \"dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ksbvn" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.505286 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1166d9eb763c499c126069c02d693a608549e5cbb8d4862551b7555100324b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:44Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.507860 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ksbvn\" (UID: \"dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ksbvn" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.517214 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97crn\" (UniqueName: \"kubernetes.io/projected/dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f-kube-api-access-97crn\") pod \"ovnkube-control-plane-749d76644c-ksbvn\" (UID: \"dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ksbvn" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.519868 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:44Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.529303 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.529365 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.529381 4715 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.529403 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.529415 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:44Z","lastTransitionTime":"2025-10-09T07:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.533475 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:44Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.546854 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a1a53d8-70da-4f6d-b92f-801a563952ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19465e3367078df139314e3b29a1b05d15c7ab22cb681c92e2a0394aaaaf887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b8a525d8b7ec3e08d688a4f5419e937a01e5dfa1de58caa9e3fad5ee5ed593f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8906a42b46d23c122035098bfd88203a6418fe2e0ef806e7babbc9670e2c89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d17b0d82be9febaeb884dea2cfb61c5f189c0fce2aff03c02bbf020d89828f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:44Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.562657 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://770b320ad49f63618e01bc73df4df10cb694b01d658727bb395ff59e6a609442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://958c52c695933700cd3b19f8c6539c5566827f57a22ed1fea9b6326e2261f673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:44Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.578739 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76f34f31-285e-4f90-954d-888a59ad6080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4996d81a0257313b571696eae1c0c7a590b2282472852505b7f60ab07ae4e7fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d905d
da0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d905dda0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8gf4x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:44Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.598273 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6cb14a-7329-4a80-aff2-acd9142558d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad12f64b1c0fdde9a522e1865b3e364da8fd7260057d3d3077d60cb82b9c258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7898580fa0e3f40e26a6aa8da1a4997577bb4e2e5627df3689ddfd90b720890c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"message\\\":\\\"946 5943 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1009 07:46:41.002971 5943 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1009 07:46:41.002987 5943 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1009 07:46:41.003185 5943 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1009 07:46:41.006848 5943 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1009 07:46:41.006886 5943 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1009 07:46:41.006892 5943 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1009 07:46:41.006915 5943 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1009 07:46:41.006920 5943 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1009 07:46:41.006924 5943 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1009 07:46:41.006955 5943 handler.go:208] Removed *v1.Node event handler 7\\\\nI1009 07:46:41.006959 5943 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1009 07:46:41.006970 5943 factory.go:656] Stopping watch factory\\\\nI1009 07:46:41.006972 5943 handler.go:208] Removed *v1.Node event handler 2\\\\nI1009 07:46:41.006985 5943 ovnkube.go:599] Stopped ovnkube\\\\nI1009 07\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad12f64b1c0fdde9a522e1865b3e364da8fd7260057d3d3077d60cb82b9c258\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T07:46:42Z\\\",\\\"message\\\":\\\"itches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1009 07:46:42.692267 6122 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1009 07:46:42.692108 6122 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1009 07:46:42.692439 6122 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}\\\\nI1009 07:46:42.692454 6122 services_controller.go:360] Finished syncing service api on namespace openshift-apiserver for network=default : 1.506102ms\\\\nI1009 07:46:42.692466 6122 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI1009 07:46:42.692467 6122 services_controller.go:356] Processing sync for service openshift-ingress/router-internal-default for network=default\\\\nF1009 07:46:42.692492 6122 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host
-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z9ztn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:44Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.613014 4715 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:44Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.627313 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8875bf33dca9b2d1d7bf66aaeb2fa239b455ea46d1e6790a9f6e1c5c2da2ec6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T07:46:44Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.631845 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.631893 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.631905 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.631925 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.631939 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:44Z","lastTransitionTime":"2025-10-09T07:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.642159 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6vp75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz46q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6vp75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:44Z 
is after 2025-08-24T17:21:41Z" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.652290 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqt86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c54c0f2-0671-4f29-a4b8-7ea32758200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a835e316a2f8a0cc8bf44d5edd66b376fd20a6f7bf6a467a611e04e5fcc9993f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkfzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqt86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:44Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.699213 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ksbvn" Oct 09 07:46:44 crc kubenswrapper[4715]: W1009 07:46:44.715100 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd48d949_08f9_4a54_ae1c_fe0cfbbcf08f.slice/crio-db163de2de599764eaff712fbfab7e56cff0a222b2a545fa6239c55698e69da8 WatchSource:0}: Error finding container db163de2de599764eaff712fbfab7e56cff0a222b2a545fa6239c55698e69da8: Status 404 returned error can't find the container with id db163de2de599764eaff712fbfab7e56cff0a222b2a545fa6239c55698e69da8 Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.735102 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.735158 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.735169 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.735192 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.735210 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:44Z","lastTransitionTime":"2025-10-09T07:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.838083 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.838119 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.838128 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.838144 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.838155 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:44Z","lastTransitionTime":"2025-10-09T07:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.940106 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.940138 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.940147 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.940161 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:44 crc kubenswrapper[4715]: I1009 07:46:44.940171 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:44Z","lastTransitionTime":"2025-10-09T07:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.044296 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.044354 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.044413 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.044453 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.044467 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:45Z","lastTransitionTime":"2025-10-09T07:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.136825 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.136959 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:46:45 crc kubenswrapper[4715]: E1009 07:46:45.137004 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:46:45 crc kubenswrapper[4715]: E1009 07:46:45.137185 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.137414 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:46:45 crc kubenswrapper[4715]: E1009 07:46:45.137525 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.152636 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.152669 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.152677 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.152692 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.152701 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:45Z","lastTransitionTime":"2025-10-09T07:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.256057 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.256099 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.256114 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.256134 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.256147 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:45Z","lastTransitionTime":"2025-10-09T07:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.358889 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.358945 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.358964 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.358991 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.359008 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:45Z","lastTransitionTime":"2025-10-09T07:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.449929 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ksbvn" event={"ID":"dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f","Type":"ContainerStarted","Data":"f3111e48e9ab42467dbae06523e433e0f52ace4f6552d43674fa52010d57b409"} Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.449993 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ksbvn" event={"ID":"dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f","Type":"ContainerStarted","Data":"2978fac0aaadeb9ab4b6ecfc9249a28d011c2f6fe50e3528e008e08df338f304"} Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.450008 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ksbvn" event={"ID":"dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f","Type":"ContainerStarted","Data":"db163de2de599764eaff712fbfab7e56cff0a222b2a545fa6239c55698e69da8"} Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.461822 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.461882 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.461896 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.461915 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.461928 4715 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:45Z","lastTransitionTime":"2025-10-09T07:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.462720 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8875bf33dca9b2d1d7bf66aaeb2fa239b455ea46d1e6790a9f6e1c5c2da2ec6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:45Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.477523 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6vp75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz46q\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6vp75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:45Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.493064 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqt86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c54c0f2-0671-4f29-a4b8-7ea32758200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a835e316a2f8a0cc8bf44d5edd66b376fd20a6f7bf6a467a611e04e5fcc9993f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkfzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqt86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:45Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.521106 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-fm6s2"] Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.521687 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:46:45 crc kubenswrapper[4715]: E1009 07:46:45.521768 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.539671 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:45Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.560593 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ksbvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2978fac0aaadeb9ab4b6ecfc9249a28d011c2f6fe50e3528e008e08df338f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97crn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3111e48e9ab42467dbae06523e433e0f52ac
e4f6552d43674fa52010d57b409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97crn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ksbvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:45Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.565054 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.565087 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.565099 4715 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.565116 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.565128 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:45Z","lastTransitionTime":"2025-10-09T07:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.578474 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5tfxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a186a549-1c86-4777-97e8-04df48fad842\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1312ab6651462ae52831c89894987a598b1623159dddca34a4848dfbc86191ce\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdktp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5tfxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:45Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.599820 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8095fd96-32bb-459e-b524-6cf679b95b21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc320b6b98a82e720d488ce9958599e2f732919ac43ccb3834e5dd90042077ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7424a86e3801e7aea51cf175c8cbb65ae15a4df07426022cf9e4ba6b82c13924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://149ab2506eb7fd28879c9734c5189259cde574afb0a4f7708b0b84c5a514c996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a96e0c2dc207504189aac5f2822e4fc8fdc58a19388a3d081553ecec07f03bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0bc91552a8f6c9f83684aa851ef1b07fa4562c736427c3264762f4486b65c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:45Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.613364 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbsl8\" (UniqueName: \"kubernetes.io/projected/9a8fb3b8-b254-4bc3-b105-990eac79c77b-kube-api-access-pbsl8\") pod \"network-metrics-daemon-fm6s2\" (UID: \"9a8fb3b8-b254-4bc3-b105-990eac79c77b\") " pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.613467 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a8fb3b8-b254-4bc3-b105-990eac79c77b-metrics-certs\") pod \"network-metrics-daemon-fm6s2\" (UID: \"9a8fb3b8-b254-4bc3-b105-990eac79c77b\") " pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.614494 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f4f451-5ba1-439c-9987-d2d8d37129e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab9492d73e1ced7e8b9dcfbf64ede97fb7c53def5e290efe2320d37d5f8a3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94dc3b7cc39c67b95708f5a4b7d2bcf103c565c5c868684fa838816e882c720\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86bd2df729ce7029714c942828cff7e13c738eb5d918fc7dfdefe16e5420fc98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T07:46:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 07:46:26.195650 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 07:46:26.195886 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 07:46:26.197650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1771579011/tls.crt::/tmp/serving-cert-1771579011/tls.key\\\\\\\"\\\\nI1009 07:46:26.707018 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 07:46:26.710937 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 07:46:26.710964 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 07:46:26.710986 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 07:46:26.710992 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 07:46:26.721297 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 07:46:26.721350 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721363 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721375 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 07:46:26.721386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1009 07:46:26.721377 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 07:46:26.721396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 07:46:26.721462 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 07:46:26.723740 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14232d9805b9847774597840c84b29709285393122781fe95af059e50c285ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:45Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.627767 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1166d9eb763c499c126069c02d693a608549e5cbb8d4862551b7555100324b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:45Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.639940 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:45Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.650987 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:45Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.662961 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafd807-8875-4b4f-aba9-4f807ca336e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1277c6a868bcd62e2cfc7dda77ccba4f206f4216eec40ceb53ed8c09aebd5eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eab9be18db2c21136a797167f3282bba0639147
e04085d9c930fe113cd5bc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7vwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:45Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.667295 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.667342 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.667356 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:45 crc 
kubenswrapper[4715]: I1009 07:46:45.667377 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.667392 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:45Z","lastTransitionTime":"2025-10-09T07:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.677719 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://770b320ad49f63618e01bc73df4df10cb694b01d658727bb395ff59e6a609442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://958c52c695933700cd3b19f8c6539c5566827f57a22ed1fea9b6326e2261f673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:45Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 
07:46:45.694296 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76f34f31-285e-4f90-954d-888a59ad6080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4996d81a0257313b571696eae1c0c7a590b2282472852505b7f60ab07ae4e7fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{
\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf649
81918a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d905dda0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d905dda0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"
cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8gf4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:45Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.714537 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbsl8\" (UniqueName: \"kubernetes.io/projected/9a8fb3b8-b254-4bc3-b105-990eac79c77b-kube-api-access-pbsl8\") pod \"network-metrics-daemon-fm6s2\" (UID: \"9a8fb3b8-b254-4bc3-b105-990eac79c77b\") " pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.714610 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a8fb3b8-b254-4bc3-b105-990eac79c77b-metrics-certs\") pod \"network-metrics-daemon-fm6s2\" (UID: \"9a8fb3b8-b254-4bc3-b105-990eac79c77b\") " pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:46:45 crc kubenswrapper[4715]: E1009 07:46:45.714723 4715 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 07:46:45 crc kubenswrapper[4715]: E1009 07:46:45.714774 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a8fb3b8-b254-4bc3-b105-990eac79c77b-metrics-certs podName:9a8fb3b8-b254-4bc3-b105-990eac79c77b nodeName:}" failed. No retries permitted until 2025-10-09 07:46:46.214758505 +0000 UTC m=+36.907562523 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9a8fb3b8-b254-4bc3-b105-990eac79c77b-metrics-certs") pod "network-metrics-daemon-fm6s2" (UID: "9a8fb3b8-b254-4bc3-b105-990eac79c77b") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.716135 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6cb14a-7329-4a80-aff2-acd9142558d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad12f64b1c0fdde9a522e1865b3e364da8fd7260057d3d3077d60cb82b9c258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7898580fa0e3f40e26a6aa8da1a4997577bb4e2e5627df3689ddfd90b720890c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"message\\\":\\\"946 5943 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1009 07:46:41.002971 5943 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1009 07:46:41.002987 5943 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1009 07:46:41.003185 5943 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1009 07:46:41.006848 5943 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1009 07:46:41.006886 5943 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1009 07:46:41.006892 5943 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1009 07:46:41.006915 5943 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1009 07:46:41.006920 5943 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1009 07:46:41.006924 5943 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1009 07:46:41.006955 5943 handler.go:208] Removed *v1.Node event handler 7\\\\nI1009 07:46:41.006959 5943 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1009 07:46:41.006970 5943 factory.go:656] Stopping watch factory\\\\nI1009 07:46:41.006972 5943 handler.go:208] Removed *v1.Node event handler 2\\\\nI1009 07:46:41.006985 5943 ovnkube.go:599] Stopped ovnkube\\\\nI1009 07\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad12f64b1c0fdde9a522e1865b3e364da8fd7260057d3d3077d60cb82b9c258\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T07:46:42Z\\\",\\\"message\\\":\\\"itches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1009 07:46:42.692267 6122 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1009 07:46:42.692108 6122 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1009 07:46:42.692439 6122 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}\\\\nI1009 07:46:42.692454 6122 services_controller.go:360] Finished syncing service api on namespace openshift-apiserver for network=default : 1.506102ms\\\\nI1009 07:46:42.692466 6122 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI1009 07:46:42.692467 6122 services_controller.go:356] Processing sync for service openshift-ingress/router-internal-default for network=default\\\\nF1009 07:46:42.692492 6122 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host
-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z9ztn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:45Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.728982 4715 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a1a53d8-70da-4f6d-b92f-801a563952ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19465e3367078df139314e3b29a1b05d15c7ab22cb681c92e2a0394aaaaf887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b8a525d8b7ec3e08d688a4f5419e937a01e5dfa1de58caa9e3fad5ee5ed593f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8906a42b46d23c122035098bfd88203a6418fe2e0ef806e7babbc9670e2c89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d17b0d82be9febaeb884dea2cfb61c5f189c0fce2aff03c02bbf020d89828f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:45Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.734077 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbsl8\" (UniqueName: \"kubernetes.io/projected/9a8fb3b8-b254-4bc3-b105-990eac79c77b-kube-api-access-pbsl8\") pod \"network-metrics-daemon-fm6s2\" (UID: \"9a8fb3b8-b254-4bc3-b105-990eac79c77b\") " pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.740813 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5tfxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a186a549-1c86-4777-97e8-04df48fad842\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1312ab6651462ae52831c89894987a598b1623159dddca34a4848dfbc86191ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdktp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5tfxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:45Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.753747 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ksbvn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2978fac0aaadeb9ab4b6ecfc9249a28d011c2f6fe50e3528e008e08df338f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97crn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3111e48e9ab42467dbae06523e433e0f52ace4f6552d43674fa52010d57b409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97crn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ksbvn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:45Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.769753 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.769793 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.769805 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.769822 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.769834 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:45Z","lastTransitionTime":"2025-10-09T07:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.777363 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8095fd96-32bb-459e-b524-6cf679b95b21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc320b6b98a82e720d488ce9958599e2f732919ac43ccb3834e5dd90042077ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7424a86e3801e7aea51cf175c8cbb65ae15a4df07426022cf9e4ba6b82c13924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://149ab2506eb7fd28879c9734c5189259cde574afb0a4f7708b0b84c5a514c996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a96e0c2dc207504189aac5f2822e4fc8fdc58a19388a3d081553ecec07f03bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0bc91552a8f6c9f83684aa851ef1b07fa4562c736427c3264762f4486b65c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-09T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:45Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.790316 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f4f451-5ba1-439c-9987-d2d8d37129e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab9492d73e1ced7e8b9dcfbf64ede97fb7c53def5e290efe2320d37d5f8a3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94dc3b7cc39c67b95708f5a4b7d2bcf103c565c5c868684fa838816e882c720\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://86bd2df729ce7029714c942828cff7e13c738eb5d918fc7dfdefe16e5420fc98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T07:46:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 07:46:26.195650 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 07:46:26.195886 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 07:46:26.197650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1771579011/tls.crt::/tmp/serving-cert-1771579011/tls.key\\\\\\\"\\\\nI1009 07:46:26.707018 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 07:46:26.710937 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 07:46:26.710964 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 07:46:26.710986 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 07:46:26.710992 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 07:46:26.721297 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 07:46:26.721350 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721363 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721375 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 07:46:26.721386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1009 07:46:26.721377 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 07:46:26.721396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 07:46:26.721462 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 07:46:26.723740 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14232d9805b9847774597840c84b29709285393122781fe95af059e50c285ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:45Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.801601 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1166d9eb763c499c126069c02d693a608549e5cbb8d4862551b7555100324b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:45Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.815720 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:45Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.828227 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:45Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.839719 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafd807-8875-4b4f-aba9-4f807ca336e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1277c6a868bcd62e2cfc7dda77ccba4f206f4216eec40ceb53ed8c09aebd5eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eab9be18db2c21136a797167f3282bba0639147
e04085d9c930fe113cd5bc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7vwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:45Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.850455 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a1a53d8-70da-4f6d-b92f-801a563952ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19465e3367078df139314e3b29a1b05d15c7ab22cb681c92e2a0394aaaaf887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b8a525d8b7ec3e08d688a4f5419e937a01e5dfa1de58caa9e3fad5ee5ed593f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8906a42b46d23c122035098bfd88203a6418fe2e0ef806e7babbc9670e2c89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d17b0d82be9febaeb884dea2cfb61c5f189c0fce2aff03c02bbf020d89828f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:45Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.864450 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://770b320ad49f63618e01bc73df4df10cb694b01d658727bb395ff59e6a609442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://958c52c695933700cd3b19f8c6539c5566827f57a22ed1fea9b6326e2261f673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:45Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.871597 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.871638 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.871650 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.871669 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.871682 4715 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:45Z","lastTransitionTime":"2025-10-09T07:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.881286 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76f34f31-285e-4f90-954d-888a59ad6080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4996d81a0257313b571696eae1c0c7a590b2282472852505b7f60ab07ae4e7fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:35Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d905dda0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d905dda0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8gf4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:45Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.898344 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6cb14a-7329-4a80-aff2-acd9142558d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad12f64b1c0fdde9a522e1865b3e364da8fd7260057d3d3077d60cb82b9c258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7898580fa0e3f40e26a6aa8da1a4997577bb4e2e5627df3689ddfd90b720890c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"message\\\":\\\"946 5943 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1009 
07:46:41.002971 5943 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1009 07:46:41.002987 5943 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1009 07:46:41.003185 5943 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1009 07:46:41.006848 5943 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1009 07:46:41.006886 5943 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1009 07:46:41.006892 5943 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1009 07:46:41.006915 5943 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1009 07:46:41.006920 5943 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1009 07:46:41.006924 5943 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1009 07:46:41.006955 5943 handler.go:208] Removed *v1.Node event handler 7\\\\nI1009 07:46:41.006959 5943 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1009 07:46:41.006970 5943 factory.go:656] Stopping watch factory\\\\nI1009 07:46:41.006972 5943 handler.go:208] Removed *v1.Node event handler 2\\\\nI1009 07:46:41.006985 5943 ovnkube.go:599] Stopped ovnkube\\\\nI1009 07\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad12f64b1c0fdde9a522e1865b3e364da8fd7260057d3d3077d60cb82b9c258\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T07:46:42Z\\\",\\\"message\\\":\\\"itches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1009 07:46:42.692267 6122 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1009 07:46:42.692108 6122 ovn.go:134] Ensuring zone local for Pod 
openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1009 07:46:42.692439 6122 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}\\\\nI1009 07:46:42.692454 6122 services_controller.go:360] Finished syncing service api on namespace openshift-apiserver for network=default : 1.506102ms\\\\nI1009 07:46:42.692466 6122 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI1009 07:46:42.692467 6122 services_controller.go:356] Processing sync for service openshift-ingress/router-internal-default for network=default\\\\nF1009 07:46:42.692492 6122 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, 
handle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa70
2a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z9ztn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:45Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.909795 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:45Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.923383 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8875bf33dca9b2d1d7bf66aaeb2fa239b455ea46d1e6790a9f6e1c5c2da2ec6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T07:46:45Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.935348 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6vp75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz46q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6vp75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T07:46:45Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.944105 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqt86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c54c0f2-0671-4f29-a4b8-7ea32758200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a835e316a2f8a0cc8bf44d5edd66b376fd20a6f7bf6a467a611e04e5fcc9993f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkfzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqt86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:45Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.952879 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fm6s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8fb3b8-b254-4bc3-b105-990eac79c77b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbsl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbsl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fm6s2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:45Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.973914 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.973962 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.973974 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.973994 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:45 crc kubenswrapper[4715]: I1009 07:46:45.974008 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:45Z","lastTransitionTime":"2025-10-09T07:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.077332 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.077384 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.077399 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.077446 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.077462 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:46Z","lastTransitionTime":"2025-10-09T07:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.180236 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.180313 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.180345 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.180380 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.180402 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:46Z","lastTransitionTime":"2025-10-09T07:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.220800 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a8fb3b8-b254-4bc3-b105-990eac79c77b-metrics-certs\") pod \"network-metrics-daemon-fm6s2\" (UID: \"9a8fb3b8-b254-4bc3-b105-990eac79c77b\") " pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:46:46 crc kubenswrapper[4715]: E1009 07:46:46.221026 4715 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 07:46:46 crc kubenswrapper[4715]: E1009 07:46:46.221175 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a8fb3b8-b254-4bc3-b105-990eac79c77b-metrics-certs podName:9a8fb3b8-b254-4bc3-b105-990eac79c77b nodeName:}" failed. No retries permitted until 2025-10-09 07:46:47.221146264 +0000 UTC m=+37.913950262 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9a8fb3b8-b254-4bc3-b105-990eac79c77b-metrics-certs") pod "network-metrics-daemon-fm6s2" (UID: "9a8fb3b8-b254-4bc3-b105-990eac79c77b") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.283531 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.283604 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.283627 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.283657 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.283681 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:46Z","lastTransitionTime":"2025-10-09T07:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.386665 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.386733 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.386752 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.386779 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.386797 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:46Z","lastTransitionTime":"2025-10-09T07:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.453068 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.453495 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.453696 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.453936 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.454146 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:46Z","lastTransitionTime":"2025-10-09T07:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:46 crc kubenswrapper[4715]: E1009 07:46:46.475515 4715 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"88c6bc2d-8227-4dff-bf57-494ec73b39f9\\\",\\\"systemUUID\\\":\\\"25873b5a-8b59-46be-9c14-6241a2c78490\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:46Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.482638 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.482683 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.482692 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.482710 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.482725 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:46Z","lastTransitionTime":"2025-10-09T07:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:46 crc kubenswrapper[4715]: E1009 07:46:46.502086 4715 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"88c6bc2d-8227-4dff-bf57-494ec73b39f9\\\",\\\"systemUUID\\\":\\\"25873b5a-8b59-46be-9c14-6241a2c78490\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:46Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.507168 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.507250 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.507276 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.507312 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.507339 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:46Z","lastTransitionTime":"2025-10-09T07:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:46 crc kubenswrapper[4715]: E1009 07:46:46.529610 4715 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"88c6bc2d-8227-4dff-bf57-494ec73b39f9\\\",\\\"systemUUID\\\":\\\"25873b5a-8b59-46be-9c14-6241a2c78490\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:46Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.535218 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.535262 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.535271 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.535287 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.535300 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:46Z","lastTransitionTime":"2025-10-09T07:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:46 crc kubenswrapper[4715]: E1009 07:46:46.556826 4715 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"88c6bc2d-8227-4dff-bf57-494ec73b39f9\\\",\\\"systemUUID\\\":\\\"25873b5a-8b59-46be-9c14-6241a2c78490\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:46Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.560917 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.560974 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.561003 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.561020 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.561028 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:46Z","lastTransitionTime":"2025-10-09T07:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:46 crc kubenswrapper[4715]: E1009 07:46:46.575798 4715 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"88c6bc2d-8227-4dff-bf57-494ec73b39f9\\\",\\\"systemUUID\\\":\\\"25873b5a-8b59-46be-9c14-6241a2c78490\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:46Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:46 crc kubenswrapper[4715]: E1009 07:46:46.575913 4715 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.577891 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.577929 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.577938 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.577957 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.577967 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:46Z","lastTransitionTime":"2025-10-09T07:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.680265 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.680656 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.680781 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.680889 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.680982 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:46Z","lastTransitionTime":"2025-10-09T07:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.726981 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.727158 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.727201 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:46:46 crc kubenswrapper[4715]: E1009 07:46:46.727368 4715 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 07:46:46 crc kubenswrapper[4715]: E1009 07:46:46.727475 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 07:47:02.727453092 +0000 UTC m=+53.420257110 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 07:46:46 crc kubenswrapper[4715]: E1009 07:46:46.727661 4715 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 07:46:46 crc kubenswrapper[4715]: E1009 07:46:46.727766 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:47:02.72773147 +0000 UTC m=+53.420535468 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:46:46 crc kubenswrapper[4715]: E1009 07:46:46.727844 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 07:47:02.727835023 +0000 UTC m=+53.420639021 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.784780 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.784849 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.784866 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.784891 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.784909 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:46Z","lastTransitionTime":"2025-10-09T07:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.828188 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.828377 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:46:46 crc kubenswrapper[4715]: E1009 07:46:46.828546 4715 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 07:46:46 crc kubenswrapper[4715]: E1009 07:46:46.828616 4715 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 07:46:46 crc kubenswrapper[4715]: E1009 07:46:46.828636 4715 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 07:46:46 crc kubenswrapper[4715]: E1009 07:46:46.828642 4715 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 07:46:46 crc 
kubenswrapper[4715]: E1009 07:46:46.828673 4715 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 07:46:46 crc kubenswrapper[4715]: E1009 07:46:46.828698 4715 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 07:46:46 crc kubenswrapper[4715]: E1009 07:46:46.828770 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-09 07:47:02.828740579 +0000 UTC m=+53.521544627 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 07:46:46 crc kubenswrapper[4715]: E1009 07:46:46.828803 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-09 07:47:02.82878966 +0000 UTC m=+53.521593708 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.888052 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.888118 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.888135 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.888160 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.888177 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:46Z","lastTransitionTime":"2025-10-09T07:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.991380 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.991481 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.991503 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.991528 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:46 crc kubenswrapper[4715]: I1009 07:46:46.991560 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:46Z","lastTransitionTime":"2025-10-09T07:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.094637 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.094702 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.094718 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.094738 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.094754 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:47Z","lastTransitionTime":"2025-10-09T07:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.136724 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.136743 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.136741 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.136769 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:46:47 crc kubenswrapper[4715]: E1009 07:46:47.136973 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:46:47 crc kubenswrapper[4715]: E1009 07:46:47.137203 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:46:47 crc kubenswrapper[4715]: E1009 07:46:47.137342 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:46:47 crc kubenswrapper[4715]: E1009 07:46:47.137518 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.198866 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.198927 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.198944 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.198965 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.198982 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:47Z","lastTransitionTime":"2025-10-09T07:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.232891 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a8fb3b8-b254-4bc3-b105-990eac79c77b-metrics-certs\") pod \"network-metrics-daemon-fm6s2\" (UID: \"9a8fb3b8-b254-4bc3-b105-990eac79c77b\") " pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:46:47 crc kubenswrapper[4715]: E1009 07:46:47.233051 4715 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 07:46:47 crc kubenswrapper[4715]: E1009 07:46:47.233109 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a8fb3b8-b254-4bc3-b105-990eac79c77b-metrics-certs podName:9a8fb3b8-b254-4bc3-b105-990eac79c77b nodeName:}" failed. No retries permitted until 2025-10-09 07:46:49.233095292 +0000 UTC m=+39.925899290 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9a8fb3b8-b254-4bc3-b105-990eac79c77b-metrics-certs") pod "network-metrics-daemon-fm6s2" (UID: "9a8fb3b8-b254-4bc3-b105-990eac79c77b") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.302080 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.302116 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.302126 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.302142 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.302152 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:47Z","lastTransitionTime":"2025-10-09T07:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.406882 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.406931 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.406944 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.406960 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.406976 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:47Z","lastTransitionTime":"2025-10-09T07:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.510766 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.510809 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.510818 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.510833 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.510844 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:47Z","lastTransitionTime":"2025-10-09T07:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.613800 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.614622 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.614668 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.614695 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.614711 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:47Z","lastTransitionTime":"2025-10-09T07:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.718511 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.718661 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.718708 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.718733 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.718749 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:47Z","lastTransitionTime":"2025-10-09T07:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.822291 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.822366 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.822389 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.822455 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.822483 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:47Z","lastTransitionTime":"2025-10-09T07:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.924328 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.924396 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.924413 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.924590 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:47 crc kubenswrapper[4715]: I1009 07:46:47.924614 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:47Z","lastTransitionTime":"2025-10-09T07:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.028889 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.028946 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.028957 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.028976 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.028988 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:48Z","lastTransitionTime":"2025-10-09T07:46:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.132161 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.132209 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.132221 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.132240 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.132252 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:48Z","lastTransitionTime":"2025-10-09T07:46:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.235403 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.235509 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.235529 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.235559 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.235580 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:48Z","lastTransitionTime":"2025-10-09T07:46:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.338349 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.338460 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.338478 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.338505 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.338523 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:48Z","lastTransitionTime":"2025-10-09T07:46:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.441089 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.441141 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.441152 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.441170 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.441182 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:48Z","lastTransitionTime":"2025-10-09T07:46:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.544003 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.544063 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.544073 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.544091 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.544106 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:48Z","lastTransitionTime":"2025-10-09T07:46:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.647900 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.647947 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.647959 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.647978 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.647991 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:48Z","lastTransitionTime":"2025-10-09T07:46:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.751166 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.751228 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.751248 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.751275 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.751299 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:48Z","lastTransitionTime":"2025-10-09T07:46:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.854587 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.854643 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.854656 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.854676 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.854688 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:48Z","lastTransitionTime":"2025-10-09T07:46:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.958266 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.958346 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.958372 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.958403 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:48 crc kubenswrapper[4715]: I1009 07:46:48.958460 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:48Z","lastTransitionTime":"2025-10-09T07:46:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.062010 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.062073 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.062092 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.062113 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.062131 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:49Z","lastTransitionTime":"2025-10-09T07:46:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.136231 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.136324 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.136343 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.136347 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:46:49 crc kubenswrapper[4715]: E1009 07:46:49.136500 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:46:49 crc kubenswrapper[4715]: E1009 07:46:49.136603 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:46:49 crc kubenswrapper[4715]: E1009 07:46:49.136705 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:46:49 crc kubenswrapper[4715]: E1009 07:46:49.136995 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.137476 4715 scope.go:117] "RemoveContainer" containerID="f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.166331 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.166752 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.166764 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.166782 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.166792 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:49Z","lastTransitionTime":"2025-10-09T07:46:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.255799 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a8fb3b8-b254-4bc3-b105-990eac79c77b-metrics-certs\") pod \"network-metrics-daemon-fm6s2\" (UID: \"9a8fb3b8-b254-4bc3-b105-990eac79c77b\") " pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:46:49 crc kubenswrapper[4715]: E1009 07:46:49.255973 4715 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 07:46:49 crc kubenswrapper[4715]: E1009 07:46:49.256032 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a8fb3b8-b254-4bc3-b105-990eac79c77b-metrics-certs podName:9a8fb3b8-b254-4bc3-b105-990eac79c77b nodeName:}" failed. No retries permitted until 2025-10-09 07:46:53.256015841 +0000 UTC m=+43.948819849 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9a8fb3b8-b254-4bc3-b105-990eac79c77b-metrics-certs") pod "network-metrics-daemon-fm6s2" (UID: "9a8fb3b8-b254-4bc3-b105-990eac79c77b") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.269527 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.269557 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.269565 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.269580 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.269589 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:49Z","lastTransitionTime":"2025-10-09T07:46:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.371308 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.371352 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.371363 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.371381 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.371392 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:49Z","lastTransitionTime":"2025-10-09T07:46:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.467650 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.469861 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4f2c6cc41c3fcb7aa04475aef503dfa481735d7d591632251226133ffa9cfec9"} Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.470282 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.473446 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.473494 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.473506 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.473526 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.473541 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:49Z","lastTransitionTime":"2025-10-09T07:46:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.484921 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fm6s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8fb3b8-b254-4bc3-b105-990eac79c77b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbsl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbsl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fm6s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:49Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:49 crc 
kubenswrapper[4715]: I1009 07:46:49.500261 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:49Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.513266 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8875bf33dca9b2d1d7bf66aaeb2fa239b455ea46d1e6790a9f6e1c5c2da2ec6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T07:46:49Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.531602 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6vp75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz46q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6vp75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T07:46:49Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.545974 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqt86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c54c0f2-0671-4f29-a4b8-7ea32758200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a835e316a2f8a0cc8bf44d5edd66b376fd20a6f7bf6a467a611e04e5fcc9993f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkfzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqt86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:49Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.560266 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5tfxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a186a549-1c86-4777-97e8-04df48fad842\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1312ab6651462ae52831c89894987a598b1623159dddca34a4848dfbc86191ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdktp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5tfxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:49Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.573735 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ksbvn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2978fac0aaadeb9ab4b6ecfc9249a28d011c2f6fe50e3528e008e08df338f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97crn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3111e48e9ab42467dbae06523e433e0f52ace4f6552d43674fa52010d57b409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97crn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ksbvn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:49Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.576200 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.576291 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.576305 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.576326 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.576339 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:49Z","lastTransitionTime":"2025-10-09T07:46:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.590655 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:49Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.603711 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:49Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.617455 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafd807-8875-4b4f-aba9-4f807ca336e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1277c6a868bcd62e2cfc7dda77ccba4f206f4216eec40ceb53ed8c09aebd5eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eab9be18db2c21136a797167f3282bba0639147
e04085d9c930fe113cd5bc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7vwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:49Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.640974 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8095fd96-32bb-459e-b524-6cf679b95b21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc320b6b98a82e720d488ce9958599e2f732919ac43ccb3834e5dd90042077ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7424a86e3801e7aea51cf175c8cbb65ae15a4df07426022cf9e4ba6b82c13924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://149ab2506eb7fd28879c9734c5189259cde574afb0a4f7708b0b84c5a514c996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a96e0c2dc207504189aac5f2822e4fc8fdc58a19388a3d081553ecec07f03bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0bc91552a8f6c9f83684aa851ef1b07fa4562c736427c3264762f4486b65c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:49Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.655540 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f4f451-5ba1-439c-9987-d2d8d37129e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab9492d73e1ced7e8b9dcfbf64ede97fb7c53def5e290efe2320d37d5f8a3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94dc3b7cc39c67b95708f5a4b7d2bcf103c565c5c868684fa838816e882c720\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://86bd2df729ce7029714c942828cff7e13c738eb5d918fc7dfdefe16e5420fc98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f2c6cc41c3fcb7aa04475aef503dfa481735d7d591632251226133ffa9cfec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T07:46:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 07:46:26.195650 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 07:46:26.195886 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 07:46:26.197650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1771579011/tls.crt::/tmp/serving-cert-1771579011/tls.key\\\\\\\"\\\\nI1009 07:46:26.707018 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 07:46:26.710937 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 07:46:26.710964 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 07:46:26.710986 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 07:46:26.710992 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 07:46:26.721297 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 07:46:26.721350 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721363 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721375 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 07:46:26.721386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1009 07:46:26.721377 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 07:46:26.721396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 07:46:26.721462 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 07:46:26.723740 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14232d9805b9847774597840c84b29709285393122781fe95af059e50c285ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:49Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.668671 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1166d9eb763c499c126069c02d693a608549e5cbb8d4862551b7555100324b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:49Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.678479 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.678520 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.678530 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.678547 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.678558 4715 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:49Z","lastTransitionTime":"2025-10-09T07:46:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.686410 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6cb14a-7329-4a80-aff2-acd9142558d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad12f64b1c0fdde9a522e1865b3e364da8fd7260057d3d3077d60cb82b9c258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7898580fa0e3f40e26a6aa8da1a4997577bb4e2e5627df3689ddfd90b720890c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"message\\\":\\\"946 5943 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1009 07:46:41.002971 5943 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1009 07:46:41.002987 5943 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1009 07:46:41.003185 5943 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1009 07:46:41.006848 5943 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1009 07:46:41.006886 5943 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1009 07:46:41.006892 5943 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1009 07:46:41.006915 5943 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1009 07:46:41.006920 5943 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1009 07:46:41.006924 5943 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1009 07:46:41.006955 5943 handler.go:208] Removed *v1.Node event handler 7\\\\nI1009 07:46:41.006959 5943 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1009 07:46:41.006970 5943 factory.go:656] Stopping watch factory\\\\nI1009 07:46:41.006972 5943 handler.go:208] Removed *v1.Node event handler 2\\\\nI1009 07:46:41.006985 5943 ovnkube.go:599] Stopped ovnkube\\\\nI1009 07\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad12f64b1c0fdde9a522e1865b3e364da8fd7260057d3d3077d60cb82b9c258\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T07:46:42Z\\\",\\\"message\\\":\\\"itches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1009 07:46:42.692267 6122 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1009 07:46:42.692108 6122 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1009 07:46:42.692439 6122 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}\\\\nI1009 07:46:42.692454 6122 services_controller.go:360] Finished syncing service api on namespace openshift-apiserver for network=default : 1.506102ms\\\\nI1009 07:46:42.692466 6122 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI1009 07:46:42.692467 6122 services_controller.go:356] Processing sync for service openshift-ingress/router-internal-default for network=default\\\\nF1009 07:46:42.692492 6122 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host
-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z9ztn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:49Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.700310 4715 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a1a53d8-70da-4f6d-b92f-801a563952ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19465e3367078df139314e3b29a1b05d15c7ab22cb681c92e2a0394aaaaf887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b8a525d8b7ec3e08d688a4f5419e937a01e5dfa1de58caa9e3fad5ee5ed593f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8906a42b46d23c122035098bfd88203a6418fe2e0ef806e7babbc9670e2c89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d17b0d82be9febaeb884dea2cfb61c5f189c0fce2aff03c02bbf020d89828f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:49Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.713169 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://770b320ad49f63618e01bc73df4df10cb694b01d658727bb395ff59e6a609442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://958c52c695933700cd3b19f8c6539c5566827f57a22ed1fea9b6326e2261f673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:49Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.730750 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76f34f31-285e-4f90-954d-888a59ad6080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4996d81a0257313b571696eae1c0c7a590b2282472852505b7f60ab07ae4e7fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d905d
da0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d905dda0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8gf4x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:49Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.781009 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.781083 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.781102 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.781126 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.781145 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:49Z","lastTransitionTime":"2025-10-09T07:46:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.883889 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.883945 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.883961 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.883986 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.884005 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:49Z","lastTransitionTime":"2025-10-09T07:46:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.987744 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.987791 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.987801 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.987820 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:49 crc kubenswrapper[4715]: I1009 07:46:49.987831 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:49Z","lastTransitionTime":"2025-10-09T07:46:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.091026 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.091086 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.091104 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.091132 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.091150 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:50Z","lastTransitionTime":"2025-10-09T07:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.155515 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5tfxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a186a549-1c86-4777-97e8-04df48fad842\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1312ab6651462ae52831c89894987a598b1623159dddca34a4848dfbc86191ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdktp\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5tfxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:50Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.170078 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ksbvn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2978fac0aaadeb9ab4b6ecfc9249a28d011c2f6fe50e3528e008e08df338f304\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97crn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3111e48e9ab42467dbae06523e433e0f52ace4f6552d43674fa52010d57b409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97crn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ksbvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:50Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.191777 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f4f451-5ba1-439c-9987-d2d8d37129e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab9492d73e1ced7e8b9dcfbf64ede97fb7c53def5e290efe2320d37d5f8a3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94dc3b7cc39c67b95708f5a4b7d2bcf103c565c5c868684fa838816e882c720\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://86bd2df729ce7029714c942828cff7e13c738eb5d918fc7dfdefe16e5420fc98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f2c6cc41c3fcb7aa04475aef503dfa481735d7d591632251226133ffa9cfec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T07:46:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 07:46:26.195650 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 07:46:26.195886 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 07:46:26.197650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1771579011/tls.crt::/tmp/serving-cert-1771579011/tls.key\\\\\\\"\\\\nI1009 07:46:26.707018 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 07:46:26.710937 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 07:46:26.710964 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 07:46:26.710986 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 07:46:26.710992 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 07:46:26.721297 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 07:46:26.721350 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721363 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721375 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 07:46:26.721386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1009 07:46:26.721377 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 07:46:26.721396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 07:46:26.721462 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 07:46:26.723740 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14232d9805b9847774597840c84b29709285393122781fe95af059e50c285ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:50Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.193453 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.193520 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.193537 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.193559 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.193572 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:50Z","lastTransitionTime":"2025-10-09T07:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.213446 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1166d9eb763c499c126069c02d693a608549e5cbb8d4862551b7555100324b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:50Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.230243 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:50Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.246348 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:50Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.261730 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafd807-8875-4b4f-aba9-4f807ca336e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1277c6a868bcd62e2cfc7dda77ccba4f206f4216eec40ceb53ed8c09aebd5eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eab9be18db2c21136a797167f3282bba0639147
e04085d9c930fe113cd5bc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7vwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:50Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.290745 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8095fd96-32bb-459e-b524-6cf679b95b21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc320b6b98a82e720d488ce9958599e2f732919ac43ccb3834e5dd90042077ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7424a86e3801e7aea51cf175c8cbb65ae15a4df07426022cf9e4ba6b82c13924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://149ab2506eb7fd28879c9734c5189259cde574afb0a4f7708b0b84c5a514c996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a96e0c2dc207504189aac5f2822e4fc8fdc58a19388a3d081553ecec07f03bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0bc91552a8f6c9f83684aa851ef1b07fa4562c736427c3264762f4486b65c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:50Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.297997 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.298053 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.298068 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.298089 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.298103 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:50Z","lastTransitionTime":"2025-10-09T07:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.307264 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://770b320ad49f63618e01bc73df4df10cb694b01d658727bb395ff59e6a609442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://958c52c695933700cd3b19f8c6539c5566827f57a22ed1fea9b6326e2261f673\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:50Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.323707 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76f34f31-285e-4f90-954d-888a59ad6080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4996d81a0257313b571696eae1c0c7a590b2282472852505b7f60ab07ae4e7fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d905d
da0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d905dda0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8gf4x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:50Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.343672 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6cb14a-7329-4a80-aff2-acd9142558d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad12f64b1c0fdde9a522e1865b3e364da8fd7260057d3d3077d60cb82b9c258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7898580fa0e3f40e26a6aa8da1a4997577bb4e2e5627df3689ddfd90b720890c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"message\\\":\\\"946 5943 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1009 07:46:41.002971 5943 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1009 07:46:41.002987 5943 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1009 07:46:41.003185 5943 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1009 07:46:41.006848 5943 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1009 07:46:41.006886 5943 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1009 07:46:41.006892 5943 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1009 07:46:41.006915 5943 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1009 07:46:41.006920 5943 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1009 07:46:41.006924 5943 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1009 07:46:41.006955 5943 handler.go:208] Removed *v1.Node event handler 7\\\\nI1009 07:46:41.006959 5943 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1009 07:46:41.006970 5943 factory.go:656] Stopping watch factory\\\\nI1009 07:46:41.006972 5943 handler.go:208] Removed *v1.Node event handler 2\\\\nI1009 07:46:41.006985 5943 ovnkube.go:599] Stopped ovnkube\\\\nI1009 07\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad12f64b1c0fdde9a522e1865b3e364da8fd7260057d3d3077d60cb82b9c258\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T07:46:42Z\\\",\\\"message\\\":\\\"itches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1009 07:46:42.692267 6122 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1009 07:46:42.692108 6122 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1009 07:46:42.692439 6122 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}\\\\nI1009 07:46:42.692454 6122 services_controller.go:360] Finished syncing service api on namespace openshift-apiserver for network=default : 1.506102ms\\\\nI1009 07:46:42.692466 6122 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI1009 07:46:42.692467 6122 services_controller.go:356] Processing sync for service openshift-ingress/router-internal-default for network=default\\\\nF1009 07:46:42.692492 6122 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host
-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z9ztn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:50Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.356198 4715 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a1a53d8-70da-4f6d-b92f-801a563952ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19465e3367078df139314e3b29a1b05d15c7ab22cb681c92e2a0394aaaaf887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b8a525d8b7ec3e08d688a4f5419e937a01e5dfa1de58caa9e3fad5ee5ed593f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8906a42b46d23c122035098bfd88203a6418fe2e0ef806e7babbc9670e2c89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d17b0d82be9febaeb884dea2cfb61c5f189c0fce2aff03c02bbf020d89828f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:50Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.369295 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6vp75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz46q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6vp75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:50Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.378274 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqt86" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c54c0f2-0671-4f29-a4b8-7ea32758200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a835e316a2f8a0cc8bf44d5edd66b376fd20a6f7bf6a467a611e04e5fcc9993f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkfzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqt86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:50Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.386980 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fm6s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8fb3b8-b254-4bc3-b105-990eac79c77b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbsl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbsl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fm6s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:50Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:50 crc 
kubenswrapper[4715]: I1009 07:46:50.398108 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:50Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.400093 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.400146 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.400158 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.400175 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.400188 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:50Z","lastTransitionTime":"2025-10-09T07:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.408879 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8875bf33dca9b2d1d7bf66aaeb2fa239b455ea46d1e6790a9f6e1c5c2da2ec6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:50Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.503281 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.503344 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.503361 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.503389 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.503406 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:50Z","lastTransitionTime":"2025-10-09T07:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.607093 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.607148 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.607161 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.607183 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.607201 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:50Z","lastTransitionTime":"2025-10-09T07:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.710465 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.710546 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.710569 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.710601 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.710627 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:50Z","lastTransitionTime":"2025-10-09T07:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.814152 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.814209 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.814228 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.814251 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.814275 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:50Z","lastTransitionTime":"2025-10-09T07:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.917766 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.917833 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.917856 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.917886 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:50 crc kubenswrapper[4715]: I1009 07:46:50.917906 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:50Z","lastTransitionTime":"2025-10-09T07:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.020811 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.020889 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.020914 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.020940 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.020958 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:51Z","lastTransitionTime":"2025-10-09T07:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.124172 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.124218 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.124229 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.124247 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.124257 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:51Z","lastTransitionTime":"2025-10-09T07:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.136624 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.136697 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.136653 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:46:51 crc kubenswrapper[4715]: E1009 07:46:51.136806 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:46:51 crc kubenswrapper[4715]: E1009 07:46:51.136939 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.136966 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:46:51 crc kubenswrapper[4715]: E1009 07:46:51.137055 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:46:51 crc kubenswrapper[4715]: E1009 07:46:51.137127 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.227125 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.227167 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.227175 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.227192 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.227203 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:51Z","lastTransitionTime":"2025-10-09T07:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.330185 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.330265 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.330285 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.330335 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.330360 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:51Z","lastTransitionTime":"2025-10-09T07:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.433682 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.433757 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.433779 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.433805 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.433823 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:51Z","lastTransitionTime":"2025-10-09T07:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.536503 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.536553 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.536564 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.536580 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.536590 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:51Z","lastTransitionTime":"2025-10-09T07:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.639911 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.639996 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.640021 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.640059 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.640086 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:51Z","lastTransitionTime":"2025-10-09T07:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.742842 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.742907 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.742924 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.742949 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.742967 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:51Z","lastTransitionTime":"2025-10-09T07:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.790796 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.791784 4715 scope.go:117] "RemoveContainer" containerID="6ad12f64b1c0fdde9a522e1865b3e364da8fd7260057d3d3077d60cb82b9c258" Oct 09 07:46:51 crc kubenswrapper[4715]: E1009 07:46:51.791977 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-z9ztn_openshift-ovn-kubernetes(1d6cb14a-7329-4a80-aff2-acd9142558d3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.812210 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5tfxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a186a549-1c86-4777-97e8-04df48fad842\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1312ab6651462ae52831c89894987a598b1623159dddca34a4848dfbc86191ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdktp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5tfxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:51Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.829589 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ksbvn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2978fac0aaadeb9ab4b6ecfc9249a28d011c2f6fe50e3528e008e08df338f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97crn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3111e48e9ab42467dbae06523e433e0f52ace4f6552d43674fa52010d57b409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97crn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ksbvn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:51Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.846353 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.846438 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.846453 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.846485 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.846500 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:51Z","lastTransitionTime":"2025-10-09T07:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.848880 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f4f451-5ba1-439c-9987-d2d8d37129e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab9492d73e1ced7e8b9dcfbf64ede97fb7c53def5e290efe2320d37d5f8a3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94dc3b7cc39c67b95708f5a4b7d2bcf103c565c5c868684fa838816e882c720\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://86bd2df729ce7029714c942828cff7e13c738eb5d918fc7dfdefe16e5420fc98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f2c6cc41c3fcb7aa04475aef503dfa481735d7d591632251226133ffa9cfec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T07:46:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 07:46:26.195650 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 07:46:26.195886 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 07:46:26.197650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1771579011/tls.crt::/tmp/serving-cert-1771579011/tls.key\\\\\\\"\\\\nI1009 07:46:26.707018 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 07:46:26.710937 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 07:46:26.710964 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 07:46:26.710986 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 07:46:26.710992 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 07:46:26.721297 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 07:46:26.721350 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721363 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721375 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 07:46:26.721386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1009 07:46:26.721377 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 07:46:26.721396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 07:46:26.721462 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 07:46:26.723740 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14232d9805b9847774597840c84b29709285393122781fe95af059e50c285ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:51Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.864608 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1166d9eb763c499c126069c02d693a608549e5cbb8d4862551b7555100324b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:51Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.879467 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:51Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.894826 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:51Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.912381 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafd807-8875-4b4f-aba9-4f807ca336e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1277c6a868bcd62e2cfc7dda77ccba4f206f4216eec40ceb53ed8c09aebd5eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eab9be18db2c21136a797167f3282bba0639147
e04085d9c930fe113cd5bc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7vwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:51Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.943619 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8095fd96-32bb-459e-b524-6cf679b95b21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc320b6b98a82e720d488ce9958599e2f732919ac43ccb3834e5dd90042077ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7424a86e3801e7aea51cf175c8cbb65ae15a4df07426022cf9e4ba6b82c13924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://149ab2506eb7fd28879c9734c5189259cde574afb0a4f7708b0b84c5a514c996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a96e0c2dc207504189aac5f2822e4fc8fdc58a19388a3d081553ecec07f03bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0bc91552a8f6c9f83684aa851ef1b07fa4562c736427c3264762f4486b65c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:51Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.949273 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.949326 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.949339 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.949358 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.949373 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:51Z","lastTransitionTime":"2025-10-09T07:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.966686 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://770b320ad49f63618e01bc73df4df10cb694b01d658727bb395ff59e6a609442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://958c52c695933700cd3b19f8c6539c5566827f57a22ed1fea9b6326e2261f673\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:51Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:51 crc kubenswrapper[4715]: I1009 07:46:51.983000 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76f34f31-285e-4f90-954d-888a59ad6080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4996d81a0257313b571696eae1c0c7a590b2282472852505b7f60ab07ae4e7fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d905d
da0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d905dda0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8gf4x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:51Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.017914 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6cb14a-7329-4a80-aff2-acd9142558d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad12f64b1c0fdde9a522e1865b3e364da8fd7260057d3d3077d60cb82b9c258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad12f64b1c0fdde9a522e1865b3e364da8fd7260057d3d3077d60cb82b9c258\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T07:46:42Z\\\",\\\"message\\\":\\\"itches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1009 07:46:42.692267 6122 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1009 07:46:42.692108 6122 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in 
node crc\\\\nI1009 07:46:42.692439 6122 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}\\\\nI1009 07:46:42.692454 6122 services_controller.go:360] Finished syncing service api on namespace openshift-apiserver for network=default : 1.506102ms\\\\nI1009 07:46:42.692466 6122 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI1009 07:46:42.692467 6122 services_controller.go:356] Processing sync for service openshift-ingress/router-internal-default for network=default\\\\nF1009 07:46:42.692492 6122 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z9ztn_openshift-ovn-kubernetes(1d6cb14a-7329-4a80-aff2-acd9142558d3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0ca
cf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z9ztn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:52Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.036850 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a1a53d8-70da-4f6d-b92f-801a563952ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19465e3367078df139314e3b29a1b05d15c7ab22cb681c92e2a0394aaaaf887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b8a525d8b7ec3e08d688a4f5419e937a01e5dfa1de58caa9e3fad5ee5ed593f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8906a42b46d23c122035098bfd88203a6418fe2e0ef806e7babbc9670e2c89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d17b0d82be9febaeb884dea2cfb61c5f189c0fce2aff03c02bbf020d89828f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:52Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.051738 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.051823 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.051850 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.051898 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.051924 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:52Z","lastTransitionTime":"2025-10-09T07:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.053921 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6vp75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz46q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6vp75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T07:46:52Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.067471 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqt86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c54c0f2-0671-4f29-a4b8-7ea32758200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a835e316a2f8a0cc8bf44d5edd66b376fd20a6f7bf6a467a611e04e5fcc9993f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkfzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqt86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:52Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.082805 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fm6s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8fb3b8-b254-4bc3-b105-990eac79c77b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbsl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbsl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fm6s2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:52Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.100161 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:52Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.113927 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8875bf33dca9b2d1d7bf66aaeb2fa239b455ea46d1e6790a9f6e1c5c2da2ec6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T07:46:52Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.154554 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.154642 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.154698 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.154721 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.154740 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:52Z","lastTransitionTime":"2025-10-09T07:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.257767 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.257869 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.257887 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.257912 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.257931 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:52Z","lastTransitionTime":"2025-10-09T07:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.360713 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.360771 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.360791 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.360816 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.360833 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:52Z","lastTransitionTime":"2025-10-09T07:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.464288 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.464353 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.464371 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.464394 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.464412 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:52Z","lastTransitionTime":"2025-10-09T07:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.567480 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.567535 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.567548 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.567567 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.567594 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:52Z","lastTransitionTime":"2025-10-09T07:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.670913 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.671003 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.671035 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.671069 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.671091 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:52Z","lastTransitionTime":"2025-10-09T07:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.774929 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.775028 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.775042 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.775067 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.775082 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:52Z","lastTransitionTime":"2025-10-09T07:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.877749 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.878069 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.878158 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.878271 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.878354 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:52Z","lastTransitionTime":"2025-10-09T07:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.981233 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.981280 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.981291 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.981312 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:52 crc kubenswrapper[4715]: I1009 07:46:52.981325 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:52Z","lastTransitionTime":"2025-10-09T07:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.084581 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.084628 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.084639 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.084658 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.084669 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:53Z","lastTransitionTime":"2025-10-09T07:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.136602 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.136666 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.136671 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.136622 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:46:53 crc kubenswrapper[4715]: E1009 07:46:53.136806 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:46:53 crc kubenswrapper[4715]: E1009 07:46:53.136885 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:46:53 crc kubenswrapper[4715]: E1009 07:46:53.136989 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:46:53 crc kubenswrapper[4715]: E1009 07:46:53.137064 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.187367 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.187462 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.187475 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.187496 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.187518 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:53Z","lastTransitionTime":"2025-10-09T07:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.293670 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.293740 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.293777 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.293838 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.293864 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:53Z","lastTransitionTime":"2025-10-09T07:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.303821 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a8fb3b8-b254-4bc3-b105-990eac79c77b-metrics-certs\") pod \"network-metrics-daemon-fm6s2\" (UID: \"9a8fb3b8-b254-4bc3-b105-990eac79c77b\") " pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:46:53 crc kubenswrapper[4715]: E1009 07:46:53.304017 4715 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 07:46:53 crc kubenswrapper[4715]: E1009 07:46:53.304113 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a8fb3b8-b254-4bc3-b105-990eac79c77b-metrics-certs podName:9a8fb3b8-b254-4bc3-b105-990eac79c77b nodeName:}" failed. No retries permitted until 2025-10-09 07:47:01.304090167 +0000 UTC m=+51.996894205 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9a8fb3b8-b254-4bc3-b105-990eac79c77b-metrics-certs") pod "network-metrics-daemon-fm6s2" (UID: "9a8fb3b8-b254-4bc3-b105-990eac79c77b") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.398821 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.398889 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.398907 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.398933 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.398951 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:53Z","lastTransitionTime":"2025-10-09T07:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.502610 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.503081 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.503241 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.503475 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.503672 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:53Z","lastTransitionTime":"2025-10-09T07:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.606983 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.607047 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.607065 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.607090 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.607108 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:53Z","lastTransitionTime":"2025-10-09T07:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.709942 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.710002 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.710020 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.710044 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.710061 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:53Z","lastTransitionTime":"2025-10-09T07:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.812896 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.812972 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.812998 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.813026 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.813046 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:53Z","lastTransitionTime":"2025-10-09T07:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.916185 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.916249 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.916267 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.916294 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:53 crc kubenswrapper[4715]: I1009 07:46:53.916313 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:53Z","lastTransitionTime":"2025-10-09T07:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.020075 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.020139 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.020157 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.020185 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.020203 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:54Z","lastTransitionTime":"2025-10-09T07:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.123764 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.123990 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.124009 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.124035 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.124056 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:54Z","lastTransitionTime":"2025-10-09T07:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.226955 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.227000 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.227009 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.227025 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.227035 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:54Z","lastTransitionTime":"2025-10-09T07:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.329158 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.329186 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.329195 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.329209 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.329217 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:54Z","lastTransitionTime":"2025-10-09T07:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.432920 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.432990 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.433014 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.433047 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.433074 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:54Z","lastTransitionTime":"2025-10-09T07:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.536262 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.536346 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.536366 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.536389 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.536408 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:54Z","lastTransitionTime":"2025-10-09T07:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.639137 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.639227 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.639286 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.639319 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.639347 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:54Z","lastTransitionTime":"2025-10-09T07:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.742767 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.742826 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.742843 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.742866 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.742884 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:54Z","lastTransitionTime":"2025-10-09T07:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.845256 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.845334 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.845354 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.845382 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.845399 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:54Z","lastTransitionTime":"2025-10-09T07:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.949795 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.949872 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.949895 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.949925 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:54 crc kubenswrapper[4715]: I1009 07:46:54.949949 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:54Z","lastTransitionTime":"2025-10-09T07:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.053624 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.053697 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.053719 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.053752 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.053774 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:55Z","lastTransitionTime":"2025-10-09T07:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.136283 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.136322 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.136335 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.136317 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:46:55 crc kubenswrapper[4715]: E1009 07:46:55.136535 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:46:55 crc kubenswrapper[4715]: E1009 07:46:55.136711 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:46:55 crc kubenswrapper[4715]: E1009 07:46:55.136848 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:46:55 crc kubenswrapper[4715]: E1009 07:46:55.136971 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.156954 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.157008 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.157023 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.157042 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.157058 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:55Z","lastTransitionTime":"2025-10-09T07:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.260322 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.260416 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.260497 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.260524 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.260542 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:55Z","lastTransitionTime":"2025-10-09T07:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.363395 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.363836 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.363878 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.363907 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.363926 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:55Z","lastTransitionTime":"2025-10-09T07:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.467236 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.467292 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.467308 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.467327 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.467346 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:55Z","lastTransitionTime":"2025-10-09T07:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.570846 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.570907 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.570923 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.570951 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.570969 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:55Z","lastTransitionTime":"2025-10-09T07:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.674233 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.674332 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.674370 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.674404 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.674464 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:55Z","lastTransitionTime":"2025-10-09T07:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.777683 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.777743 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.777760 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.777789 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.777807 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:55Z","lastTransitionTime":"2025-10-09T07:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.885087 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.885159 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.885178 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.885210 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.885227 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:55Z","lastTransitionTime":"2025-10-09T07:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.989320 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.989465 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.989493 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.989529 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:55 crc kubenswrapper[4715]: I1009 07:46:55.989557 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:55Z","lastTransitionTime":"2025-10-09T07:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.092461 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.092547 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.092572 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.092603 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.092622 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:56Z","lastTransitionTime":"2025-10-09T07:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.195896 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.195981 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.196003 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.196028 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.196046 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:56Z","lastTransitionTime":"2025-10-09T07:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.299305 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.299356 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.299369 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.299388 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.299406 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:56Z","lastTransitionTime":"2025-10-09T07:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.402633 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.402713 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.402736 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.402769 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.402790 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:56Z","lastTransitionTime":"2025-10-09T07:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.505803 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.505847 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.505859 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.505877 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.505889 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:56Z","lastTransitionTime":"2025-10-09T07:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.609199 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.609257 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.609277 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.609302 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.609319 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:56Z","lastTransitionTime":"2025-10-09T07:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.713326 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.713379 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.713398 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.713459 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.713478 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:56Z","lastTransitionTime":"2025-10-09T07:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.778163 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.778227 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.778249 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.778283 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.778305 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:56Z","lastTransitionTime":"2025-10-09T07:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:56 crc kubenswrapper[4715]: E1009 07:46:56.797672 4715 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"88c6bc2d-8227-4dff-bf57-494ec73b39f9\\\",\\\"systemUUID\\\":\\\"25873b5a-8b59-46be-9c14-6241a2c78490\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:56Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.802993 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.803042 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.803056 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.803080 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.803096 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:56Z","lastTransitionTime":"2025-10-09T07:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:56 crc kubenswrapper[4715]: E1009 07:46:56.824171 4715 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"88c6bc2d-8227-4dff-bf57-494ec73b39f9\\\",\\\"systemUUID\\\":\\\"25873b5a-8b59-46be-9c14-6241a2c78490\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:56Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.828694 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.828729 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.828740 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.828768 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.828782 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:56Z","lastTransitionTime":"2025-10-09T07:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:56 crc kubenswrapper[4715]: E1009 07:46:56.843392 4715 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"88c6bc2d-8227-4dff-bf57-494ec73b39f9\\\",\\\"systemUUID\\\":\\\"25873b5a-8b59-46be-9c14-6241a2c78490\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:56Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.849088 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.849140 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.849155 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.849175 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.849188 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:56Z","lastTransitionTime":"2025-10-09T07:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:56 crc kubenswrapper[4715]: E1009 07:46:56.869554 4715 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"88c6bc2d-8227-4dff-bf57-494ec73b39f9\\\",\\\"systemUUID\\\":\\\"25873b5a-8b59-46be-9c14-6241a2c78490\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:56Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.874551 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.874628 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.874637 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.874661 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.874678 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:56Z","lastTransitionTime":"2025-10-09T07:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:56 crc kubenswrapper[4715]: E1009 07:46:56.891782 4715 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:46:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"88c6bc2d-8227-4dff-bf57-494ec73b39f9\\\",\\\"systemUUID\\\":\\\"25873b5a-8b59-46be-9c14-6241a2c78490\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:56Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:56 crc kubenswrapper[4715]: E1009 07:46:56.891918 4715 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.894234 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.894267 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.894277 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.894296 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.894307 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:56Z","lastTransitionTime":"2025-10-09T07:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.997922 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.997981 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.997996 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.998017 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:56 crc kubenswrapper[4715]: I1009 07:46:56.998032 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:56Z","lastTransitionTime":"2025-10-09T07:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.101102 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.101160 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.101173 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.101201 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.101219 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:57Z","lastTransitionTime":"2025-10-09T07:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.136603 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.136711 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.136736 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.136711 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:46:57 crc kubenswrapper[4715]: E1009 07:46:57.136836 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:46:57 crc kubenswrapper[4715]: E1009 07:46:57.137037 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:46:57 crc kubenswrapper[4715]: E1009 07:46:57.137092 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:46:57 crc kubenswrapper[4715]: E1009 07:46:57.137133 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.204210 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.204267 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.204280 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.204300 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.204312 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:57Z","lastTransitionTime":"2025-10-09T07:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.308033 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.308115 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.308139 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.308170 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.308199 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:57Z","lastTransitionTime":"2025-10-09T07:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.411061 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.411119 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.411131 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.411151 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.411165 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:57Z","lastTransitionTime":"2025-10-09T07:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.514758 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.514809 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.514820 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.514840 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.514853 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:57Z","lastTransitionTime":"2025-10-09T07:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.616714 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.616774 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.616795 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.616824 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.616845 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:57Z","lastTransitionTime":"2025-10-09T07:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.719733 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.719789 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.719799 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.719818 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.719833 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:57Z","lastTransitionTime":"2025-10-09T07:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.823852 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.823906 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.823916 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.823935 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.823948 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:57Z","lastTransitionTime":"2025-10-09T07:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.927694 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.927747 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.927759 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.927779 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:57 crc kubenswrapper[4715]: I1009 07:46:57.927794 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:57Z","lastTransitionTime":"2025-10-09T07:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.031144 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.031212 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.031230 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.031256 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.031275 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:58Z","lastTransitionTime":"2025-10-09T07:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.134372 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.134477 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.134491 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.134535 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.134550 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:58Z","lastTransitionTime":"2025-10-09T07:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.237693 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.237781 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.237794 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.237813 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.237828 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:58Z","lastTransitionTime":"2025-10-09T07:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.340235 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.340298 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.340307 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.340323 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.340333 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:58Z","lastTransitionTime":"2025-10-09T07:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.443643 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.443717 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.443737 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.443763 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.443785 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:58Z","lastTransitionTime":"2025-10-09T07:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.545670 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.545732 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.545744 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.545764 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.545777 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:58Z","lastTransitionTime":"2025-10-09T07:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.648590 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.648655 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.648671 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.648696 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.648713 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:58Z","lastTransitionTime":"2025-10-09T07:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.752161 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.752227 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.752241 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.752262 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.752277 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:58Z","lastTransitionTime":"2025-10-09T07:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.855613 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.855672 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.855687 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.855710 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.855724 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:58Z","lastTransitionTime":"2025-10-09T07:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.958991 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.959073 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.959089 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.959114 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:58 crc kubenswrapper[4715]: I1009 07:46:58.959129 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:58Z","lastTransitionTime":"2025-10-09T07:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.062893 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.062955 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.062978 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.063007 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.063027 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:59Z","lastTransitionTime":"2025-10-09T07:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.136330 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.136410 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.136528 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.136656 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:46:59 crc kubenswrapper[4715]: E1009 07:46:59.136868 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:46:59 crc kubenswrapper[4715]: E1009 07:46:59.137040 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:46:59 crc kubenswrapper[4715]: E1009 07:46:59.137237 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:46:59 crc kubenswrapper[4715]: E1009 07:46:59.137309 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.166533 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.166634 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.166663 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.166704 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.166731 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:59Z","lastTransitionTime":"2025-10-09T07:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.270045 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.270208 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.270234 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.270271 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.270308 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:59Z","lastTransitionTime":"2025-10-09T07:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.372795 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.372830 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.372839 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.372855 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.372867 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:59Z","lastTransitionTime":"2025-10-09T07:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.476124 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.476179 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.476189 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.476208 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.476222 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:59Z","lastTransitionTime":"2025-10-09T07:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.556907 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.569651 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.578146 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:59Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.578726 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.578777 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.578803 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.578830 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.578850 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:59Z","lastTransitionTime":"2025-10-09T07:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.598986 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8875bf33dca9b2d1d7bf66aaeb2fa239b455ea46d1e6790a9f6e1c5c2da2ec6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:59Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.620926 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6vp75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz46q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6vp75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:59Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.635859 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqt86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c54c0f2-0671-4f29-a4b8-7ea32758200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a835e316a2f8a0cc8bf44d5edd66b376fd20a6f7bf6a467a611e04e5fcc9993f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\
\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkfzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqt86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:59Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.652746 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fm6s2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8fb3b8-b254-4bc3-b105-990eac79c77b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbsl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbsl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fm6s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:59Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:59 crc 
kubenswrapper[4715]: I1009 07:46:59.671662 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5tfxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a186a549-1c86-4777-97e8-04df48fad842\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1312ab6651462ae52831c89894987a598b1623159dddca34a4848dfbc86191ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdktp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5tfxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:59Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.681480 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.681535 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.681547 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.681567 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.681579 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:59Z","lastTransitionTime":"2025-10-09T07:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.684896 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ksbvn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2978fac0aaadeb9ab4b6ecfc9249a28d011c2f6fe50e3528e008e08df338f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97crn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3111e48e9ab42467dbae06523e433e0f52ace4f6552d43674fa52010d57b409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97crn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ksbvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:59Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.720329 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8095fd96-32bb-459e-b524-6cf679b95b21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc320b6b98a82e720d488ce9958599e2f732919ac43ccb3834e5dd90042077ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7424a86e3801e7aea51cf175c8cbb65ae15a4df07426022cf9e4ba6b82c13924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://149ab2506eb7fd28879c9734c5189259cde574afb0a4f7708b0b84c5a514c996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a96e0c2dc207504189aac5f2822e4fc8fdc58a19388a3d081553ecec07f03bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0bc91552a8f6c9f83684aa851ef1b07fa4562c736427c3264762f4486b65c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:59Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.744497 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f4f451-5ba1-439c-9987-d2d8d37129e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab9492d73e1ced7e8b9dcfbf64ede97fb7c53def5e290efe2320d37d5f8a3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94dc3b7cc39c67b95708f5a4b7d2bcf103c565c5c868684fa838816e882c720\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://86bd2df729ce7029714c942828cff7e13c738eb5d918fc7dfdefe16e5420fc98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f2c6cc41c3fcb7aa04475aef503dfa481735d7d591632251226133ffa9cfec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T07:46:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 07:46:26.195650 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 07:46:26.195886 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 07:46:26.197650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1771579011/tls.crt::/tmp/serving-cert-1771579011/tls.key\\\\\\\"\\\\nI1009 07:46:26.707018 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 07:46:26.710937 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 07:46:26.710964 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 07:46:26.710986 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 07:46:26.710992 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 07:46:26.721297 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 07:46:26.721350 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721363 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721375 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 07:46:26.721386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1009 07:46:26.721377 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 07:46:26.721396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 07:46:26.721462 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 07:46:26.723740 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14232d9805b9847774597840c84b29709285393122781fe95af059e50c285ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:59Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.763134 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1166d9eb763c499c126069c02d693a608549e5cbb8d4862551b7555100324b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:59Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.782783 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:59Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.784494 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.784550 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.784567 4715 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.784590 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.784610 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:59Z","lastTransitionTime":"2025-10-09T07:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.803882 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:59Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.820181 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafd807-8875-4b4f-aba9-4f807ca336e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1277c6a868bcd62e2cfc7dda77ccba4f206f4216eec40ceb53ed8c09aebd5eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eab9be18db2c21136a797167f3282bba0639147
e04085d9c930fe113cd5bc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7vwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:59Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.837851 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a1a53d8-70da-4f6d-b92f-801a563952ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19465e3367078df139314e3b29a1b05d15c7ab22cb681c92e2a0394aaaaf887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b8a525d8b7ec3e08d688a4f5419e937a01e5dfa1de58caa9e3fad5ee5ed593f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8906a42b46d23c122035098bfd88203a6418fe2e0ef806e7babbc9670e2c89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d17b0d82be9febaeb884dea2cfb61c5f189c0fce2aff03c02bbf020d89828f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:59Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.853963 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://770b320ad49f63618e01bc73df4df10cb694b01d658727bb395ff59e6a609442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://958c52c695933700cd3b19f8c6539c5566827f57a22ed1fea9b6326e2261f673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:59Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.877710 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76f34f31-285e-4f90-954d-888a59ad6080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4996d81a0257313b571696eae1c0c7a590b2282472852505b7f60ab07ae4e7fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d905d
da0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d905dda0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8gf4x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:59Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.887776 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.887854 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.887878 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.887913 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.887938 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:59Z","lastTransitionTime":"2025-10-09T07:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.911902 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6cb14a-7329-4a80-aff2-acd9142558d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad12f64b1c0fdde9a522e1865b3e364da8fd7260057d3d3077d60cb82b9c258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad12f64b1c0fdde9a522e1865b3e364da8fd7260057d3d3077d60cb82b9c258\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T07:46:42Z\\\",\\\"message\\\":\\\"itches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1009 07:46:42.692267 6122 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1009 07:46:42.692108 6122 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in 
node crc\\\\nI1009 07:46:42.692439 6122 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}\\\\nI1009 07:46:42.692454 6122 services_controller.go:360] Finished syncing service api on namespace openshift-apiserver for network=default : 1.506102ms\\\\nI1009 07:46:42.692466 6122 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI1009 07:46:42.692467 6122 services_controller.go:356] Processing sync for service openshift-ingress/router-internal-default for network=default\\\\nF1009 07:46:42.692492 6122 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z9ztn_openshift-ovn-kubernetes(1d6cb14a-7329-4a80-aff2-acd9142558d3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0ca
cf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z9ztn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:46:59Z is after 2025-08-24T17:21:41Z" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.990844 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.990929 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.990947 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.990973 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:46:59 crc kubenswrapper[4715]: I1009 07:46:59.991009 4715 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:46:59Z","lastTransitionTime":"2025-10-09T07:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.094728 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.094791 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.094808 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.094833 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.094845 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:00Z","lastTransitionTime":"2025-10-09T07:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.157568 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:00Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.175920 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8875bf33dca9b2d1d7bf66aaeb2fa239b455ea46d1e6790a9f6e1c5c2da2ec6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T07:47:00Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.198257 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.198333 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.198355 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.198383 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.198401 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:00Z","lastTransitionTime":"2025-10-09T07:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.200266 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6vp75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz46q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6vp75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:00Z 
is after 2025-08-24T17:21:41Z" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.221710 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqt86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c54c0f2-0671-4f29-a4b8-7ea32758200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a835e316a2f8a0cc8bf44d5edd66b376fd20a6f7bf6a467a611e04e5fcc9993f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkfzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqt86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:00Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.237860 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fm6s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8fb3b8-b254-4bc3-b105-990eac79c77b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbsl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbsl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fm6s2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:00Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.255737 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5tfxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a186a549-1c86-4777-97e8-04df48fad842\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1312ab6651462ae52831c89894987a598b1623159dddca34a4848dfbc86191ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T
07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdktp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5tfxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:00Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.272697 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ksbvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2978fac0aaadeb9ab4b6ecfc9249a28d011c2f6fe50e3528e008e08df338f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97crn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3111e48e9ab42467dbae06523e433e0f52ac
e4f6552d43674fa52010d57b409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97crn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ksbvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:00Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.300991 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.301054 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.301067 4715 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.301087 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.301102 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:00Z","lastTransitionTime":"2025-10-09T07:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.306449 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8095fd96-32bb-459e-b524-6cf679b95b21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc320b6b98a82e720d488ce9958599e2f732919ac43ccb3834e5dd90042077ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e
33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7424a86e3801e7aea51cf175c8cbb65ae15a4df07426022cf9e4ba6b82c13924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://149ab2506eb7fd28879c9734c5189259cde574afb0a4f7708b0b84c5a514c996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866
be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a96e0c2dc207504189aac5f2822e4fc8fdc58a19388a3d081553ecec07f03bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0bc91552a8f6c9f83684aa851ef1b07fa4562c736427c3264762f4486b65c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/stati
c-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712
c1c9677765f9ac94ae97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:00Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.329606 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f4f451-5ba1-439c-9987-d2d8d37129e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab9492d73e1ced7e8b9dcfbf64ede97fb7c53def5e290efe2320d37d5f8a3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94dc3b7cc39c67b95708f5a4b7d2bcf103c565c5c868684fa838816e882c720\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86bd2df729ce7029714c942828cff7e13c738eb5d918fc7dfdefe16e5420fc98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f2c6cc41c3fcb7aa04475aef503dfa481735d7d591632251226133ffa9cfec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T07:46:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 07:46:26.195650 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 07:46:26.195886 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 07:46:26.197650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1771579011/tls.crt::/tmp/serving-cert-1771579011/tls.key\\\\\\\"\\\\nI1009 07:46:26.707018 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 07:46:26.710937 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 07:46:26.710964 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 07:46:26.710986 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 07:46:26.710992 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 07:46:26.721297 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 07:46:26.721350 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721363 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721375 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 07:46:26.721386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1009 07:46:26.721377 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 07:46:26.721396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 07:46:26.721462 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 07:46:26.723740 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14232d9805b9847774597840c84b29709285393122781fe95af059e50c285ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:00Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.355395 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1166d9eb763c499c126069c02d693a608549e5cbb8d4862551b7555100324b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:00Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.377910 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:00Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.401688 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:00Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.404118 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.404164 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.404182 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.404205 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.404223 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:00Z","lastTransitionTime":"2025-10-09T07:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.421586 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafd807-8875-4b4f-aba9-4f807ca336e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1277c6a868bcd62e2cfc7dda77ccba4f206f4216eec40ceb53ed8c09aebd5eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eab9be18db2c21136a797167f3282bba0639147
e04085d9c930fe113cd5bc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7vwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:00Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.439206 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a1a53d8-70da-4f6d-b92f-801a563952ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19465e3367078df139314e3b29a1b05d15c7ab22cb681c92e2a0394aaaaf887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b8a525d8b7ec3e08d688a4f5419e937a01e5dfa1de58caa9e3fad5ee5ed593f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8906a42b46d23c122035098bfd88203a6418fe2e0ef806e7babbc9670e2c89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d17b0d82be9febaeb884dea2cfb61c5f189c0fce2aff03c02bbf020d89828f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:00Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.456944 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"601e8bbc-736f-4fd6-a5db-acf0c0680140\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7b63b0a66948efeeb8afe2b17b5e2461b54aa7fcbd7eea11181fd3e077f878e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dabdb403ae3d1cad8d766a205299375905e6851f89a3022ec1468ba6ad7f463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c29c93e1f45a5b0592ac77d5f064cff563130da8019669a013ad65026ca46474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cb5215e2ab354a950cbd77ed11f48001aee890b171fd4f3ee9823f5fa4dcf37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://4cb5215e2ab354a950cbd77ed11f48001aee890b171fd4f3ee9823f5fa4dcf37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:00Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.471804 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://770b320ad49f63618e01bc73df4df10cb694b01d658727bb395ff59e6a609442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://958c52c695933700cd3b19f8c6539c5566827f57a22ed1fea9b6326e2261f673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:00Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.489907 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76f34f31-285e-4f90-954d-888a59ad6080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4996d81a0257313b571696eae1c0c7a590b2282472852505b7f60ab07ae4e7fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d905d
da0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d905dda0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8gf4x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:00Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.506844 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.506909 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.506926 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.506953 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.506973 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:00Z","lastTransitionTime":"2025-10-09T07:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.515544 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6cb14a-7329-4a80-aff2-acd9142558d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad12f64b1c0fdde9a522e1865b3e364da8fd7260057d3d3077d60cb82b9c258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad12f64b1c0fdde9a522e1865b3e364da8fd7260057d3d3077d60cb82b9c258\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T07:46:42Z\\\",\\\"message\\\":\\\"itches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1009 07:46:42.692267 6122 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1009 07:46:42.692108 6122 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in 
node crc\\\\nI1009 07:46:42.692439 6122 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}\\\\nI1009 07:46:42.692454 6122 services_controller.go:360] Finished syncing service api on namespace openshift-apiserver for network=default : 1.506102ms\\\\nI1009 07:46:42.692466 6122 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI1009 07:46:42.692467 6122 services_controller.go:356] Processing sync for service openshift-ingress/router-internal-default for network=default\\\\nF1009 07:46:42.692492 6122 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z9ztn_openshift-ovn-kubernetes(1d6cb14a-7329-4a80-aff2-acd9142558d3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0ca
cf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z9ztn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:00Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.610199 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.610285 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.610318 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.610349 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.610375 4715 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:00Z","lastTransitionTime":"2025-10-09T07:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.713513 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.713869 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.713976 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.714089 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.714210 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:00Z","lastTransitionTime":"2025-10-09T07:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.818583 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.818648 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.818661 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.818680 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.818694 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:00Z","lastTransitionTime":"2025-10-09T07:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.921888 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.921957 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.921980 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.922008 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:00 crc kubenswrapper[4715]: I1009 07:47:00.922032 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:00Z","lastTransitionTime":"2025-10-09T07:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.025094 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.025156 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.025173 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.025215 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.025234 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:01Z","lastTransitionTime":"2025-10-09T07:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.127912 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.127964 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.127974 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.127994 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.128007 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:01Z","lastTransitionTime":"2025-10-09T07:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.136376 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.136384 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.136384 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.136627 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:47:01 crc kubenswrapper[4715]: E1009 07:47:01.136797 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:47:01 crc kubenswrapper[4715]: E1009 07:47:01.136987 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:47:01 crc kubenswrapper[4715]: E1009 07:47:01.137154 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:47:01 crc kubenswrapper[4715]: E1009 07:47:01.137265 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.231055 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.231126 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.231145 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.231177 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.231198 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:01Z","lastTransitionTime":"2025-10-09T07:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.334734 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.334802 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.334821 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.334847 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.334870 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:01Z","lastTransitionTime":"2025-10-09T07:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.395947 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a8fb3b8-b254-4bc3-b105-990eac79c77b-metrics-certs\") pod \"network-metrics-daemon-fm6s2\" (UID: \"9a8fb3b8-b254-4bc3-b105-990eac79c77b\") " pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:47:01 crc kubenswrapper[4715]: E1009 07:47:01.396242 4715 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 07:47:01 crc kubenswrapper[4715]: E1009 07:47:01.396381 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a8fb3b8-b254-4bc3-b105-990eac79c77b-metrics-certs podName:9a8fb3b8-b254-4bc3-b105-990eac79c77b nodeName:}" failed. No retries permitted until 2025-10-09 07:47:17.396349703 +0000 UTC m=+68.089153751 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9a8fb3b8-b254-4bc3-b105-990eac79c77b-metrics-certs") pod "network-metrics-daemon-fm6s2" (UID: "9a8fb3b8-b254-4bc3-b105-990eac79c77b") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.438777 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.438858 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.438883 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.438912 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.438938 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:01Z","lastTransitionTime":"2025-10-09T07:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.541404 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.541495 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.541510 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.541534 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.541547 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:01Z","lastTransitionTime":"2025-10-09T07:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.645362 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.645502 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.645524 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.645549 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.645567 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:01Z","lastTransitionTime":"2025-10-09T07:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.748714 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.748838 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.748856 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.748881 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.748900 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:01Z","lastTransitionTime":"2025-10-09T07:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.851760 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.851827 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.851867 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.851899 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.851922 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:01Z","lastTransitionTime":"2025-10-09T07:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.955839 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.955903 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.955915 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.955939 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:01 crc kubenswrapper[4715]: I1009 07:47:01.955957 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:01Z","lastTransitionTime":"2025-10-09T07:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.059453 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.059545 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.059570 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.059603 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.059627 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:02Z","lastTransitionTime":"2025-10-09T07:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.163158 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.163241 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.163257 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.163277 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.163291 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:02Z","lastTransitionTime":"2025-10-09T07:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.266847 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.266895 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.266907 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.266926 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.266940 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:02Z","lastTransitionTime":"2025-10-09T07:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.369982 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.370038 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.370051 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.370076 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.370092 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:02Z","lastTransitionTime":"2025-10-09T07:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.473915 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.473980 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.473999 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.474024 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.474041 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:02Z","lastTransitionTime":"2025-10-09T07:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.576836 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.577393 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.577576 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.577745 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.577896 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:02Z","lastTransitionTime":"2025-10-09T07:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.681252 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.681320 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.681337 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.681364 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.681382 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:02Z","lastTransitionTime":"2025-10-09T07:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.785014 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.785068 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.785085 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.785112 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.785130 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:02Z","lastTransitionTime":"2025-10-09T07:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.814266 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:47:02 crc kubenswrapper[4715]: E1009 07:47:02.814581 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-09 07:47:34.814537486 +0000 UTC m=+85.507341534 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.814690 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.814742 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:47:02 crc kubenswrapper[4715]: E1009 07:47:02.814870 4715 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 07:47:02 crc kubenswrapper[4715]: E1009 07:47:02.814937 4715 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 07:47:02 crc kubenswrapper[4715]: E1009 07:47:02.814960 4715 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 07:47:34.814937667 +0000 UTC m=+85.507741695 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 07:47:02 crc kubenswrapper[4715]: E1009 07:47:02.815086 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 07:47:34.815060651 +0000 UTC m=+85.507864699 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.887929 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.888006 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.888025 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.888050 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.888067 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:02Z","lastTransitionTime":"2025-10-09T07:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.915672 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.915740 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:47:02 crc kubenswrapper[4715]: E1009 07:47:02.915877 4715 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 07:47:02 crc kubenswrapper[4715]: E1009 07:47:02.915899 4715 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 07:47:02 crc kubenswrapper[4715]: E1009 07:47:02.915916 4715 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 07:47:02 crc kubenswrapper[4715]: E1009 07:47:02.915980 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-09 07:47:34.915963246 +0000 UTC m=+85.608767264 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 07:47:02 crc kubenswrapper[4715]: E1009 07:47:02.915989 4715 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 07:47:02 crc kubenswrapper[4715]: E1009 07:47:02.916064 4715 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 07:47:02 crc kubenswrapper[4715]: E1009 07:47:02.916093 4715 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 07:47:02 crc kubenswrapper[4715]: E1009 07:47:02.916221 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-09 07:47:34.916181633 +0000 UTC m=+85.608985681 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.991158 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.991219 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.991233 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.991254 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:02 crc kubenswrapper[4715]: I1009 07:47:02.991273 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:02Z","lastTransitionTime":"2025-10-09T07:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.094438 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.094485 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.094495 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.094513 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.094525 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:03Z","lastTransitionTime":"2025-10-09T07:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.136254 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.136254 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.136281 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:47:03 crc kubenswrapper[4715]: E1009 07:47:03.136684 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:47:03 crc kubenswrapper[4715]: E1009 07:47:03.136740 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.136437 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:47:03 crc kubenswrapper[4715]: E1009 07:47:03.136820 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:47:03 crc kubenswrapper[4715]: E1009 07:47:03.136456 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.196908 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.196960 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.196973 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.196991 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.197005 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:03Z","lastTransitionTime":"2025-10-09T07:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.300627 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.300679 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.300692 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.300709 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.300723 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:03Z","lastTransitionTime":"2025-10-09T07:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.403659 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.403722 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.403742 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.403768 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.403785 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:03Z","lastTransitionTime":"2025-10-09T07:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.506797 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.506870 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.506887 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.506905 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.506917 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:03Z","lastTransitionTime":"2025-10-09T07:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.609433 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.609475 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.609486 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.609504 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.609518 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:03Z","lastTransitionTime":"2025-10-09T07:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.712973 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.713035 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.713054 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.713083 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.713102 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:03Z","lastTransitionTime":"2025-10-09T07:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.815106 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.815138 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.815147 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.815161 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.815171 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:03Z","lastTransitionTime":"2025-10-09T07:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.917687 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.917787 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.917807 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.917833 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:03 crc kubenswrapper[4715]: I1009 07:47:03.917853 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:03Z","lastTransitionTime":"2025-10-09T07:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.020237 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.020289 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.020301 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.020319 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.020331 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:04Z","lastTransitionTime":"2025-10-09T07:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.123616 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.123682 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.123699 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.123724 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.123752 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:04Z","lastTransitionTime":"2025-10-09T07:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.227345 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.227407 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.227478 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.227504 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.227530 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:04Z","lastTransitionTime":"2025-10-09T07:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.331094 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.331144 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.331157 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.331179 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.331192 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:04Z","lastTransitionTime":"2025-10-09T07:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.434230 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.434303 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.434328 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.434361 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.434386 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:04Z","lastTransitionTime":"2025-10-09T07:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.537854 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.537954 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.537974 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.538001 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.538018 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:04Z","lastTransitionTime":"2025-10-09T07:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.642038 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.642124 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.642150 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.642175 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.642192 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:04Z","lastTransitionTime":"2025-10-09T07:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.746627 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.746691 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.746710 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.746736 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.746755 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:04Z","lastTransitionTime":"2025-10-09T07:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.854216 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.854271 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.854280 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.854297 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.854308 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:04Z","lastTransitionTime":"2025-10-09T07:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.957328 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.957391 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.957405 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.957461 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:04 crc kubenswrapper[4715]: I1009 07:47:04.957479 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:04Z","lastTransitionTime":"2025-10-09T07:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.061042 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.061096 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.061110 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.061152 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.061169 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:05Z","lastTransitionTime":"2025-10-09T07:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.136026 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.136160 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.136037 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.136044 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:47:05 crc kubenswrapper[4715]: E1009 07:47:05.136313 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:47:05 crc kubenswrapper[4715]: E1009 07:47:05.136575 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:47:05 crc kubenswrapper[4715]: E1009 07:47:05.136723 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:47:05 crc kubenswrapper[4715]: E1009 07:47:05.136844 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.163794 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.163856 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.163874 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.163895 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.163909 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:05Z","lastTransitionTime":"2025-10-09T07:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.266621 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.266689 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.266713 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.266743 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.266766 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:05Z","lastTransitionTime":"2025-10-09T07:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.370019 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.370066 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.370077 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.370094 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.370106 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:05Z","lastTransitionTime":"2025-10-09T07:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.473168 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.473242 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.473259 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.473285 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.473305 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:05Z","lastTransitionTime":"2025-10-09T07:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.520061 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.541656 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:05Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.560195 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8875bf33dca9b2d1d7bf66aaeb2fa239b455ea46d1e6790a9f6e1c5c2da2ec6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T07:47:05Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.577053 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.577123 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.577142 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.577169 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.577193 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:05Z","lastTransitionTime":"2025-10-09T07:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.580652 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6vp75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz46q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6vp75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:05Z 
is after 2025-08-24T17:21:41Z" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.599913 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqt86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c54c0f2-0671-4f29-a4b8-7ea32758200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a835e316a2f8a0cc8bf44d5edd66b376fd20a6f7bf6a467a611e04e5fcc9993f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkfzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqt86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:05Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.614998 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fm6s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8fb3b8-b254-4bc3-b105-990eac79c77b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbsl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbsl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fm6s2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:05Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.631295 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5tfxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a186a549-1c86-4777-97e8-04df48fad842\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1312ab6651462ae52831c89894987a598b1623159dddca34a4848dfbc86191ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T
07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdktp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5tfxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:05Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.644552 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ksbvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2978fac0aaadeb9ab4b6ecfc9249a28d011c2f6fe50e3528e008e08df338f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97crn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3111e48e9ab42467dbae06523e433e0f52ac
e4f6552d43674fa52010d57b409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97crn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ksbvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:05Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.661226 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafd807-8875-4b4f-aba9-4f807ca336e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1277c6a868bcd62e2cfc7dda77ccba4f206f4216eec40ceb53ed8c09aebd5eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eab9be18db2c21136a797167f3282bba0639147
e04085d9c930fe113cd5bc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7vwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:05Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.680200 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.680252 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.680262 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:05 crc 
kubenswrapper[4715]: I1009 07:47:05.680280 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.680292 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:05Z","lastTransitionTime":"2025-10-09T07:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.684267 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8095fd96-32bb-459e-b524-6cf679b95b21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc320b6b98a82e720d488ce9958599e2f732919ac43ccb3834e5dd90042077ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7424a86e3801e7aea51cf175c8cbb65ae15a4df07426022cf9e4ba6b82c13924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://149ab2506eb7fd28879c9734c5189259cde574afb0a4f7708b0b84c5a514c996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a96e0c2dc207504189aac5f2822e4fc8fdc58a19388a3d081553ecec07f03bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0bc91552a8f6c9f83684aa851ef1b07fa4562c736427c3264762f4486b65c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/
\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:05Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.705848 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f4f451-5ba1-439c-9987-d2d8d37129e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab9492d73e1ced7e8b9dcfbf64ede97fb7c53def5e290efe2320d37d5f8a3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94dc3b7cc39c67b95708f5a4b7d2bcf103c565c5c868684fa838816e882c720\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86bd2df729ce7029714c942828cff7e13c738eb5d918fc7dfdefe16e5420fc98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f2c6cc41c3fcb7aa04475aef503dfa481735d7d591632251226133ffa9cfec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T07:46:26Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1009 07:46:26.195650 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 07:46:26.195886 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 07:46:26.197650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1771579011/tls.crt::/tmp/serving-cert-1771579011/tls.key\\\\\\\"\\\\nI1009 07:46:26.707018 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 07:46:26.710937 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 07:46:26.710964 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 07:46:26.710986 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 07:46:26.710992 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 07:46:26.721297 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 07:46:26.721350 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721363 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721375 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 07:46:26.721386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1009 07:46:26.721377 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 07:46:26.721396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 07:46:26.721462 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 07:46:26.723740 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14232d9805b9847774597840c84b29709285393122781fe95af059e50c285ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb1
6a355d3cfece6bae7f7121c5c5ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:05Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.727143 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1166d9eb763c499c126069c02d693a608549e5cbb8d4862551b7555100324b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:05Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.752672 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:05Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.774446 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:05Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.783392 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.783454 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.783467 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.783486 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.783501 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:05Z","lastTransitionTime":"2025-10-09T07:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.801094 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a1a53d8-70da-4f6d-b92f-801a563952ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19465e3367078df139314e3b29a1b05d15c7ab22cb681c92e2a0394aaaaf887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b8a525d8b7ec3e08d688a4f5419e937a01e5dfa1de58caa9e3fad5ee5ed593f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8906a42b46d23c122035098bfd88203a6418fe2e0ef806e7babbc9670e2c89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d17b0d82be9febaeb884dea2cfb61c5f189c0fce2aff03c02bbf020d89828f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:05Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.821034 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"601e8bbc-736f-4fd6-a5db-acf0c0680140\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7b63b0a66948efeeb8afe2b17b5e2461b54aa7fcbd7eea11181fd3e077f878e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dabdb403ae3d1cad8d766a205299375905e6851f89a3022ec1468ba6ad7f463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c29c93e1f45a5b0592ac77d5f064cff563130da8019669a013ad65026ca46474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cb5215e2ab354a950cbd77ed11f48001aee890b171fd4f3ee9823f5fa4dcf37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://4cb5215e2ab354a950cbd77ed11f48001aee890b171fd4f3ee9823f5fa4dcf37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:05Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.838931 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://770b320ad49f63618e01bc73df4df10cb694b01d658727bb395ff59e6a609442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://958c52c695933700cd3b19f8c6539c5566827f57a22ed1fea9b6326e2261f673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:05Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.879039 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76f34f31-285e-4f90-954d-888a59ad6080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4996d81a0257313b571696eae1c0c7a590b2282472852505b7f60ab07ae4e7fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d905d
da0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d905dda0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8gf4x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:05Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.886299 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.886359 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.886377 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.886402 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.886442 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:05Z","lastTransitionTime":"2025-10-09T07:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.905807 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6cb14a-7329-4a80-aff2-acd9142558d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad12f64b1c0fdde9a522e1865b3e364da8fd7260057d3d3077d60cb82b9c258\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad12f64b1c0fdde9a522e1865b3e364da8fd7260057d3d3077d60cb82b9c258\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T07:46:42Z\\\",\\\"message\\\":\\\"itches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1009 07:46:42.692267 6122 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1009 07:46:42.692108 6122 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in 
node crc\\\\nI1009 07:46:42.692439 6122 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}\\\\nI1009 07:46:42.692454 6122 services_controller.go:360] Finished syncing service api on namespace openshift-apiserver for network=default : 1.506102ms\\\\nI1009 07:46:42.692466 6122 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI1009 07:46:42.692467 6122 services_controller.go:356] Processing sync for service openshift-ingress/router-internal-default for network=default\\\\nF1009 07:46:42.692492 6122 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z9ztn_openshift-ovn-kubernetes(1d6cb14a-7329-4a80-aff2-acd9142558d3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0ca
cf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z9ztn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:05Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.989632 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.989684 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.989703 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.989730 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:05 crc kubenswrapper[4715]: I1009 07:47:05.989753 4715 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:05Z","lastTransitionTime":"2025-10-09T07:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.092727 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.092774 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.092783 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.092799 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.092810 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:06Z","lastTransitionTime":"2025-10-09T07:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.137796 4715 scope.go:117] "RemoveContainer" containerID="6ad12f64b1c0fdde9a522e1865b3e364da8fd7260057d3d3077d60cb82b9c258" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.203046 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.203466 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.203480 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.203500 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.203515 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:06Z","lastTransitionTime":"2025-10-09T07:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.306366 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.306483 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.306505 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.306534 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.306557 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:06Z","lastTransitionTime":"2025-10-09T07:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.408883 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.408917 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.408925 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.408939 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.408947 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:06Z","lastTransitionTime":"2025-10-09T07:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.511774 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.511822 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.511841 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.511865 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.511885 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:06Z","lastTransitionTime":"2025-10-09T07:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.535431 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z9ztn_1d6cb14a-7329-4a80-aff2-acd9142558d3/ovnkube-controller/1.log" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.540163 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" event={"ID":"1d6cb14a-7329-4a80-aff2-acd9142558d3","Type":"ContainerStarted","Data":"9e5a47778ea88a7073dbc4a69df923fbb9b8c8e887a2ec5220cd3618633da7da"} Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.540779 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.558301 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:06Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.575288 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8875bf33dca9b2d1d7bf66aaeb2fa239b455ea46d1e6790a9f6e1c5c2da2ec6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T07:47:06Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.596099 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6vp75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz46q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6vp75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T07:47:06Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.609319 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqt86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c54c0f2-0671-4f29-a4b8-7ea32758200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a835e316a2f8a0cc8bf44d5edd66b376fd20a6f7bf6a467a611e04e5fcc9993f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkfzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqt86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:06Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.614094 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.614143 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.614157 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.614176 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.614187 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:06Z","lastTransitionTime":"2025-10-09T07:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.625076 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fm6s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8fb3b8-b254-4bc3-b105-990eac79c77b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbsl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbsl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fm6s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:06Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:06 crc 
kubenswrapper[4715]: I1009 07:47:06.639439 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5tfxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a186a549-1c86-4777-97e8-04df48fad842\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1312ab6651462ae52831c89894987a598b1623159dddca34a4848dfbc86191ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdktp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5tfxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:06Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.654895 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ksbvn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2978fac0aaadeb9ab4b6ecfc9249a28d011c2f6fe50e3528e008e08df338f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:24
2b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97crn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3111e48e9ab42467dbae06523e433e0f52ace4f6552d43674fa52010d57b409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97crn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:44Z\
\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ksbvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:06Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.669970 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:06Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.683945 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafd807-8875-4b4f-aba9-4f807ca336e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1277c6a868bcd62e2cfc7dda77ccba4f206f4216eec40ceb53ed8c09aebd5eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eab9be18db2c21136a797167f3282bba0639147
e04085d9c930fe113cd5bc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7vwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:06Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.708845 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8095fd96-32bb-459e-b524-6cf679b95b21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc320b6b98a82e720d488ce9958599e2f732919ac43ccb3834e5dd90042077ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7424a86e3801e7aea51cf175c8cbb65ae15a4df07426022cf9e4ba6b82c13924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://149ab2506eb7fd28879c9734c5189259cde574afb0a4f7708b0b84c5a514c996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a96e0c2dc207504189aac5f2822e4fc8fdc58a19388a3d081553ecec07f03bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0bc91552a8f6c9f83684aa851ef1b07fa4562c736427c3264762f4486b65c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:06Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.716297 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.716356 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.716370 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.716392 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.716406 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:06Z","lastTransitionTime":"2025-10-09T07:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.726439 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f4f451-5ba1-439c-9987-d2d8d37129e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab9492d73e1ced7e8b9dcfbf64ede97fb7c53def5e290efe2320d37d5f8a3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94dc3b7cc39c67b95708f5a4b7d2bcf103c565c5c868684fa838816e882c720\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86bd2df729ce7029714c942828cff7e13c738eb5d918fc7dfdefe16e5420fc98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f2c6cc41c3fcb7aa04475aef503dfa481735d7d591632251226133ffa9cfec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T07:46:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 07:46:26.195650 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 07:46:26.195886 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 07:46:26.197650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1771579011/tls.crt::/tmp/serving-cert-1771579011/tls.key\\\\\\\"\\\\nI1009 07:46:26.707018 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 07:46:26.710937 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 07:46:26.710964 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 07:46:26.710986 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 07:46:26.710992 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 07:46:26.721297 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 07:46:26.721350 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721363 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721375 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 07:46:26.721386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1009 07:46:26.721377 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 07:46:26.721396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 07:46:26.721462 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 07:46:26.723740 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14232d9805b9847774597840c84b29709285393122781fe95af059e50c285ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3
4720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:06Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.742987 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1166d9eb763c499c126069c02d693a608549e5cbb8d4862551b7555100324b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:06Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.761656 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:06Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.782535 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a1a53d8-70da-4f6d-b92f-801a563952ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19465e3367078df139314e3b29a1b05d15c7ab22cb681c92e2a0394aaaaf887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b8a525d8b7ec3e08d688a4f5419e937a01e5dfa1de58caa9e3fad5ee5ed593f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8906a42b46d23c122035098bfd88203a6418fe2e0ef806e7babbc9670e2c89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d17b0d82be9febaeb884dea2cfb61c5f189c0fce2aff03c02bbf020d89828f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:06Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.797360 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"601e8bbc-736f-4fd6-a5db-acf0c0680140\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7b63b0a66948efeeb8afe2b17b5e2461b54aa7fcbd7eea11181fd3e077f878e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dabdb403ae3d1cad8d766a205299375905e6851f89a3022ec1468ba6ad7f463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c29c93e1f45a5b0592ac77d5f064cff563130da8019669a013ad65026ca46474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cb5215e2ab354a950cbd77ed11f48001aee890b171fd4f3ee9823f5fa4dcf37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://4cb5215e2ab354a950cbd77ed11f48001aee890b171fd4f3ee9823f5fa4dcf37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:06Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.809637 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://770b320ad49f63618e01bc73df4df10cb694b01d658727bb395ff59e6a609442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://958c52c695933700cd3b19f8c6539c5566827f57a22ed1fea9b6326e2261f673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:06Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.819364 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.819401 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.819410 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.819455 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.819465 4715 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:06Z","lastTransitionTime":"2025-10-09T07:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.825865 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76f34f31-285e-4f90-954d-888a59ad6080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4996d81a0257313b571696eae1c0c7a590b2282472852505b7f60ab07ae4e7fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:35Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d905dda0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d905dda0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8gf4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:06Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.847495 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6cb14a-7329-4a80-aff2-acd9142558d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5a47778ea88a7073dbc4a69df923fbb9b8c8e887a2ec5220cd3618633da7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad12f64b1c0fdde9a522e1865b3e364da8fd7260057d3d3077d60cb82b9c258\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T07:46:42Z\\\",\\\"message\\\":\\\"itches:[]string{}, Routers:[]string{}, 
Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1009 07:46:42.692267 6122 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1009 07:46:42.692108 6122 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI1009 07:46:42.692439 6122 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}\\\\nI1009 07:46:42.692454 6122 services_controller.go:360] Finished syncing service api on namespace openshift-apiserver for network=default : 1.506102ms\\\\nI1009 07:46:42.692466 6122 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI1009 07:46:42.692467 6122 services_controller.go:356] Processing sync for service openshift-ingress/router-internal-default for network=default\\\\nF1009 07:46:42.692492 6122 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, 
handle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:47:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z9ztn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:06Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.922566 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.922628 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.922639 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.922661 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:06 crc kubenswrapper[4715]: I1009 07:47:06.922676 4715 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:06Z","lastTransitionTime":"2025-10-09T07:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.026238 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.026294 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.026304 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.026323 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.026335 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:07Z","lastTransitionTime":"2025-10-09T07:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.129724 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.129779 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.129793 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.129813 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.129826 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:07Z","lastTransitionTime":"2025-10-09T07:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.136253 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.136304 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:47:07 crc kubenswrapper[4715]: E1009 07:47:07.136389 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.136253 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.136257 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:47:07 crc kubenswrapper[4715]: E1009 07:47:07.136574 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:47:07 crc kubenswrapper[4715]: E1009 07:47:07.136682 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:47:07 crc kubenswrapper[4715]: E1009 07:47:07.136834 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.230154 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.230199 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.230207 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.230225 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.230237 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:07Z","lastTransitionTime":"2025-10-09T07:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:07 crc kubenswrapper[4715]: E1009 07:47:07.246880 4715 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"88c6bc2d-8227-4dff-bf57-494ec73b39f9\\\",\\\"systemUUID\\\":\\\"25873b5a-8b59-46be-9c14-6241a2c78490\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:07Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.251380 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.251444 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.251466 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.251490 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.251506 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:07Z","lastTransitionTime":"2025-10-09T07:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.273619 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.273692 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.273709 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.273734 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.273745 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:07Z","lastTransitionTime":"2025-10-09T07:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.292233 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.292288 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.292296 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.292313 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.292324 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:07Z","lastTransitionTime":"2025-10-09T07:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:07 crc kubenswrapper[4715]: E1009 07:47:07.317298 4715 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"88c6bc2d-8227-4dff-bf57-494ec73b39f9\\\",\\\"systemUUID\\\":\\\"25873b5a-8b59-46be-9c14-6241a2c78490\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:07Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.323015 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.323074 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.323090 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.323110 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.323123 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:07Z","lastTransitionTime":"2025-10-09T07:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:07 crc kubenswrapper[4715]: E1009 07:47:07.337991 4715 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"88c6bc2d-8227-4dff-bf57-494ec73b39f9\\\",\\\"systemUUID\\\":\\\"25873b5a-8b59-46be-9c14-6241a2c78490\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:07Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:07 crc kubenswrapper[4715]: E1009 07:47:07.338118 4715 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.340508 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.340546 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.340582 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.340603 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.340617 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:07Z","lastTransitionTime":"2025-10-09T07:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.444028 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.444084 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.444095 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.444115 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.444127 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:07Z","lastTransitionTime":"2025-10-09T07:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.547011 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.547077 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.547094 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.547114 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.547126 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:07Z","lastTransitionTime":"2025-10-09T07:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.547249 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z9ztn_1d6cb14a-7329-4a80-aff2-acd9142558d3/ovnkube-controller/2.log" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.548165 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z9ztn_1d6cb14a-7329-4a80-aff2-acd9142558d3/ovnkube-controller/1.log" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.552444 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" event={"ID":"1d6cb14a-7329-4a80-aff2-acd9142558d3","Type":"ContainerDied","Data":"9e5a47778ea88a7073dbc4a69df923fbb9b8c8e887a2ec5220cd3618633da7da"} Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.552384 4715 generic.go:334] "Generic (PLEG): container finished" podID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerID="9e5a47778ea88a7073dbc4a69df923fbb9b8c8e887a2ec5220cd3618633da7da" exitCode=1 Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.552535 4715 scope.go:117] "RemoveContainer" containerID="6ad12f64b1c0fdde9a522e1865b3e364da8fd7260057d3d3077d60cb82b9c258" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.553259 4715 scope.go:117] "RemoveContainer" containerID="9e5a47778ea88a7073dbc4a69df923fbb9b8c8e887a2ec5220cd3618633da7da" Oct 09 07:47:07 crc kubenswrapper[4715]: E1009 07:47:07.553754 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-z9ztn_openshift-ovn-kubernetes(1d6cb14a-7329-4a80-aff2-acd9142558d3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.573213 4715 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a1a53d8-70da-4f6d-b92f-801a563952ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19465e3367078df139314e3b29a1b05d15c7ab22cb681c92e2a0394aaaaf887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b8a525d8b7ec3e08d688a4f5419e937a01e5dfa1de58caa9e3fad5ee5ed593f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1
220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8906a42b46d23c122035098bfd88203a6418fe2e0ef806e7babbc9670e2c89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d17b0d82be9febaeb884dea2cfb61c5f189c0fce2aff03c02bbf020d89828f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controlle
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:07Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.586378 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"601e8bbc-736f-4fd6-a5db-acf0c0680140\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7b63b0a66948efeeb8afe2b17b5e2461b54aa7fcbd7eea11181fd3e077f878e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dabdb403ae3d1cad8d766a205299375905e6851f89a3022ec1468ba6ad7f463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c29c93e1f45a5b0592ac77d5f064cff563130da8019669a013ad65026ca46474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cb5215e2ab354a950cbd77ed11f48001aee890b171fd4f3ee9823f5fa4dcf37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://4cb5215e2ab354a950cbd77ed11f48001aee890b171fd4f3ee9823f5fa4dcf37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:07Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.604782 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://770b320ad49f63618e01bc73df4df10cb694b01d658727bb395ff59e6a609442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://958c52c695933700cd3b19f8c6539c5566827f57a22ed1fea9b6326e2261f673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:07Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.628033 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76f34f31-285e-4f90-954d-888a59ad6080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4996d81a0257313b571696eae1c0c7a590b2282472852505b7f60ab07ae4e7fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d905d
da0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d905dda0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8gf4x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:07Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.650382 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.650467 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.650486 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.650512 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.650530 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:07Z","lastTransitionTime":"2025-10-09T07:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.661145 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6cb14a-7329-4a80-aff2-acd9142558d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5a47778ea88a7073dbc4a69df923fbb9b8c8e887a2ec5220cd3618633da7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad12f64b1c0fdde9a522e1865b3e364da8fd7260057d3d3077d60cb82b9c258\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T07:46:42Z\\\",\\\"message\\\":\\\"itches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1009 07:46:42.692267 6122 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1009 07:46:42.692108 6122 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in 
node crc\\\\nI1009 07:46:42.692439 6122 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/api\\\\\\\"}\\\\nI1009 07:46:42.692454 6122 services_controller.go:360] Finished syncing service api on namespace openshift-apiserver for network=default : 1.506102ms\\\\nI1009 07:46:42.692466 6122 base_network_controller_pods.go:477] [default/openshift-network-console/networking-console-plugin-85b44fc459-gdk6g] creating logical port openshift-network-console_networking-console-plugin-85b44fc459-gdk6g for pod on switch crc\\\\nI1009 07:46:42.692467 6122 services_controller.go:356] Processing sync for service openshift-ingress/router-internal-default for network=default\\\\nF1009 07:46:42.692492 6122 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5a47778ea88a7073dbc4a69df923fbb9b8c8e887a2ec5220cd3618633da7da\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T07:47:06Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1009 07:47:06.977742 6429 address_set.go:302] 
New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1009 07:47:06.977779 6429 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1009 07:47:06.977805 6429 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1009 07:47:06.977869 6429 factory.go:1336] Added *v1.Node event handler 7\\\\nI1009 07:47:06.977905 6429 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1009 07:47:06.978255 6429 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1009 07:47:06.978357 6429 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1009 07:47:06.978395 6429 ovnkube.go:599] Stopped ovnkube\\\\nI1009 07:47:06.978446 6429 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1009 07:47:06.978526 6429 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:47:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7
ffa702a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z9ztn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:07Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.676302 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:07Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.689521 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8875bf33dca9b2d1d7bf66aaeb2fa239b455ea46d1e6790a9f6e1c5c2da2ec6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T07:47:07Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.709660 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6vp75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz46q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6vp75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T07:47:07Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.724482 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqt86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c54c0f2-0671-4f29-a4b8-7ea32758200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a835e316a2f8a0cc8bf44d5edd66b376fd20a6f7bf6a467a611e04e5fcc9993f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkfzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqt86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:07Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.739572 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fm6s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8fb3b8-b254-4bc3-b105-990eac79c77b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbsl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbsl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fm6s2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:07Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.753283 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5tfxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a186a549-1c86-4777-97e8-04df48fad842\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1312ab6651462ae52831c89894987a598b1623159dddca34a4848dfbc86191ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T
07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdktp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5tfxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:07Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.753314 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.753492 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.753523 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.753545 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.753558 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:07Z","lastTransitionTime":"2025-10-09T07:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.773822 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ksbvn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2978fac0aaadeb9ab4b6ecfc9249a28d011c2f6fe50e3528e008e08df338f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metri
cs-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97crn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3111e48e9ab42467dbae06523e433e0f52ace4f6552d43674fa52010d57b409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97crn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ksbvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:07Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.805897 4715 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8095fd96-32bb-459e-b524-6cf679b95b21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc320b6b98a82e720d488ce9958599e2f732919ac43ccb3834e5dd90042077ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7424a86e3801e7aea51cf175c8cbb65ae15a4df07426022cf9e4ba6b82c13924
\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://149ab2506eb7fd28879c9734c5189259cde574afb0a4f7708b0b84c5a514c996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a96e0c2dc207504189aac5f2822e4fc8fdc58a19388a3d081553ecec07f03bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0bc91552a8f6c9f83684aa851ef1b07fa4562c736427c3264762f4486b65c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7
da23fb09059aab53f5934e30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mount
Path\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:07Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.825795 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f4f451-5ba1-439c-9987-d2d8d37129e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab9492d73e1ced7e8b9dcfbf64ede97fb7c53def5e290efe2320d37d5f8a3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2
597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94dc3b7cc39c67b95708f5a4b7d2bcf103c565c5c868684fa838816e882c720\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86bd2df729ce7029714c942828cff7e13c738eb5d918fc7dfdefe16e5420fc98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runni
ng\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f2c6cc41c3fcb7aa04475aef503dfa481735d7d591632251226133ffa9cfec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T07:46:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 07:46:26.195650 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 07:46:26.195886 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 07:46:26.197650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1771579011/tls.crt::/tmp/serving-cert-1771579011/tls.key\\\\\\\"\\\\nI1009 07:46:26.707018 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 07:46:26.710937 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 07:46:26.710964 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 07:46:26.710986 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 07:46:26.710992 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating 
requests\\\\\\\" limit=200\\\\nI1009 07:46:26.721297 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 07:46:26.721350 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721363 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721375 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 07:46:26.721386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1009 07:46:26.721377 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 07:46:26.721396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 07:46:26.721462 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 07:46:26.723740 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14232d9805b9847774597840c84b29709285393122781fe95af059e50c285ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:07Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.844689 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1166d9eb763c499c126069c02d693a608549e5cbb8d4862551b7555100324b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:07Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.856910 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.856974 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.856985 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.857005 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.857017 4715 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:07Z","lastTransitionTime":"2025-10-09T07:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.863510 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:07Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.883774 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:07Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.903813 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafd807-8875-4b4f-aba9-4f807ca336e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1277c6a868bcd62e2cfc7dda77ccba4f206f4216eec40ceb53ed8c09aebd5eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eab9be18db2c21136a797167f3282bba0639147
e04085d9c930fe113cd5bc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7vwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:07Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.960362 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.960466 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.960485 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:07 crc 
kubenswrapper[4715]: I1009 07:47:07.960511 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:07 crc kubenswrapper[4715]: I1009 07:47:07.960530 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:07Z","lastTransitionTime":"2025-10-09T07:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.063384 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.063487 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.063505 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.063531 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.063549 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:08Z","lastTransitionTime":"2025-10-09T07:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.166307 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.166393 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.166447 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.166470 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.166489 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:08Z","lastTransitionTime":"2025-10-09T07:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.269228 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.269292 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.269312 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.269336 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.269354 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:08Z","lastTransitionTime":"2025-10-09T07:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.372154 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.372188 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.372200 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.372216 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.372228 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:08Z","lastTransitionTime":"2025-10-09T07:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.475774 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.475850 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.475876 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.475906 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.475939 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:08Z","lastTransitionTime":"2025-10-09T07:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.560740 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z9ztn_1d6cb14a-7329-4a80-aff2-acd9142558d3/ovnkube-controller/2.log" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.570865 4715 scope.go:117] "RemoveContainer" containerID="9e5a47778ea88a7073dbc4a69df923fbb9b8c8e887a2ec5220cd3618633da7da" Oct 09 07:47:08 crc kubenswrapper[4715]: E1009 07:47:08.571299 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-z9ztn_openshift-ovn-kubernetes(1d6cb14a-7329-4a80-aff2-acd9142558d3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.578642 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.578700 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.578719 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.578744 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.578763 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:08Z","lastTransitionTime":"2025-10-09T07:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.591171 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:08Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.611010 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8875bf33dca9b2d1d7bf66aaeb2fa239b455ea46d1e6790a9f6e1c5c2da2ec6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T07:47:08Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.631840 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6vp75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz46q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6vp75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T07:47:08Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.647638 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqt86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c54c0f2-0671-4f29-a4b8-7ea32758200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a835e316a2f8a0cc8bf44d5edd66b376fd20a6f7bf6a467a611e04e5fcc9993f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkfzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqt86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:08Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.662911 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fm6s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8fb3b8-b254-4bc3-b105-990eac79c77b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbsl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbsl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fm6s2\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:08Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.679056 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5tfxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a186a549-1c86-4777-97e8-04df48fad842\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1312ab6651462ae52831c89894987a598b1623159dddca34a4848dfbc86191ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T
07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdktp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5tfxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:08Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.681543 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.681589 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.681606 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.681635 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.681653 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:08Z","lastTransitionTime":"2025-10-09T07:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.695516 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ksbvn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2978fac0aaadeb9ab4b6ecfc9249a28d011c2f6fe50e3528e008e08df338f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metri
cs-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97crn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3111e48e9ab42467dbae06523e433e0f52ace4f6552d43674fa52010d57b409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97crn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ksbvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:08Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.713171 4715 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:08Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.730254 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafd807-8875-4b4f-aba9-4f807ca336e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1277c6a868bcd62e2cfc7dda77ccba4f206f4216eec40ceb53ed8c09aebd5eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eab9be18db2c21136a797167f3282bba0639147
e04085d9c930fe113cd5bc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7vwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:08Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.761381 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8095fd96-32bb-459e-b524-6cf679b95b21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc320b6b98a82e720d488ce9958599e2f732919ac43ccb3834e5dd90042077ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7424a86e3801e7aea51cf175c8cbb65ae15a4df07426022cf9e4ba6b82c13924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://149ab2506eb7fd28879c9734c5189259cde574afb0a4f7708b0b84c5a514c996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a96e0c2dc207504189aac5f2822e4fc8fdc58a19388a3d081553ecec07f03bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0bc91552a8f6c9f83684aa851ef1b07fa4562c736427c3264762f4486b65c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:08Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.777597 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f4f451-5ba1-439c-9987-d2d8d37129e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab9492d73e1ced7e8b9dcfbf64ede97fb7c53def5e290efe2320d37d5f8a3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94dc3b7cc39c67b95708f5a4b7d2bcf103c565c5c868684fa838816e882c720\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86bd2df729ce7029714c942828cff7e13c738eb5d918fc7dfdefe16e5420fc98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f2c6cc41c3fcb7aa04475aef503dfa481735d7d591632251226133ffa9cfec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T07:46:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 07:46:26.195650 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 07:46:26.195886 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 07:46:26.197650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1771579011/tls.crt::/tmp/serving-cert-1771579011/tls.key\\\\\\\"\\\\nI1009 07:46:26.707018 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 07:46:26.710937 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 07:46:26.710964 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 07:46:26.710986 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 07:46:26.710992 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 07:46:26.721297 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 07:46:26.721350 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721363 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721375 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 07:46:26.721386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1009 07:46:26.721377 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 07:46:26.721396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 07:46:26.721462 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 07:46:26.723740 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14232d9805b9847774597840c84b29709285393122781fe95af059e50c285ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:08Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.784276 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.784332 4715 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.784343 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.784360 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.784372 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:08Z","lastTransitionTime":"2025-10-09T07:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.801643 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1166d9eb763c499c126069c02d693a608549e5cbb8d4862551b7555100324b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:08Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.853925 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:08Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.875994 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a1a53d8-70da-4f6d-b92f-801a563952ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19465e3367078df139314e3b29a1b05d15c7ab22cb681c92e2a0394aaaaf887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b8a525d8b7ec3e08d688a4f5419e937a01e5dfa1de58caa9e3fad5ee5ed593f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8906a42b46d23c122035098bfd88203a6418fe2e0ef806e7babbc9670e2c89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d17b0d82be9febaeb884dea2cfb61c5f189c0fce2aff03c02bbf020d89828f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:08Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.886755 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.886793 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.886829 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.886850 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.886862 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:08Z","lastTransitionTime":"2025-10-09T07:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.891668 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"601e8bbc-736f-4fd6-a5db-acf0c0680140\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7b63b0a66948efeeb8afe2b17b5e2461b54aa7fcbd7eea11181fd3e077f878e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://9dabdb403ae3d1cad8d766a205299375905e6851f89a3022ec1468ba6ad7f463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c29c93e1f45a5b0592ac77d5f064cff563130da8019669a013ad65026ca46474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cb5215e2ab354a950cbd77ed11f48001aee890b171fd4f3ee9823f5fa4dcf37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\
",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cb5215e2ab354a950cbd77ed11f48001aee890b171fd4f3ee9823f5fa4dcf37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:08Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.905824 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://770b320ad49f63618e01bc73df4df10cb694b01d658727bb395ff59e6a609442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://958c52c695933700cd3b19f8c6539c5566827f57a22ed1fea9b6326e2261f673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:08Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.925366 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76f34f31-285e-4f90-954d-888a59ad6080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4996d81a0257313b571696eae1c0c7a590b2282472852505b7f60ab07ae4e7fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d905d
da0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d905dda0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8gf4x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:08Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.947146 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6cb14a-7329-4a80-aff2-acd9142558d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5a47778ea88a7073dbc4a69df923fbb9b8c8e887a2ec5220cd3618633da7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5a47778ea88a7073dbc4a69df923fbb9b8c8e887a2ec5220cd3618633da7da\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T07:47:06Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1009 
07:47:06.977742 6429 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1009 07:47:06.977779 6429 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1009 07:47:06.977805 6429 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1009 07:47:06.977869 6429 factory.go:1336] Added *v1.Node event handler 7\\\\nI1009 07:47:06.977905 6429 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1009 07:47:06.978255 6429 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1009 07:47:06.978357 6429 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1009 07:47:06.978395 6429 ovnkube.go:599] Stopped ovnkube\\\\nI1009 07:47:06.978446 6429 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1009 07:47:06.978526 6429 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:47:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z9ztn_openshift-ovn-kubernetes(1d6cb14a-7329-4a80-aff2-acd9142558d3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0ca
cf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z9ztn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:08Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.989194 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.989279 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.989289 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.989308 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:08 crc kubenswrapper[4715]: I1009 07:47:08.989319 4715 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:08Z","lastTransitionTime":"2025-10-09T07:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.092335 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.092401 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.092449 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.092477 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.092504 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:09Z","lastTransitionTime":"2025-10-09T07:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.136314 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.136360 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.136554 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.136639 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:47:09 crc kubenswrapper[4715]: E1009 07:47:09.136733 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:47:09 crc kubenswrapper[4715]: E1009 07:47:09.136813 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:47:09 crc kubenswrapper[4715]: E1009 07:47:09.136947 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:47:09 crc kubenswrapper[4715]: E1009 07:47:09.137086 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.195873 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.195951 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.195972 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.195997 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.196016 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:09Z","lastTransitionTime":"2025-10-09T07:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.298794 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.298863 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.298887 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.298922 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.298943 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:09Z","lastTransitionTime":"2025-10-09T07:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.401596 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.401669 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.401686 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.401711 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.401728 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:09Z","lastTransitionTime":"2025-10-09T07:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.504604 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.504667 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.504688 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.504714 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.504733 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:09Z","lastTransitionTime":"2025-10-09T07:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.607760 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.607833 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.607856 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.607885 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.607904 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:09Z","lastTransitionTime":"2025-10-09T07:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.711179 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.711245 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.711263 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.711283 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.711295 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:09Z","lastTransitionTime":"2025-10-09T07:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.814275 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.814324 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.814340 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.814359 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.814371 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:09Z","lastTransitionTime":"2025-10-09T07:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.918042 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.918128 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.918149 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.918175 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:09 crc kubenswrapper[4715]: I1009 07:47:09.918193 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:09Z","lastTransitionTime":"2025-10-09T07:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.021102 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.021171 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.021189 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.021214 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.021229 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:10Z","lastTransitionTime":"2025-10-09T07:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.123547 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.123802 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.123907 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.123933 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.123949 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:10Z","lastTransitionTime":"2025-10-09T07:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.153793 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1166d9eb763c499c126069c02d693a608549e5cbb8d4862551b7555100324b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:10Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.169470 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:10Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.181858 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:10Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.192861 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafd807-8875-4b4f-aba9-4f807ca336e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1277c6a868bcd62e2cfc7dda77ccba4f206f4216eec40ceb53ed8c09aebd5eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eab9be18db2c21136a797167f3282bba0639147
e04085d9c930fe113cd5bc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7vwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:10Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.221971 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8095fd96-32bb-459e-b524-6cf679b95b21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc320b6b98a82e720d488ce9958599e2f732919ac43ccb3834e5dd90042077ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7424a86e3801e7aea51cf175c8cbb65ae15a4df07426022cf9e4ba6b82c13924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://149ab2506eb7fd28879c9734c5189259cde574afb0a4f7708b0b84c5a514c996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a96e0c2dc207504189aac5f2822e4fc8fdc58a19388a3d081553ecec07f03bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0bc91552a8f6c9f83684aa851ef1b07fa4562c736427c3264762f4486b65c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:10Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.226489 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.226528 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.226537 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.226553 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.226563 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:10Z","lastTransitionTime":"2025-10-09T07:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.237414 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f4f451-5ba1-439c-9987-d2d8d37129e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab9492d73e1ced7e8b9dcfbf64ede97fb7c53def5e290efe2320d37d5f8a3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94dc3b7cc39c67b95708f5a4b7d2bcf103c565c5c868684fa838816e882c720\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86bd2df729ce7029714c942828cff7e13c738eb5d918fc7dfdefe16e5420fc98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f2c6cc41c3fcb7aa04475aef503dfa481735d7d591632251226133ffa9cfec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T07:46:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 07:46:26.195650 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 07:46:26.195886 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 07:46:26.197650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1771579011/tls.crt::/tmp/serving-cert-1771579011/tls.key\\\\\\\"\\\\nI1009 07:46:26.707018 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 07:46:26.710937 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 07:46:26.710964 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 07:46:26.710986 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 07:46:26.710992 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 07:46:26.721297 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 07:46:26.721350 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721363 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721375 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 07:46:26.721386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1009 07:46:26.721377 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 07:46:26.721396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 07:46:26.721462 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 07:46:26.723740 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14232d9805b9847774597840c84b29709285393122781fe95af059e50c285ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3
4720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:10Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.259486 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76f34f31-285e-4f90-954d-888a59ad6080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4996d81a0257313b571696eae1c0c7a590b2282472852505b7f60ab07ae4e7fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d905d
da0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d905dda0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8gf4x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:10Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.289007 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6cb14a-7329-4a80-aff2-acd9142558d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5a47778ea88a7073dbc4a69df923fbb9b8c8e887a2ec5220cd3618633da7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5a47778ea88a7073dbc4a69df923fbb9b8c8e887a2ec5220cd3618633da7da\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T07:47:06Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1009 
07:47:06.977742 6429 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1009 07:47:06.977779 6429 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1009 07:47:06.977805 6429 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1009 07:47:06.977869 6429 factory.go:1336] Added *v1.Node event handler 7\\\\nI1009 07:47:06.977905 6429 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1009 07:47:06.978255 6429 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1009 07:47:06.978357 6429 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1009 07:47:06.978395 6429 ovnkube.go:599] Stopped ovnkube\\\\nI1009 07:47:06.978446 6429 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1009 07:47:06.978526 6429 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:47:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z9ztn_openshift-ovn-kubernetes(1d6cb14a-7329-4a80-aff2-acd9142558d3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0ca
cf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z9ztn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:10Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.306695 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a1a53d8-70da-4f6d-b92f-801a563952ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19465e3367078df139314e3b29a1b05d15c7ab22cb681c92e2a0394aaaaf887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b8a525d8b7ec3e08d688a4f5419e937a01e5dfa1de58caa9e3fad5ee5ed593f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8906a42b46d23c122035098bfd88203a6418fe2e0ef806e7babbc9670e2c89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d17b0d82be9febaeb884dea2cfb61c5f189c0fce2aff03c02bbf020d89828f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:10Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.323566 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"601e8bbc-736f-4fd6-a5db-acf0c0680140\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7b63b0a66948efeeb8afe2b17b5e2461b54aa7fcbd7eea11181fd3e077f878e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dabdb403ae3d1cad8d766a205299375905e6851f89a3022ec1468ba6ad7f463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c29c93e1f45a5b0592ac77d5f064cff563130da8019669a013ad65026ca46474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cb5215e2ab354a950cbd77ed11f48001aee890b171fd4f3ee9823f5fa4dcf37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://4cb5215e2ab354a950cbd77ed11f48001aee890b171fd4f3ee9823f5fa4dcf37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:10Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.329465 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.329510 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.329528 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.329554 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.329571 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:10Z","lastTransitionTime":"2025-10-09T07:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.341095 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://770b320ad49f63618e01bc73df4df10cb694b01d658727bb395ff59e6a609442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://958c52c695933700cd3b19f8c6539c5566827f57a22ed1fea9b6326e2261f673\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:10Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.356510 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqt86" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c54c0f2-0671-4f29-a4b8-7ea32758200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a835e316a2f8a0cc8bf44d5edd66b376fd20a6f7bf6a467a611e04e5fcc9993f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkfzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqt86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:10Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.377843 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fm6s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8fb3b8-b254-4bc3-b105-990eac79c77b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbsl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbsl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fm6s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:10Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:10 crc 
kubenswrapper[4715]: I1009 07:47:10.399087 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:10Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.413551 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8875bf33dca9b2d1d7bf66aaeb2fa239b455ea46d1e6790a9f6e1c5c2da2ec6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T07:47:10Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.427195 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6vp75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz46q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6vp75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T07:47:10Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.432697 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.432745 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.432757 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.432773 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.432785 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:10Z","lastTransitionTime":"2025-10-09T07:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.439071 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5tfxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a186a549-1c86-4777-97e8-04df48fad842\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1312ab6651462ae52831c89894987a598b1623159dddca34a4848dfbc86191ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdktp\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5tfxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:10Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.453982 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ksbvn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2978fac0aaadeb9ab4b6ecfc9249a28d011c2f6fe50e3528e008e08df338f304\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97crn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3111e48e9ab42467dbae06523e433e0f52ace4f6552d43674fa52010d57b409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97crn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ksbvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:10Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.535339 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.535408 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.535460 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.535494 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.535515 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:10Z","lastTransitionTime":"2025-10-09T07:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.639754 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.640155 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.640214 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.640248 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.640271 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:10Z","lastTransitionTime":"2025-10-09T07:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.743298 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.743371 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.743393 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.743458 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.743486 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:10Z","lastTransitionTime":"2025-10-09T07:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.846825 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.846882 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.846892 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.846908 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.846922 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:10Z","lastTransitionTime":"2025-10-09T07:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.950325 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.950355 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.950364 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.950377 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:10 crc kubenswrapper[4715]: I1009 07:47:10.950386 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:10Z","lastTransitionTime":"2025-10-09T07:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.054318 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.054376 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.054386 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.054405 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.054445 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:11Z","lastTransitionTime":"2025-10-09T07:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.136487 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.136574 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:47:11 crc kubenswrapper[4715]: E1009 07:47:11.136669 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.136492 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.136855 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:47:11 crc kubenswrapper[4715]: E1009 07:47:11.137072 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:47:11 crc kubenswrapper[4715]: E1009 07:47:11.137233 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:47:11 crc kubenswrapper[4715]: E1009 07:47:11.137365 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.157120 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.157191 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.157212 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.157246 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.157270 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:11Z","lastTransitionTime":"2025-10-09T07:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.260744 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.260803 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.260819 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.260841 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.260854 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:11Z","lastTransitionTime":"2025-10-09T07:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.364950 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.365033 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.365050 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.365076 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.365093 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:11Z","lastTransitionTime":"2025-10-09T07:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.468558 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.468657 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.468683 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.468722 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.468745 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:11Z","lastTransitionTime":"2025-10-09T07:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.571354 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.571413 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.571453 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.571473 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.571485 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:11Z","lastTransitionTime":"2025-10-09T07:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.674284 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.674380 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.674407 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.674473 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.674496 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:11Z","lastTransitionTime":"2025-10-09T07:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.777272 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.777324 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.777345 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.777374 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.777397 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:11Z","lastTransitionTime":"2025-10-09T07:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.880641 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.880705 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.880722 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.880748 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.880772 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:11Z","lastTransitionTime":"2025-10-09T07:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.983892 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.983955 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.983968 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.983989 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:11 crc kubenswrapper[4715]: I1009 07:47:11.984005 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:11Z","lastTransitionTime":"2025-10-09T07:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.086635 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.086696 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.086712 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.086736 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.086751 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:12Z","lastTransitionTime":"2025-10-09T07:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.194645 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.194705 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.194718 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.194737 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.194751 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:12Z","lastTransitionTime":"2025-10-09T07:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.298415 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.298521 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.298542 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.298570 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.298589 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:12Z","lastTransitionTime":"2025-10-09T07:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.402015 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.402087 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.402109 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.402145 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.402167 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:12Z","lastTransitionTime":"2025-10-09T07:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.505296 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.505358 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.505375 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.505400 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.505444 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:12Z","lastTransitionTime":"2025-10-09T07:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.608503 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.608541 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.608550 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.608565 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.608576 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:12Z","lastTransitionTime":"2025-10-09T07:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.712140 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.712208 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.712226 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.712749 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.712819 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:12Z","lastTransitionTime":"2025-10-09T07:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.816709 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.816755 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.816770 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.816791 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.816805 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:12Z","lastTransitionTime":"2025-10-09T07:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.920768 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.920814 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.920829 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.920848 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:12 crc kubenswrapper[4715]: I1009 07:47:12.920863 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:12Z","lastTransitionTime":"2025-10-09T07:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.024855 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.024925 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.024947 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.024977 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.025000 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:13Z","lastTransitionTime":"2025-10-09T07:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.128162 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.128197 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.128206 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.128222 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.128231 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:13Z","lastTransitionTime":"2025-10-09T07:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.135866 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.135906 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.135938 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.135942 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:47:13 crc kubenswrapper[4715]: E1009 07:47:13.136040 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:47:13 crc kubenswrapper[4715]: E1009 07:47:13.136149 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:47:13 crc kubenswrapper[4715]: E1009 07:47:13.136265 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:47:13 crc kubenswrapper[4715]: E1009 07:47:13.136358 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.231890 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.231952 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.231973 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.231998 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.232015 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:13Z","lastTransitionTime":"2025-10-09T07:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.334720 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.334809 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.334833 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.334876 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.334924 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:13Z","lastTransitionTime":"2025-10-09T07:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.437929 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.438006 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.438024 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.438050 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.438068 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:13Z","lastTransitionTime":"2025-10-09T07:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.541442 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.541491 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.541500 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.541518 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.541530 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:13Z","lastTransitionTime":"2025-10-09T07:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.643812 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.643868 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.643880 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.643896 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.643908 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:13Z","lastTransitionTime":"2025-10-09T07:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.746590 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.746673 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.746684 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.746708 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.746722 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:13Z","lastTransitionTime":"2025-10-09T07:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.849159 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.849213 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.849230 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.849251 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.849266 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:13Z","lastTransitionTime":"2025-10-09T07:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.952840 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.952885 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.952898 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.952915 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:13 crc kubenswrapper[4715]: I1009 07:47:13.952928 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:13Z","lastTransitionTime":"2025-10-09T07:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.055977 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.056029 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.056046 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.056071 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.056088 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:14Z","lastTransitionTime":"2025-10-09T07:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.158117 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.158157 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.158171 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.158188 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.158201 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:14Z","lastTransitionTime":"2025-10-09T07:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.261094 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.261136 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.261147 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.261166 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.261177 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:14Z","lastTransitionTime":"2025-10-09T07:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.364930 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.364988 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.365005 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.365030 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.365045 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:14Z","lastTransitionTime":"2025-10-09T07:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.467678 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.467736 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.467752 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.467776 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.467792 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:14Z","lastTransitionTime":"2025-10-09T07:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.570587 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.570642 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.570653 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.570672 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.570684 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:14Z","lastTransitionTime":"2025-10-09T07:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.673456 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.673801 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.673875 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.673945 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.674027 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:14Z","lastTransitionTime":"2025-10-09T07:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.776601 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.776663 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.776676 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.776696 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.776715 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:14Z","lastTransitionTime":"2025-10-09T07:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.879395 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.879468 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.879480 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.879501 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.879515 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:14Z","lastTransitionTime":"2025-10-09T07:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.982550 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.982611 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.982622 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.982642 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:14 crc kubenswrapper[4715]: I1009 07:47:14.982656 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:14Z","lastTransitionTime":"2025-10-09T07:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.085287 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.085345 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.085357 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.085381 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.085397 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:15Z","lastTransitionTime":"2025-10-09T07:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.136844 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.136972 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.137022 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.137000 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:47:15 crc kubenswrapper[4715]: E1009 07:47:15.137243 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:47:15 crc kubenswrapper[4715]: E1009 07:47:15.137388 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:47:15 crc kubenswrapper[4715]: E1009 07:47:15.137532 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:47:15 crc kubenswrapper[4715]: E1009 07:47:15.137635 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.188612 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.188684 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.188707 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.188733 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.188753 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:15Z","lastTransitionTime":"2025-10-09T07:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.292483 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.292576 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.292603 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.292637 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.292662 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:15Z","lastTransitionTime":"2025-10-09T07:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.396411 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.396489 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.396505 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.396526 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.396542 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:15Z","lastTransitionTime":"2025-10-09T07:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.499532 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.499604 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.499618 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.499636 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.499674 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:15Z","lastTransitionTime":"2025-10-09T07:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.602865 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.602993 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.603010 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.603032 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.603046 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:15Z","lastTransitionTime":"2025-10-09T07:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.706476 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.706530 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.706547 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.706571 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.706588 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:15Z","lastTransitionTime":"2025-10-09T07:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.809881 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.809922 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.809931 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.809948 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.809959 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:15Z","lastTransitionTime":"2025-10-09T07:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.913041 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.913091 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.913103 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.913123 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:15 crc kubenswrapper[4715]: I1009 07:47:15.913135 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:15Z","lastTransitionTime":"2025-10-09T07:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.015867 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.015927 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.015950 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.015980 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.016004 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:16Z","lastTransitionTime":"2025-10-09T07:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.119126 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.119161 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.119172 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.119205 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.119216 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:16Z","lastTransitionTime":"2025-10-09T07:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.151735 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.220864 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.220896 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.220906 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.220921 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.220931 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:16Z","lastTransitionTime":"2025-10-09T07:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.323690 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.323731 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.323743 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.323760 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.323771 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:16Z","lastTransitionTime":"2025-10-09T07:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.425961 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.426266 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.426356 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.426479 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.426591 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:16Z","lastTransitionTime":"2025-10-09T07:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.529399 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.530091 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.530133 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.530163 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.530194 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:16Z","lastTransitionTime":"2025-10-09T07:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.632510 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.632557 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.632571 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.632598 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.632632 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:16Z","lastTransitionTime":"2025-10-09T07:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.735531 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.735599 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.735618 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.735644 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.735664 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:16Z","lastTransitionTime":"2025-10-09T07:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.838592 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.838663 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.838686 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.838718 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.838744 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:16Z","lastTransitionTime":"2025-10-09T07:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.942078 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.942163 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.942188 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.942219 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:16 crc kubenswrapper[4715]: I1009 07:47:16.942242 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:16Z","lastTransitionTime":"2025-10-09T07:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.045112 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.045178 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.045188 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.045208 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.045223 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:17Z","lastTransitionTime":"2025-10-09T07:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.136706 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.136749 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.136780 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.136798 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:47:17 crc kubenswrapper[4715]: E1009 07:47:17.136913 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:47:17 crc kubenswrapper[4715]: E1009 07:47:17.136994 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:47:17 crc kubenswrapper[4715]: E1009 07:47:17.137180 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:47:17 crc kubenswrapper[4715]: E1009 07:47:17.137316 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.147383 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.147412 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.147441 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.147454 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.147464 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:17Z","lastTransitionTime":"2025-10-09T07:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.250526 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.250573 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.250582 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.250599 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.250610 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:17Z","lastTransitionTime":"2025-10-09T07:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.353620 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.353671 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.353684 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.353702 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.353714 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:17Z","lastTransitionTime":"2025-10-09T07:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.423792 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a8fb3b8-b254-4bc3-b105-990eac79c77b-metrics-certs\") pod \"network-metrics-daemon-fm6s2\" (UID: \"9a8fb3b8-b254-4bc3-b105-990eac79c77b\") " pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:47:17 crc kubenswrapper[4715]: E1009 07:47:17.423956 4715 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 07:47:17 crc kubenswrapper[4715]: E1009 07:47:17.424061 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a8fb3b8-b254-4bc3-b105-990eac79c77b-metrics-certs podName:9a8fb3b8-b254-4bc3-b105-990eac79c77b nodeName:}" failed. No retries permitted until 2025-10-09 07:47:49.424035924 +0000 UTC m=+100.116839952 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9a8fb3b8-b254-4bc3-b105-990eac79c77b-metrics-certs") pod "network-metrics-daemon-fm6s2" (UID: "9a8fb3b8-b254-4bc3-b105-990eac79c77b") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.456986 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.457030 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.457042 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.457061 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.457073 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:17Z","lastTransitionTime":"2025-10-09T07:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.541634 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.541689 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.541706 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.541730 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.541750 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:17Z","lastTransitionTime":"2025-10-09T07:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:17 crc kubenswrapper[4715]: E1009 07:47:17.557300 4715 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"88c6bc2d-8227-4dff-bf57-494ec73b39f9\\\",\\\"systemUUID\\\":\\\"25873b5a-8b59-46be-9c14-6241a2c78490\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:17Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.561524 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.561575 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.561587 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.561603 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.561613 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:17Z","lastTransitionTime":"2025-10-09T07:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:17 crc kubenswrapper[4715]: E1009 07:47:17.579814 4715 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"88c6bc2d-8227-4dff-bf57-494ec73b39f9\\\",\\\"systemUUID\\\":\\\"25873b5a-8b59-46be-9c14-6241a2c78490\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:17Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.584685 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.584750 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.584768 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.584793 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.584811 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:17Z","lastTransitionTime":"2025-10-09T07:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:17 crc kubenswrapper[4715]: E1009 07:47:17.602887 4715 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"88c6bc2d-8227-4dff-bf57-494ec73b39f9\\\",\\\"systemUUID\\\":\\\"25873b5a-8b59-46be-9c14-6241a2c78490\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:17Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.608778 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.608831 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.608847 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.608869 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.608887 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:17Z","lastTransitionTime":"2025-10-09T07:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:17 crc kubenswrapper[4715]: E1009 07:47:17.623350 4715 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"88c6bc2d-8227-4dff-bf57-494ec73b39f9\\\",\\\"systemUUID\\\":\\\"25873b5a-8b59-46be-9c14-6241a2c78490\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:17Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.627492 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.627532 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.627543 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.627562 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.627575 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:17Z","lastTransitionTime":"2025-10-09T07:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:17 crc kubenswrapper[4715]: E1009 07:47:17.640030 4715 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"88c6bc2d-8227-4dff-bf57-494ec73b39f9\\\",\\\"systemUUID\\\":\\\"25873b5a-8b59-46be-9c14-6241a2c78490\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:17Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:17 crc kubenswrapper[4715]: E1009 07:47:17.640263 4715 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.642623 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.642657 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.642666 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.642681 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.642691 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:17Z","lastTransitionTime":"2025-10-09T07:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.745843 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.745894 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.745905 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.745925 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.745937 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:17Z","lastTransitionTime":"2025-10-09T07:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.848784 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.848831 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.848839 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.848856 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.848867 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:17Z","lastTransitionTime":"2025-10-09T07:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.951925 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.951979 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.951989 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.952008 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:17 crc kubenswrapper[4715]: I1009 07:47:17.952020 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:17Z","lastTransitionTime":"2025-10-09T07:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.054541 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.054579 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.054589 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.054604 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.054615 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:18Z","lastTransitionTime":"2025-10-09T07:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.156399 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.156452 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.156462 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.156477 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.156488 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:18Z","lastTransitionTime":"2025-10-09T07:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.259799 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.259863 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.259883 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.259913 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.259936 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:18Z","lastTransitionTime":"2025-10-09T07:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.362491 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.362559 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.362576 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.362602 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.362614 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:18Z","lastTransitionTime":"2025-10-09T07:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.465778 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.465818 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.465827 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.465844 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.465855 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:18Z","lastTransitionTime":"2025-10-09T07:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.570130 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.570196 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.570208 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.570227 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.570241 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:18Z","lastTransitionTime":"2025-10-09T07:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.673623 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.673667 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.673677 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.673694 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.673710 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:18Z","lastTransitionTime":"2025-10-09T07:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.776433 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.776478 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.776490 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.776510 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.776524 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:18Z","lastTransitionTime":"2025-10-09T07:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.879341 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.879414 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.879465 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.879492 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.879511 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:18Z","lastTransitionTime":"2025-10-09T07:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.983242 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.983287 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.983295 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.983312 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:18 crc kubenswrapper[4715]: I1009 07:47:18.983321 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:18Z","lastTransitionTime":"2025-10-09T07:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.085999 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.086069 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.086091 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.086124 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.086146 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:19Z","lastTransitionTime":"2025-10-09T07:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.135960 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.136017 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.136068 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.135997 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:47:19 crc kubenswrapper[4715]: E1009 07:47:19.136214 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:47:19 crc kubenswrapper[4715]: E1009 07:47:19.136398 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:47:19 crc kubenswrapper[4715]: E1009 07:47:19.136576 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:47:19 crc kubenswrapper[4715]: E1009 07:47:19.136713 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.189454 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.189528 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.189548 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.189577 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.189596 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:19Z","lastTransitionTime":"2025-10-09T07:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.292233 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.292297 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.292309 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.292330 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.292342 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:19Z","lastTransitionTime":"2025-10-09T07:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.395595 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.395663 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.395682 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.395708 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.395727 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:19Z","lastTransitionTime":"2025-10-09T07:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.498854 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.498914 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.498927 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.498946 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.498961 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:19Z","lastTransitionTime":"2025-10-09T07:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.601477 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.601536 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.601548 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.601567 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.601579 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:19Z","lastTransitionTime":"2025-10-09T07:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.612389 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6vp75_6e61f2cb-cd6d-46d6-bbb6-dd99919b893d/kube-multus/0.log" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.612473 4715 generic.go:334] "Generic (PLEG): container finished" podID="6e61f2cb-cd6d-46d6-bbb6-dd99919b893d" containerID="d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9" exitCode=1 Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.612532 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6vp75" event={"ID":"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d","Type":"ContainerDied","Data":"d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9"} Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.613268 4715 scope.go:117] "RemoveContainer" containerID="d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.630381 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fm6s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8fb3b8-b254-4bc3-b105-990eac79c77b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbsl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbsl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fm6s2\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:19Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.650136 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:19Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.671453 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8875bf33dca9b2d1d7bf66aaeb2fa239b455ea46d1e6790a9f6e1c5c2da2ec6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T07:47:19Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.693898 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6vp75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T07:47:19Z\\\",\\\"message\\\":\\\"2025-10-09T07:46:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_987c0460-fa68-459c-8f62-182794f36a65\\\\n2025-10-09T07:46:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_987c0460-fa68-459c-8f62-182794f36a65 to /host/opt/cni/bin/\\\\n2025-10-09T07:46:34Z [verbose] multus-daemon started\\\\n2025-10-09T07:46:34Z [verbose] Readiness Indicator file check\\\\n2025-10-09T07:47:19Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz46q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6vp75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:19Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.704087 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.704210 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.704276 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.704354 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.704440 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:19Z","lastTransitionTime":"2025-10-09T07:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.710621 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqt86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c54c0f2-0671-4f29-a4b8-7ea32758200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a835e316a2f8a0cc8bf44d5edd66b376fd20a6f7bf6a467a611e04e5fcc9993f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkfzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqt86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:19Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.724384 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5tfxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a186a549-1c86-4777-97e8-04df48fad842\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1312ab6651462ae52831c89894987a598b16231
59dddca34a4848dfbc86191ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdktp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5tfxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:19Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.739317 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ksbvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2978fac0aaadeb9ab4b6ecfc9249a28d011c2f6fe50e3528e008e08df338f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97crn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3111e48e9ab42467dbae06523e433e0f52ac
e4f6552d43674fa52010d57b409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97crn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ksbvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:19Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.755022 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:19Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.774563 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:19Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.792167 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafd807-8875-4b4f-aba9-4f807ca336e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1277c6a868bcd62e2cfc7dda77ccba4f206f4216eec40ceb53ed8c09aebd5eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eab9be18db2c21136a797167f3282bba0639147
e04085d9c930fe113cd5bc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7vwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:19Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.805900 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f493763d-a027-430f-b652-84331bf8aa43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14e99604cffc35c659058d1363536aa5d067bbbb1c29b2b366c6aa8c1ed6bb72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4a006c2f6cf15ff04cabddf2c3b0707b29cc3552afa5abd3f9647ef06567695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4a006c2f6cf15ff04cabddf2c3b0707b29cc3552afa5abd3f9647ef06567695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:19Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.807496 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.807554 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.807572 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.807596 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.807613 4715 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:19Z","lastTransitionTime":"2025-10-09T07:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.827672 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8095fd96-32bb-459e-b524-6cf679b95b21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc320b6b98a82e720d488ce9958599e2f732919ac43ccb3834e5dd90042077ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7424a86e3801e7aea51cf175c8cbb65ae15a4df07426022cf9e4ba6b82c13924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://149ab2506eb7fd28879c9734c5189259cde574afb0a4f7708b0b84c5a514c996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a96e0c2dc207504189aac5f2822e4fc8fdc58a19388a3d081553ecec07f03bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0bc91552a8f6c9f83684aa851ef1b07fa4562c736427c3264762f4486b65c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3
a7da23fb09059aab53f5934e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:19Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.845031 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f4f451-5ba1-439c-9987-d2d8d37129e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab9492d73e1ced7e8b9dcfbf64ede97fb7c53def5e290efe2320d37d5f8a3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94dc3b7cc39c67b95708f5a4b7d2bcf103c565c5c868684fa838816e882c720\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86bd2df729ce7029714c942828cff7e13c738eb5d918fc7dfdefe16e5420fc98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f2c6cc41c3fcb7aa04475aef503dfa481735d7d591632251226133ffa9cfec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T07:46:26Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1009 07:46:26.195650 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 07:46:26.195886 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 07:46:26.197650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1771579011/tls.crt::/tmp/serving-cert-1771579011/tls.key\\\\\\\"\\\\nI1009 07:46:26.707018 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 07:46:26.710937 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 07:46:26.710964 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 07:46:26.710986 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 07:46:26.710992 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 07:46:26.721297 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 07:46:26.721350 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721363 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721375 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 07:46:26.721386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1009 07:46:26.721377 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 07:46:26.721396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 07:46:26.721462 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 07:46:26.723740 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14232d9805b9847774597840c84b29709285393122781fe95af059e50c285ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb1
6a355d3cfece6bae7f7121c5c5ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:19Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.858993 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1166d9eb763c499c126069c02d693a608549e5cbb8d4862551b7555100324b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:19Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.885346 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6cb14a-7329-4a80-aff2-acd9142558d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5a47778ea88a7073dbc4a69df923fbb9b8c8e887a2ec5220cd3618633da7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5a47778ea88a7073dbc4a69df923fbb9b8c8e887a2ec5220cd3618633da7da\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T07:47:06Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1009 
07:47:06.977742 6429 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1009 07:47:06.977779 6429 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1009 07:47:06.977805 6429 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1009 07:47:06.977869 6429 factory.go:1336] Added *v1.Node event handler 7\\\\nI1009 07:47:06.977905 6429 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1009 07:47:06.978255 6429 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1009 07:47:06.978357 6429 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1009 07:47:06.978395 6429 ovnkube.go:599] Stopped ovnkube\\\\nI1009 07:47:06.978446 6429 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1009 07:47:06.978526 6429 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:47:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z9ztn_openshift-ovn-kubernetes(1d6cb14a-7329-4a80-aff2-acd9142558d3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0ca
cf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z9ztn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:19Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.898250 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a1a53d8-70da-4f6d-b92f-801a563952ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19465e3367078df139314e3b29a1b05d15c7ab22cb681c92e2a0394aaaaf887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b8a525d8b7ec3e08d688a4f5419e937a01e5dfa1de58caa9e3fad5ee5ed593f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8906a42b46d23c122035098bfd88203a6418fe2e0ef806e7babbc9670e2c89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d17b0d82be9febaeb884dea2cfb61c5f189c0fce2aff03c02bbf020d89828f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:19Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.910837 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.910884 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.910896 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.910918 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.910930 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:19Z","lastTransitionTime":"2025-10-09T07:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.911225 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"601e8bbc-736f-4fd6-a5db-acf0c0680140\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7b63b0a66948efeeb8afe2b17b5e2461b54aa7fcbd7eea11181fd3e077f878e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://9dabdb403ae3d1cad8d766a205299375905e6851f89a3022ec1468ba6ad7f463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c29c93e1f45a5b0592ac77d5f064cff563130da8019669a013ad65026ca46474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cb5215e2ab354a950cbd77ed11f48001aee890b171fd4f3ee9823f5fa4dcf37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\
",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cb5215e2ab354a950cbd77ed11f48001aee890b171fd4f3ee9823f5fa4dcf37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:19Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.928359 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://770b320ad49f63618e01bc73df4df10cb694b01d658727bb395ff59e6a609442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://958c52c695933700cd3b19f8c6539c5566827f57a22ed1fea9b6326e2261f673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:19Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:19 crc kubenswrapper[4715]: I1009 07:47:19.947144 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76f34f31-285e-4f90-954d-888a59ad6080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4996d81a0257313b571696eae1c0c7a590b2282472852505b7f60ab07ae4e7fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d905d
da0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d905dda0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8gf4x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:19Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.013146 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.013214 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.013224 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.013243 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.013254 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:20Z","lastTransitionTime":"2025-10-09T07:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.116201 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.116241 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.116249 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.116262 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.116271 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:20Z","lastTransitionTime":"2025-10-09T07:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.137475 4715 scope.go:117] "RemoveContainer" containerID="9e5a47778ea88a7073dbc4a69df923fbb9b8c8e887a2ec5220cd3618633da7da" Oct 09 07:47:20 crc kubenswrapper[4715]: E1009 07:47:20.137643 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-z9ztn_openshift-ovn-kubernetes(1d6cb14a-7329-4a80-aff2-acd9142558d3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.148259 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5tfxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a186a549-1c86-4777-97e8-04df48fad842\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1312ab6651462ae52831c89894987a598b1623159dddca34a4848dfbc86191ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35
512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdktp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5tfxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:20Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.164928 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ksbvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2978fac0aaadeb9ab4b6ecfc9249a28d011c2f6fe50e3528e008e08df338f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97crn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3111e48e9ab42467dbae06523e433e0f52ac
e4f6552d43674fa52010d57b409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97crn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ksbvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:20Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.177674 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f4f451-5ba1-439c-9987-d2d8d37129e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab9492d73e1ced7e8b9dcfbf64ede97fb7c53def5e290efe2320d37d5f8a3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94dc3b7cc39c67b95708f5a4b7d2bcf103c565c5c868684fa838816e882c720\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86bd2df729ce7029714c942828cff7e13c738eb5d918fc7dfdefe16e5420fc98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f2c6cc41c3fcb7aa04475aef503dfa481735d7d591632251226133ffa9cfec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T07:46:26Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1009 07:46:26.195650 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 07:46:26.195886 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 07:46:26.197650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1771579011/tls.crt::/tmp/serving-cert-1771579011/tls.key\\\\\\\"\\\\nI1009 07:46:26.707018 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 07:46:26.710937 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 07:46:26.710964 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 07:46:26.710986 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 07:46:26.710992 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 07:46:26.721297 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 07:46:26.721350 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721363 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721375 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 07:46:26.721386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1009 07:46:26.721377 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 07:46:26.721396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 07:46:26.721462 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 07:46:26.723740 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14232d9805b9847774597840c84b29709285393122781fe95af059e50c285ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb1
6a355d3cfece6bae7f7121c5c5ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:20Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.195629 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1166d9eb763c499c126069c02d693a608549e5cbb8d4862551b7555100324b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:20Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.212942 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:20Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.217956 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.217995 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.218010 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.218031 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.218044 4715 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:20Z","lastTransitionTime":"2025-10-09T07:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.229858 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:20Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.243543 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafd807-8875-4b4f-aba9-4f807ca336e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1277c6a868bcd62e2cfc7dda77ccba4f206f4216eec40ceb53ed8c09aebd5eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eab9be18db2c21136a797167f3282bba0639147
e04085d9c930fe113cd5bc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7vwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:20Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.260908 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f493763d-a027-430f-b652-84331bf8aa43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14e99604cffc35c659058d1363536aa5d067bbbb1c29b2b366c6aa8c1ed6bb72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4a006c2f6cf15ff04cabddf2c3b0707b29cc3552afa5abd3f9647ef06567695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4a006c2f6cf15ff04cabddf2c3b0707b29cc3552afa5abd3f9647ef06567695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:20Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.293660 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8095fd96-32bb-459e-b524-6cf679b95b21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc320b6b98a82e720d488ce9958599e2f732919ac43ccb3834e5dd90042077ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7424a86e3801e7aea51cf175c8cbb65ae15a4df07426022cf9e4ba6b82c13924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://149ab2506eb7fd28879c9734c5189259cde574afb0a4f7708b0b84c5a514c996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a96e0c2dc207504189aac5f2822e4fc8fdc58a19388a3d081553ecec07f03bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0bc91552a8f6c9f83684aa851ef1b07fa4562c736427c3264762f4486b65c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:20Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.312444 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://770b320ad49f63618e01bc73df4df10cb694b01d658727bb395ff59e6a609442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://958c52c695933700cd3b19f8c6539c5566827f57a22ed1fea9b6326e2261f673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:20Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.320463 4715 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.320535 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.320552 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.320572 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.320585 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:20Z","lastTransitionTime":"2025-10-09T07:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.330595 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76f34f31-285e-4f90-954d-888a59ad6080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4996d81a0257313b571696eae1c0c7a590b2282472852505b7f60ab07ae4e7fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d905dda0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d905dda0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8gf4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:20Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.355007 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6cb14a-7329-4a80-aff2-acd9142558d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5a47778ea88a7073dbc4a69df923fbb9b8c8e887a2ec5220cd3618633da7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5a47778ea88a7073dbc4a69df923fbb9b8c8e887a2ec5220cd3618633da7da\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T07:47:06Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1009 
07:47:06.977742 6429 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1009 07:47:06.977779 6429 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1009 07:47:06.977805 6429 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1009 07:47:06.977869 6429 factory.go:1336] Added *v1.Node event handler 7\\\\nI1009 07:47:06.977905 6429 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1009 07:47:06.978255 6429 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1009 07:47:06.978357 6429 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1009 07:47:06.978395 6429 ovnkube.go:599] Stopped ovnkube\\\\nI1009 07:47:06.978446 6429 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1009 07:47:06.978526 6429 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:47:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z9ztn_openshift-ovn-kubernetes(1d6cb14a-7329-4a80-aff2-acd9142558d3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0ca
cf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z9ztn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:20Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.373318 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a1a53d8-70da-4f6d-b92f-801a563952ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19465e3367078df139314e3b29a1b05d15c7ab22cb681c92e2a0394aaaaf887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b8a525d8b7ec3e08d688a4f5419e937a01e5dfa1de58caa9e3fad5ee5ed593f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8906a42b46d23c122035098bfd88203a6418fe2e0ef806e7babbc9670e2c89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d17b0d82be9febaeb884dea2cfb61c5f189c0fce2aff03c02bbf020d89828f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:20Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.388724 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"601e8bbc-736f-4fd6-a5db-acf0c0680140\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7b63b0a66948efeeb8afe2b17b5e2461b54aa7fcbd7eea11181fd3e077f878e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dabdb403ae3d1cad8d766a205299375905e6851f89a3022ec1468ba6ad7f463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c29c93e1f45a5b0592ac77d5f064cff563130da8019669a013ad65026ca46474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cb5215e2ab354a950cbd77ed11f48001aee890b171fd4f3ee9823f5fa4dcf37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://4cb5215e2ab354a950cbd77ed11f48001aee890b171fd4f3ee9823f5fa4dcf37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:20Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.404347 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6vp75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T07:47:19Z\\\",\\\"message\\\":\\\"2025-10-09T07:46:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_987c0460-fa68-459c-8f62-182794f36a65\\\\n2025-10-09T07:46:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_987c0460-fa68-459c-8f62-182794f36a65 to /host/opt/cni/bin/\\\\n2025-10-09T07:46:34Z [verbose] multus-daemon started\\\\n2025-10-09T07:46:34Z [verbose] Readiness Indicator file check\\\\n2025-10-09T07:47:19Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz46q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6vp75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:20Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.414073 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqt86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c54c0f2-0671-4f29-a4b8-7ea32758200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a835e316a2f8a0cc8bf44d5edd66b376fd20a6f7bf6a467a611e04e5fcc9993f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkfzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqt86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:20Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.423242 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.423281 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.423294 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.423312 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.423322 4715 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:20Z","lastTransitionTime":"2025-10-09T07:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.423944 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fm6s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8fb3b8-b254-4bc3-b105-990eac79c77b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbsl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbsl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fm6s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:20Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:20 crc 
kubenswrapper[4715]: I1009 07:47:20.434980 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:20Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.446783 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8875bf33dca9b2d1d7bf66aaeb2fa239b455ea46d1e6790a9f6e1c5c2da2ec6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T07:47:20Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.526372 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.526449 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.526469 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.526493 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.526510 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:20Z","lastTransitionTime":"2025-10-09T07:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.619574 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6vp75_6e61f2cb-cd6d-46d6-bbb6-dd99919b893d/kube-multus/0.log" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.619659 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6vp75" event={"ID":"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d","Type":"ContainerStarted","Data":"4e02a5b9a040e142c2a3f8ca488f0de0e42b0e01fff8a9987ea1ee5c354b1e31"} Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.629114 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.629185 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.629209 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.629238 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.629261 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:20Z","lastTransitionTime":"2025-10-09T07:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.640898 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"601e8bbc-736f-4fd6-a5db-acf0c0680140\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7b63b0a66948efeeb8afe2b17b5e2461b54aa7fcbd7eea11181fd3e077f878e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dabdb403ae3d1cad8d766a2052993
75905e6851f89a3022ec1468ba6ad7f463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c29c93e1f45a5b0592ac77d5f064cff563130da8019669a013ad65026ca46474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cb5215e2ab354a950cbd77ed11f48001aee890b171fd4f3ee9823f5fa4dcf37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cb5215e2ab354a950cbd77ed11f48001aee890b171fd4f3ee9823f5fa4dcf37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:20Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.660991 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://770b320ad49f63618e01bc73df4df10cb694b01d658727bb395ff59e6a609442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://958c52c695933700cd3b19f8c6539c5566827f57a22ed1fea9b6326e2261f673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:20Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.687404 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76f34f31-285e-4f90-954d-888a59ad6080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4996d81a0257313b571696eae1c0c7a590b2282472852505b7f60ab07ae4e7fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d905d
da0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d905dda0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8gf4x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:20Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.710739 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6cb14a-7329-4a80-aff2-acd9142558d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5a47778ea88a7073dbc4a69df923fbb9b8c8e887a2ec5220cd3618633da7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5a47778ea88a7073dbc4a69df923fbb9b8c8e887a2ec5220cd3618633da7da\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T07:47:06Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1009 
07:47:06.977742 6429 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1009 07:47:06.977779 6429 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1009 07:47:06.977805 6429 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1009 07:47:06.977869 6429 factory.go:1336] Added *v1.Node event handler 7\\\\nI1009 07:47:06.977905 6429 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1009 07:47:06.978255 6429 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1009 07:47:06.978357 6429 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1009 07:47:06.978395 6429 ovnkube.go:599] Stopped ovnkube\\\\nI1009 07:47:06.978446 6429 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1009 07:47:06.978526 6429 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:47:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z9ztn_openshift-ovn-kubernetes(1d6cb14a-7329-4a80-aff2-acd9142558d3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0ca
cf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z9ztn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:20Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.728370 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a1a53d8-70da-4f6d-b92f-801a563952ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19465e3367078df139314e3b29a1b05d15c7ab22cb681c92e2a0394aaaaf887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b8a525d8b7ec3e08d688a4f5419e937a01e5dfa1de58caa9e3fad5ee5ed593f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8906a42b46d23c122035098bfd88203a6418fe2e0ef806e7babbc9670e2c89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d17b0d82be9febaeb884dea2cfb61c5f189c0fce2aff03c02bbf020d89828f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:20Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.732040 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.732067 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.732075 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.732090 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.732100 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:20Z","lastTransitionTime":"2025-10-09T07:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.743384 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8875bf33dca9b2d1d7bf66aaeb2fa239b455ea46d1e6790a9f6e1c5c2da2ec6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:20Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.760865 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6vp75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e02a5b9a040e142c2a3f8ca488f0de0e42b0e01fff8a9987ea1ee5c354b1e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"
cri-o://d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T07:47:19Z\\\",\\\"message\\\":\\\"2025-10-09T07:46:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_987c0460-fa68-459c-8f62-182794f36a65\\\\n2025-10-09T07:46:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_987c0460-fa68-459c-8f62-182794f36a65 to /host/opt/cni/bin/\\\\n2025-10-09T07:46:34Z [verbose] multus-daemon started\\\\n2025-10-09T07:46:34Z [verbose] Readiness Indicator file check\\\\n2025-10-09T07:47:19Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\
\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz46q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6vp75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:20Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.771333 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqt86" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c54c0f2-0671-4f29-a4b8-7ea32758200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a835e316a2f8a0cc8bf44d5edd66b376fd20a6f7bf6a467a611e04e5fcc9993f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkfzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqt86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:20Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.781145 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fm6s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8fb3b8-b254-4bc3-b105-990eac79c77b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbsl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbsl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fm6s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:20Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:20 crc 
kubenswrapper[4715]: I1009 07:47:20.793779 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:20Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.809724 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ksbvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2978fac0aaadeb9ab4b6ecfc9249a28d011c2f6fe50e3528e008e08df338f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97crn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3111e48e9ab42467dbae06523e433e0f52ac
e4f6552d43674fa52010d57b409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97crn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ksbvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:20Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.820577 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5tfxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a186a549-1c86-4777-97e8-04df48fad842\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1312ab6651462ae52831c89894987a598b1623159dddca34a4848dfbc86191ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdktp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5tfxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:20Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.834578 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.834607 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.834616 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.834629 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.834657 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:20Z","lastTransitionTime":"2025-10-09T07:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.841631 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8095fd96-32bb-459e-b524-6cf679b95b21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc320b6b98a82e720d488ce9958599e2f732919ac43ccb3834e5dd90042077ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7424a86e3801e7aea51cf175c8cbb65ae15a4df07426022cf9e4ba6b82c13924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://149ab2506eb7fd28879c9734c5189259cde574afb0a4f7708b0b84c5a514c996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a96e0c2dc207504189aac5f2822e4fc8fdc58a19388a3d081553ecec07f03bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0bc91552a8f6c9f83684aa851ef1b07fa4562c736427c3264762f4486b65c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-09T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:20Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.855903 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f4f451-5ba1-439c-9987-d2d8d37129e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab9492d73e1ced7e8b9dcfbf64ede97fb7c53def5e290efe2320d37d5f8a3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94dc3b7cc39c67b95708f5a4b7d2bcf103c565c5c868684fa838816e882c720\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86bd2df729ce7029714c942828cff7e13c738eb5d918fc7dfdefe16e5420fc98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f2c6cc41c3fcb7aa04475aef503dfa481735d7d591632251226133ffa9cfec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T07:46:26Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1009 07:46:26.195650 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 07:46:26.195886 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 07:46:26.197650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1771579011/tls.crt::/tmp/serving-cert-1771579011/tls.key\\\\\\\"\\\\nI1009 07:46:26.707018 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 07:46:26.710937 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 07:46:26.710964 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 07:46:26.710986 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 07:46:26.710992 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 07:46:26.721297 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 07:46:26.721350 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721363 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721375 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 07:46:26.721386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1009 07:46:26.721377 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 07:46:26.721396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 07:46:26.721462 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 07:46:26.723740 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14232d9805b9847774597840c84b29709285393122781fe95af059e50c285ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb1
6a355d3cfece6bae7f7121c5c5ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:20Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.869709 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1166d9eb763c499c126069c02d693a608549e5cbb8d4862551b7555100324b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:20Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.884774 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:20Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.896235 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:20Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.906405 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafd807-8875-4b4f-aba9-4f807ca336e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1277c6a868bcd62e2cfc7dda77ccba4f206f4216eec40ceb53ed8c09aebd5eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eab9be18db2c21136a797167f3282bba0639147
e04085d9c930fe113cd5bc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7vwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:20Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.916618 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f493763d-a027-430f-b652-84331bf8aa43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14e99604cffc35c659058d1363536aa5d067bbbb1c29b2b366c6aa8c1ed6bb72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4a006c2f6cf15ff04cabddf2c3b0707b29cc3552afa5abd3f9647ef06567695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4a006c2f6cf15ff04cabddf2c3b0707b29cc3552afa5abd3f9647ef06567695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:20Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.950257 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.950306 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.950321 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.950340 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:20 crc kubenswrapper[4715]: I1009 07:47:20.950359 4715 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:20Z","lastTransitionTime":"2025-10-09T07:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.054544 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.054612 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.054630 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.054655 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.054673 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:21Z","lastTransitionTime":"2025-10-09T07:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.136582 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.136670 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.136682 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.136583 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:47:21 crc kubenswrapper[4715]: E1009 07:47:21.136789 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:47:21 crc kubenswrapper[4715]: E1009 07:47:21.136932 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:47:21 crc kubenswrapper[4715]: E1009 07:47:21.137088 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:47:21 crc kubenswrapper[4715]: E1009 07:47:21.137208 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.156917 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.157000 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.157015 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.157030 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.157040 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:21Z","lastTransitionTime":"2025-10-09T07:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.259569 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.259615 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.259625 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.259640 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.259651 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:21Z","lastTransitionTime":"2025-10-09T07:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.362552 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.362607 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.362620 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.362640 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.362656 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:21Z","lastTransitionTime":"2025-10-09T07:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.466275 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.466340 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.466356 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.466375 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.466391 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:21Z","lastTransitionTime":"2025-10-09T07:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.569078 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.569134 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.569151 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.569222 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.569250 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:21Z","lastTransitionTime":"2025-10-09T07:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.672700 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.672750 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.672759 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.672777 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.672788 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:21Z","lastTransitionTime":"2025-10-09T07:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.776474 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.776527 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.776539 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.776558 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.776569 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:21Z","lastTransitionTime":"2025-10-09T07:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.879578 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.879635 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.879645 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.879665 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.879678 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:21Z","lastTransitionTime":"2025-10-09T07:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.982487 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.982537 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.982584 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.982608 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:21 crc kubenswrapper[4715]: I1009 07:47:21.983139 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:21Z","lastTransitionTime":"2025-10-09T07:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.085712 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.085748 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.085758 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.085774 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.085784 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:22Z","lastTransitionTime":"2025-10-09T07:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.187990 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.188029 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.188039 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.188055 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.188066 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:22Z","lastTransitionTime":"2025-10-09T07:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.290723 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.290776 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.290794 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.290819 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.290837 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:22Z","lastTransitionTime":"2025-10-09T07:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.393397 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.393451 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.393460 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.393477 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.393486 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:22Z","lastTransitionTime":"2025-10-09T07:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.496332 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.496378 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.496387 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.496403 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.496437 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:22Z","lastTransitionTime":"2025-10-09T07:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.599480 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.599543 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.599552 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.599572 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.599585 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:22Z","lastTransitionTime":"2025-10-09T07:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.701648 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.701702 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.701715 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.701737 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.701754 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:22Z","lastTransitionTime":"2025-10-09T07:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.803979 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.804022 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.804034 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.804053 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.804070 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:22Z","lastTransitionTime":"2025-10-09T07:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.906349 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.906409 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.906484 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.906512 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:22 crc kubenswrapper[4715]: I1009 07:47:22.906541 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:22Z","lastTransitionTime":"2025-10-09T07:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.009536 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.009575 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.009595 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.009617 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.009627 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:23Z","lastTransitionTime":"2025-10-09T07:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.111881 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.111918 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.111927 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.111960 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.111971 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:23Z","lastTransitionTime":"2025-10-09T07:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.136215 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.136246 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.136262 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:47:23 crc kubenswrapper[4715]: E1009 07:47:23.136387 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.136439 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:47:23 crc kubenswrapper[4715]: E1009 07:47:23.136578 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:47:23 crc kubenswrapper[4715]: E1009 07:47:23.136687 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:47:23 crc kubenswrapper[4715]: E1009 07:47:23.136787 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.214132 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.214231 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.214248 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.214273 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.214292 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:23Z","lastTransitionTime":"2025-10-09T07:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.317103 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.317184 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.317202 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.317709 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.317741 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:23Z","lastTransitionTime":"2025-10-09T07:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.420304 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.420367 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.420384 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.420408 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.420457 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:23Z","lastTransitionTime":"2025-10-09T07:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.523212 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.523249 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.523257 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.523271 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.523281 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:23Z","lastTransitionTime":"2025-10-09T07:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.625691 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.625735 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.625757 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.625788 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.625817 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:23Z","lastTransitionTime":"2025-10-09T07:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.728056 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.728112 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.728125 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.728147 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.728159 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:23Z","lastTransitionTime":"2025-10-09T07:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.836509 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.836557 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.836570 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.836598 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.836612 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:23Z","lastTransitionTime":"2025-10-09T07:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.939839 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.939901 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.939917 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.939942 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:23 crc kubenswrapper[4715]: I1009 07:47:23.939959 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:23Z","lastTransitionTime":"2025-10-09T07:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.042706 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.042760 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.042777 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.042801 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.042819 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:24Z","lastTransitionTime":"2025-10-09T07:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.144952 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.145000 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.145011 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.145027 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.145038 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:24Z","lastTransitionTime":"2025-10-09T07:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.248067 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.248173 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.248194 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.248227 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.248253 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:24Z","lastTransitionTime":"2025-10-09T07:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.351973 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.352045 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.352065 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.352090 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.352113 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:24Z","lastTransitionTime":"2025-10-09T07:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.455311 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.455359 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.455370 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.455391 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.455408 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:24Z","lastTransitionTime":"2025-10-09T07:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.557742 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.557806 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.557822 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.557843 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.557857 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:24Z","lastTransitionTime":"2025-10-09T07:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.660939 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.661014 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.661038 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.661064 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.661082 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:24Z","lastTransitionTime":"2025-10-09T07:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.763853 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.763962 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.763976 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.764042 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.764060 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:24Z","lastTransitionTime":"2025-10-09T07:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.866890 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.866940 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.866952 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.866974 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.866989 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:24Z","lastTransitionTime":"2025-10-09T07:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.970310 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.970363 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.970377 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.970394 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:24 crc kubenswrapper[4715]: I1009 07:47:24.970407 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:24Z","lastTransitionTime":"2025-10-09T07:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.074053 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.074112 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.074126 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.074146 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.074162 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:25Z","lastTransitionTime":"2025-10-09T07:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.136934 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.137062 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.137126 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:47:25 crc kubenswrapper[4715]: E1009 07:47:25.137319 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.137350 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:47:25 crc kubenswrapper[4715]: E1009 07:47:25.137524 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:47:25 crc kubenswrapper[4715]: E1009 07:47:25.137599 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:47:25 crc kubenswrapper[4715]: E1009 07:47:25.137697 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.177304 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.177370 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.177390 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.177449 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.177469 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:25Z","lastTransitionTime":"2025-10-09T07:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.280053 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.280112 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.280125 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.280148 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.280165 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:25Z","lastTransitionTime":"2025-10-09T07:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.383083 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.383141 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.383154 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.383176 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.383190 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:25Z","lastTransitionTime":"2025-10-09T07:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.485935 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.485989 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.486005 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.486031 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.486048 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:25Z","lastTransitionTime":"2025-10-09T07:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.589384 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.589549 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.589581 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.589616 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.589639 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:25Z","lastTransitionTime":"2025-10-09T07:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.692185 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.692270 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.692302 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.692334 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.692358 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:25Z","lastTransitionTime":"2025-10-09T07:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.795747 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.795841 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.795881 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.795917 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.795941 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:25Z","lastTransitionTime":"2025-10-09T07:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.899465 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.899526 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.899537 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.899585 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:25 crc kubenswrapper[4715]: I1009 07:47:25.899599 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:25Z","lastTransitionTime":"2025-10-09T07:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.002789 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.002854 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.002870 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.002894 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.002912 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:26Z","lastTransitionTime":"2025-10-09T07:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.106692 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.106761 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.106785 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.106815 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.106838 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:26Z","lastTransitionTime":"2025-10-09T07:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.210543 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.210599 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.210617 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.210636 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.210648 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:26Z","lastTransitionTime":"2025-10-09T07:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.314798 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.314875 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.314898 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.314926 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.314951 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:26Z","lastTransitionTime":"2025-10-09T07:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.418707 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.418758 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.418774 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.418799 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.418816 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:26Z","lastTransitionTime":"2025-10-09T07:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.522923 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.522982 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.523004 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.523032 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.523053 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:26Z","lastTransitionTime":"2025-10-09T07:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.625890 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.625958 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.625975 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.626003 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.626020 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:26Z","lastTransitionTime":"2025-10-09T07:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.729443 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.729820 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.729909 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.730017 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.730113 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:26Z","lastTransitionTime":"2025-10-09T07:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.834097 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.834877 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.834917 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.834956 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.834977 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:26Z","lastTransitionTime":"2025-10-09T07:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.939102 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.939148 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.939157 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.939172 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:26 crc kubenswrapper[4715]: I1009 07:47:26.939182 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:26Z","lastTransitionTime":"2025-10-09T07:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.042385 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.042452 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.042463 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.042479 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.042490 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:27Z","lastTransitionTime":"2025-10-09T07:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.136717 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.136733 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.136731 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.136753 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:47:27 crc kubenswrapper[4715]: E1009 07:47:27.137398 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:47:27 crc kubenswrapper[4715]: E1009 07:47:27.136881 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:47:27 crc kubenswrapper[4715]: E1009 07:47:27.137157 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:47:27 crc kubenswrapper[4715]: E1009 07:47:27.137606 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.145071 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.145300 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.145496 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.145645 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.145781 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:27Z","lastTransitionTime":"2025-10-09T07:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.249317 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.249395 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.249456 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.249489 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.249512 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:27Z","lastTransitionTime":"2025-10-09T07:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.352659 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.352717 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.352734 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.352758 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.352776 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:27Z","lastTransitionTime":"2025-10-09T07:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.455237 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.455309 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.455320 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.455334 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.455343 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:27Z","lastTransitionTime":"2025-10-09T07:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.558150 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.558193 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.558202 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.558219 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.558230 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:27Z","lastTransitionTime":"2025-10-09T07:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.661175 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.661226 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.661240 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.661262 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.661279 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:27Z","lastTransitionTime":"2025-10-09T07:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.684070 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.684125 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.684138 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.684158 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.684177 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:27Z","lastTransitionTime":"2025-10-09T07:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:27 crc kubenswrapper[4715]: E1009 07:47:27.699315 4715 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"88c6bc2d-8227-4dff-bf57-494ec73b39f9\\\",\\\"systemUUID\\\":\\\"25873b5a-8b59-46be-9c14-6241a2c78490\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:27Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.704962 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.705012 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.705023 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.705041 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.705058 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:27Z","lastTransitionTime":"2025-10-09T07:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:27 crc kubenswrapper[4715]: E1009 07:47:27.727849 4715 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"88c6bc2d-8227-4dff-bf57-494ec73b39f9\\\",\\\"systemUUID\\\":\\\"25873b5a-8b59-46be-9c14-6241a2c78490\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:27Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.733499 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.733579 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.733596 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.733618 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.733634 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:27Z","lastTransitionTime":"2025-10-09T07:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:27 crc kubenswrapper[4715]: E1009 07:47:27.746976 4715 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"88c6bc2d-8227-4dff-bf57-494ec73b39f9\\\",\\\"systemUUID\\\":\\\"25873b5a-8b59-46be-9c14-6241a2c78490\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:27Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.751249 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.751300 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.751312 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.751330 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.751347 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:27Z","lastTransitionTime":"2025-10-09T07:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:27 crc kubenswrapper[4715]: E1009 07:47:27.770132 4715 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"88c6bc2d-8227-4dff-bf57-494ec73b39f9\\\",\\\"systemUUID\\\":\\\"25873b5a-8b59-46be-9c14-6241a2c78490\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:27Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.773861 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.773908 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.773917 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.773935 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.773946 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:27Z","lastTransitionTime":"2025-10-09T07:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:27 crc kubenswrapper[4715]: E1009 07:47:27.793342 4715 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"88c6bc2d-8227-4dff-bf57-494ec73b39f9\\\",\\\"systemUUID\\\":\\\"25873b5a-8b59-46be-9c14-6241a2c78490\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:27Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:27 crc kubenswrapper[4715]: E1009 07:47:27.793497 4715 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.795569 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.795601 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.795610 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.795630 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.795642 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:27Z","lastTransitionTime":"2025-10-09T07:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.898924 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.898967 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.898976 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.898994 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:27 crc kubenswrapper[4715]: I1009 07:47:27.899010 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:27Z","lastTransitionTime":"2025-10-09T07:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.002104 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.002172 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.002185 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.002213 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.002226 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:28Z","lastTransitionTime":"2025-10-09T07:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.105236 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.105327 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.105337 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.105359 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.105371 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:28Z","lastTransitionTime":"2025-10-09T07:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.208026 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.208080 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.208095 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.208117 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.208131 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:28Z","lastTransitionTime":"2025-10-09T07:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.310652 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.310709 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.310725 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.310747 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.310761 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:28Z","lastTransitionTime":"2025-10-09T07:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.413715 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.413772 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.413790 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.413813 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.413830 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:28Z","lastTransitionTime":"2025-10-09T07:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.516503 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.516567 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.516579 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.516605 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.516619 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:28Z","lastTransitionTime":"2025-10-09T07:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.619636 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.619691 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.619709 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.619732 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.619749 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:28Z","lastTransitionTime":"2025-10-09T07:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.722890 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.723063 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.723083 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.723101 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.723115 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:28Z","lastTransitionTime":"2025-10-09T07:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.826675 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.826747 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.826759 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.826780 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.826793 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:28Z","lastTransitionTime":"2025-10-09T07:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.929591 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.929661 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.929680 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.930151 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:28 crc kubenswrapper[4715]: I1009 07:47:28.930212 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:28Z","lastTransitionTime":"2025-10-09T07:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.034061 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.034107 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.034126 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.034150 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.034167 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:29Z","lastTransitionTime":"2025-10-09T07:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.135902 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.135939 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:47:29 crc kubenswrapper[4715]: E1009 07:47:29.136026 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.136060 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.136194 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:47:29 crc kubenswrapper[4715]: E1009 07:47:29.136286 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:47:29 crc kubenswrapper[4715]: E1009 07:47:29.136410 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:47:29 crc kubenswrapper[4715]: E1009 07:47:29.136494 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.136532 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.136604 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.136630 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.136658 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.136683 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:29Z","lastTransitionTime":"2025-10-09T07:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.239997 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.240078 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.240102 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.240131 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.240151 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:29Z","lastTransitionTime":"2025-10-09T07:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.342547 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.342624 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.342643 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.342671 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.342690 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:29Z","lastTransitionTime":"2025-10-09T07:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.445590 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.445655 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.445673 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.445700 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.445719 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:29Z","lastTransitionTime":"2025-10-09T07:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.548857 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.548937 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.548956 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.548983 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.549004 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:29Z","lastTransitionTime":"2025-10-09T07:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.651747 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.651799 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.651816 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.651838 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.651856 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:29Z","lastTransitionTime":"2025-10-09T07:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.754656 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.754746 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.754765 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.754801 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.754828 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:29Z","lastTransitionTime":"2025-10-09T07:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.857642 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.857712 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.857731 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.857758 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.857778 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:29Z","lastTransitionTime":"2025-10-09T07:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.960869 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.960927 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.960945 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.960971 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:29 crc kubenswrapper[4715]: I1009 07:47:29.960990 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:29Z","lastTransitionTime":"2025-10-09T07:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.063885 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.063986 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.064005 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.064038 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.064055 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:30Z","lastTransitionTime":"2025-10-09T07:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.159185 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"601e8bbc-736f-4fd6-a5db-acf0c0680140\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7b63b0a66948efeeb8afe2b17b5e2461b54aa7fcbd7eea11181fd3e077f878e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dabdb403ae3d1cad8d766a2052993
75905e6851f89a3022ec1468ba6ad7f463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c29c93e1f45a5b0592ac77d5f064cff563130da8019669a013ad65026ca46474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cb5215e2ab354a950cbd77ed11f48001aee890b171fd4f3ee9823f5fa4dcf37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cb5215e2ab354a950cbd77ed11f48001aee890b171fd4f3ee9823f5fa4dcf37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:30Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.179205 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.179282 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.179306 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.179336 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.179358 4715 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:30Z","lastTransitionTime":"2025-10-09T07:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.186101 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://770b320ad49f63618e01bc73df4df10cb694b01d658727bb395ff59e6a609442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://958c52c695933700cd3b19f8c6539c5566827f57a22ed1fea9b6326e2261f673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:30Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.211325 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76f34f31-285e-4f90-954d-888a59ad6080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4996d81a0257313b571696eae1c0c7a590b2282472852505b7f60ab07ae4e7fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d905d
da0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d905dda0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8gf4x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:30Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.234523 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6cb14a-7329-4a80-aff2-acd9142558d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e5a47778ea88a7073dbc4a69df923fbb9b8c8e887a2ec5220cd3618633da7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5a47778ea88a7073dbc4a69df923fbb9b8c8e887a2ec5220cd3618633da7da\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T07:47:06Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1009 
07:47:06.977742 6429 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1009 07:47:06.977779 6429 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1009 07:47:06.977805 6429 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1009 07:47:06.977869 6429 factory.go:1336] Added *v1.Node event handler 7\\\\nI1009 07:47:06.977905 6429 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1009 07:47:06.978255 6429 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1009 07:47:06.978357 6429 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1009 07:47:06.978395 6429 ovnkube.go:599] Stopped ovnkube\\\\nI1009 07:47:06.978446 6429 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1009 07:47:06.978526 6429 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:47:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z9ztn_openshift-ovn-kubernetes(1d6cb14a-7329-4a80-aff2-acd9142558d3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0ca
cf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z9ztn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:30Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.249136 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a1a53d8-70da-4f6d-b92f-801a563952ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19465e3367078df139314e3b29a1b05d15c7ab22cb681c92e2a0394aaaaf887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b8a525d8b7ec3e08d688a4f5419e937a01e5dfa1de58caa9e3fad5ee5ed593f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8906a42b46d23c122035098bfd88203a6418fe2e0ef806e7babbc9670e2c89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d17b0d82be9febaeb884dea2cfb61c5f189c0fce2aff03c02bbf020d89828f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:30Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.261355 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8875bf33dca9b2d1d7bf66aaeb2fa239b455ea46d1e6790a9f6e1c5c2da2ec6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T07:47:30Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.278332 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6vp75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e02a5b9a040e142c2a3f8ca488f0de0e42b0e01fff8a9987ea1ee5c354b1e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T07:47:19Z\\\",\\\"message\\\":\\\"2025-10-09T07:46:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_987c0460-fa68-459c-8f62-182794f36a65\\\\n2025-10-09T07:46:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_987c0460-fa68-459c-8f62-182794f36a65 to /host/opt/cni/bin/\\\\n2025-10-09T07:46:34Z [verbose] multus-daemon started\\\\n2025-10-09T07:46:34Z [verbose] Readiness Indicator file check\\\\n2025-10-09T07:47:19Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz46q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6vp75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:30Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.282019 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.282054 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.282066 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.282087 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.282100 4715 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:30Z","lastTransitionTime":"2025-10-09T07:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.291121 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqt86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c54c0f2-0671-4f29-a4b8-7ea32758200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a835e316a2f8a0cc8bf44d5edd66b376fd20a6f7bf6a467a611e04e5fcc9993f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkfzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqt86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:30Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.308985 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fm6s2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8fb3b8-b254-4bc3-b105-990eac79c77b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbsl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbsl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fm6s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:30Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:30 crc 
kubenswrapper[4715]: I1009 07:47:30.327337 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:30Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.341138 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ksbvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2978fac0aaadeb9ab4b6ecfc9249a28d011c2f6fe50e3528e008e08df338f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97crn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3111e48e9ab42467dbae06523e433e0f52ac
e4f6552d43674fa52010d57b409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97crn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ksbvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:30Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.354280 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5tfxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a186a549-1c86-4777-97e8-04df48fad842\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1312ab6651462ae52831c89894987a598b1623159dddca34a4848dfbc86191ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdktp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5tfxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:30Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.377781 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8095fd96-32bb-459e-b524-6cf679b95b21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc320b6b98a82e720d488ce9958599e2f732919ac43ccb3834e5dd90042077ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7424a86e3801e7aea51cf175c8cbb65ae15a4df07426022cf9e4ba6b82c13924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://149ab2506eb7fd28879c9734c5189259cde574afb0a4f7708b0b84c5a514c996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a96e0c2dc207504189aac5f2822e4fc8fdc58a19388a3d081553ecec07f03bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0bc91552a8f6c9f83684aa851ef1b07fa4562c736427c3264762f4486b65c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:30Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.384985 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.385010 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.385022 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.385040 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 
07:47:30.385051 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:30Z","lastTransitionTime":"2025-10-09T07:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.394388 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f4f451-5ba1-439c-9987-d2d8d37129e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab9492d73e1ced7e8b9dcfbf64ede97fb7c53def5e290efe2320d37d5f8a3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94dc3b7cc39c67b95708f5a4b7d2bcf103c565c5c868684fa838816e882c720\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86bd2df729ce7029714c942828cff7e13c738eb5d918fc7dfdefe16e5420fc98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs
\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f2c6cc41c3fcb7aa04475aef503dfa481735d7d591632251226133ffa9cfec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T07:46:26Z\\\",\\\"message\\\":\\\"le observer\\\\nW1009 07:46:26.195650 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 07:46:26.195886 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 07:46:26.197650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1771579011/tls.crt::/tmp/serving-cert-1771579011/tls.key\\\\\\\"\\\\nI1009 07:46:26.707018 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 07:46:26.710937 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 07:46:26.710964 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 07:46:26.710986 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 07:46:26.710992 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 07:46:26.721297 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 07:46:26.721350 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 
07:46:26.721363 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721375 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 07:46:26.721386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1009 07:46:26.721377 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 07:46:26.721396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 07:46:26.721462 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 07:46:26.723740 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14232d9805b9847774597840c84b29709285393122781fe95af059e50c285ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:30Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.406169 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1166d9eb763c499c126069c02d693a608549e5cbb8d4862551b7555100324b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:30Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.417336 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:30Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.429983 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:30Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.442537 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafd807-8875-4b4f-aba9-4f807ca336e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1277c6a868bcd62e2cfc7dda77ccba4f206f4216eec40ceb53ed8c09aebd5eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eab9be18db2c21136a797167f3282bba0639147
e04085d9c930fe113cd5bc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7vwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:30Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.454016 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f493763d-a027-430f-b652-84331bf8aa43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14e99604cffc35c659058d1363536aa5d067bbbb1c29b2b366c6aa8c1ed6bb72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4a006c2f6cf15ff04cabddf2c3b0707b29cc3552afa5abd3f9647ef06567695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4a006c2f6cf15ff04cabddf2c3b0707b29cc3552afa5abd3f9647ef06567695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:30Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.487599 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.487663 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.487681 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.487706 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.487726 4715 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:30Z","lastTransitionTime":"2025-10-09T07:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.590814 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.590933 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.590957 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.590993 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.591014 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:30Z","lastTransitionTime":"2025-10-09T07:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.693961 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.694083 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.694109 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.694145 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.694175 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:30Z","lastTransitionTime":"2025-10-09T07:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.797078 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.797190 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.797209 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.797233 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.797250 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:30Z","lastTransitionTime":"2025-10-09T07:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.899551 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.899615 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.899633 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.899665 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:30 crc kubenswrapper[4715]: I1009 07:47:30.899691 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:30Z","lastTransitionTime":"2025-10-09T07:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.002644 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.002703 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.002723 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.002749 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.002768 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:31Z","lastTransitionTime":"2025-10-09T07:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.105531 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.105621 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.105694 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.105727 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.105750 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:31Z","lastTransitionTime":"2025-10-09T07:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.136047 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.136174 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2"
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.136278 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 09 07:47:31 crc kubenswrapper[4715]: E1009 07:47:31.136278 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.136400 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 09 07:47:31 crc kubenswrapper[4715]: E1009 07:47:31.136670 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 09 07:47:31 crc kubenswrapper[4715]: E1009 07:47:31.136777 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 09 07:47:31 crc kubenswrapper[4715]: E1009 07:47:31.136935 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b"
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.253845 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.253897 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.253909 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.253927 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.253942 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:31Z","lastTransitionTime":"2025-10-09T07:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.356110 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.356144 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.356154 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.356170 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.356181 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:31Z","lastTransitionTime":"2025-10-09T07:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.459336 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.459403 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.459460 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.459494 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.459513 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:31Z","lastTransitionTime":"2025-10-09T07:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.563274 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.563386 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.563444 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.563473 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.563491 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:31Z","lastTransitionTime":"2025-10-09T07:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.666055 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.666097 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.666106 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.666119 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.666129 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:31Z","lastTransitionTime":"2025-10-09T07:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.769717 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.769771 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.769782 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.769802 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.769815 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:31Z","lastTransitionTime":"2025-10-09T07:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.873217 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.873271 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.873322 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.873347 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.873365 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:31Z","lastTransitionTime":"2025-10-09T07:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.976382 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.976482 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.976500 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.976525 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 07:47:31 crc kubenswrapper[4715]: I1009 07:47:31.976542 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:31Z","lastTransitionTime":"2025-10-09T07:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.079848 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.079908 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.079924 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.079949 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.079968 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:32Z","lastTransitionTime":"2025-10-09T07:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.182874 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.182954 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.182979 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.183009 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.183031 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:32Z","lastTransitionTime":"2025-10-09T07:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.285927 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.286002 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.286019 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.286045 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.286062 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:32Z","lastTransitionTime":"2025-10-09T07:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.389582 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.389657 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.389676 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.389700 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.389718 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:32Z","lastTransitionTime":"2025-10-09T07:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.492306 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.492341 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.492349 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.492363 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.492373 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:32Z","lastTransitionTime":"2025-10-09T07:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.595534 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.595583 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.595593 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.595614 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.595624 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:32Z","lastTransitionTime":"2025-10-09T07:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.698844 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.698899 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.698916 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.698939 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.698957 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:32Z","lastTransitionTime":"2025-10-09T07:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.801642 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.801708 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.801727 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.801756 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.801777 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:32Z","lastTransitionTime":"2025-10-09T07:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.905078 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.905141 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.905158 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.905183 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 07:47:32 crc kubenswrapper[4715]: I1009 07:47:32.905201 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:32Z","lastTransitionTime":"2025-10-09T07:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.009362 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.009482 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.009502 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.009528 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.009548 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:33Z","lastTransitionTime":"2025-10-09T07:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.113118 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.113179 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.113197 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.113223 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.113240 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:33Z","lastTransitionTime":"2025-10-09T07:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.136575 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 09 07:47:33 crc kubenswrapper[4715]: E1009 07:47:33.136777 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.137473 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.137512 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 09 07:47:33 crc kubenswrapper[4715]: E1009 07:47:33.137573 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.137721 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 09 07:47:33 crc kubenswrapper[4715]: E1009 07:47:33.137782 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 09 07:47:33 crc kubenswrapper[4715]: E1009 07:47:33.137939 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.138212 4715 scope.go:117] "RemoveContainer" containerID="9e5a47778ea88a7073dbc4a69df923fbb9b8c8e887a2ec5220cd3618633da7da"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.216516 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.216548 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.216559 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.216573 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.216583 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:33Z","lastTransitionTime":"2025-10-09T07:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.320860 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.321173 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.321182 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.321198 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.321211 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:33Z","lastTransitionTime":"2025-10-09T07:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.424096 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.424126 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.424135 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.424149 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.424157 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:33Z","lastTransitionTime":"2025-10-09T07:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.555006 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.555064 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.555078 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.555097 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.555113 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:33Z","lastTransitionTime":"2025-10-09T07:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.658326 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.658386 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.658398 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.658436 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.658450 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:33Z","lastTransitionTime":"2025-10-09T07:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.667836 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z9ztn_1d6cb14a-7329-4a80-aff2-acd9142558d3/ovnkube-controller/2.log"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.671715 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" event={"ID":"1d6cb14a-7329-4a80-aff2-acd9142558d3","Type":"ContainerStarted","Data":"e9b9653decfa58510f011f69cf54290119540ca7cad7a56eb6da5440c4ff5f9c"}
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.761682 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.761734 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.761749 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.761768 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.761781 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:33Z","lastTransitionTime":"2025-10-09T07:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.866388 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.866458 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.866488 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.866513 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.866530 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:33Z","lastTransitionTime":"2025-10-09T07:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.969776 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.969835 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.969845 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.969866 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 09 07:47:33 crc kubenswrapper[4715]: I1009 07:47:33.969880 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:33Z","lastTransitionTime":"2025-10-09T07:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.072960 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.073013 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.073025 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.073043 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.073059 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:34Z","lastTransitionTime":"2025-10-09T07:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.175851 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.176476 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.176599 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.176840 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.177054 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:34Z","lastTransitionTime":"2025-10-09T07:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.279193 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.279236 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.279250 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.279269 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.279282 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:34Z","lastTransitionTime":"2025-10-09T07:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.381996 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.382040 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.382051 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.382067 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.382082 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:34Z","lastTransitionTime":"2025-10-09T07:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.485202 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.485249 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.485262 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.485280 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.485293 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:34Z","lastTransitionTime":"2025-10-09T07:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.587495 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.587539 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.587549 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.587570 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.587580 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:34Z","lastTransitionTime":"2025-10-09T07:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.677917 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z9ztn_1d6cb14a-7329-4a80-aff2-acd9142558d3/ovnkube-controller/3.log" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.679040 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z9ztn_1d6cb14a-7329-4a80-aff2-acd9142558d3/ovnkube-controller/2.log" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.682675 4715 generic.go:334] "Generic (PLEG): container finished" podID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerID="e9b9653decfa58510f011f69cf54290119540ca7cad7a56eb6da5440c4ff5f9c" exitCode=1 Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.682720 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" event={"ID":"1d6cb14a-7329-4a80-aff2-acd9142558d3","Type":"ContainerDied","Data":"e9b9653decfa58510f011f69cf54290119540ca7cad7a56eb6da5440c4ff5f9c"} Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.682777 4715 scope.go:117] "RemoveContainer" containerID="9e5a47778ea88a7073dbc4a69df923fbb9b8c8e887a2ec5220cd3618633da7da" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.683845 4715 scope.go:117] "RemoveContainer" containerID="e9b9653decfa58510f011f69cf54290119540ca7cad7a56eb6da5440c4ff5f9c" Oct 09 07:47:34 crc kubenswrapper[4715]: E1009 07:47:34.684198 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-z9ztn_openshift-ovn-kubernetes(1d6cb14a-7329-4a80-aff2-acd9142558d3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.689261 4715 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.689292 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.689302 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.689318 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.689330 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:34Z","lastTransitionTime":"2025-10-09T07:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.701634 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ksbvn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2978fac0aaadeb9ab4b6ecfc9249a28d011c2f6fe50e3528e008e08df338f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97crn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3111e48e9ab42467dbae06523e433e0f52ace4f6552d43674fa52010d57b409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97crn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ksbvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:34Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.716277 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5tfxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a186a549-1c86-4777-97e8-04df48fad842\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1312ab6651462ae52831c89894987a598b1623159dddca34a4848dfbc86191ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdktp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5tfxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:34Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.738849 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8095fd96-32bb-459e-b524-6cf679b95b21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc320b6b98a82e720d488ce9958599e2f732919ac43ccb3834e5dd90042077ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7424a86e3801e7aea51cf175c8cbb65ae15a4df07426022cf9e4ba6b82c13924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://149ab2506eb7fd28879c9734c5189259cde574afb0a4f7708b0b84c5a514c996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a96e0c2dc207504189aac5f2822e4fc8fdc58a19388a3d081553ecec07f03bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0bc91552a8f6c9f83684aa851ef1b07fa4562c736427c3264762f4486b65c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:34Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.755697 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f4f451-5ba1-439c-9987-d2d8d37129e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab9492d73e1ced7e8b9dcfbf64ede97fb7c53def5e290efe2320d37d5f8a3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94dc3b7cc39c67b95708f5a4b7d2bcf103c565c5c868684fa838816e882c720\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86bd2df729ce7029714c942828cff7e13c738eb5d918fc7dfdefe16e5420fc98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f2c6cc41c3fcb7aa04475aef503dfa481735d7d591632251226133ffa9cfec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T07:46:26Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1009 07:46:26.195650 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 07:46:26.195886 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 07:46:26.197650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1771579011/tls.crt::/tmp/serving-cert-1771579011/tls.key\\\\\\\"\\\\nI1009 07:46:26.707018 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 07:46:26.710937 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 07:46:26.710964 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 07:46:26.710986 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 07:46:26.710992 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 07:46:26.721297 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 07:46:26.721350 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721363 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721375 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 07:46:26.721386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1009 07:46:26.721377 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 07:46:26.721396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 07:46:26.721462 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 07:46:26.723740 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14232d9805b9847774597840c84b29709285393122781fe95af059e50c285ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb1
6a355d3cfece6bae7f7121c5c5ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:34Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.777383 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1166d9eb763c499c126069c02d693a608549e5cbb8d4862551b7555100324b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:34Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.791759 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.791997 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.792150 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.792275 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.792444 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:34Z","lastTransitionTime":"2025-10-09T07:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.797455 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:34Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.818368 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:34Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.838486 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafd807-8875-4b4f-aba9-4f807ca336e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1277c6a868bcd62e2cfc7dda77ccba4f206f4216eec40ceb53ed8c09aebd5eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eab9be18db2c21136a797167f3282bba0639147
e04085d9c930fe113cd5bc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7vwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:34Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.856212 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f493763d-a027-430f-b652-84331bf8aa43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14e99604cffc35c659058d1363536aa5d067bbbb1c29b2b366c6aa8c1ed6bb72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4a006c2f6cf15ff04cabddf2c3b0707b29cc3552afa5abd3f9647ef06567695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4a006c2f6cf15ff04cabddf2c3b0707b29cc3552afa5abd3f9647ef06567695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:34Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.874929 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"601e8bbc-736f-4fd6-a5db-acf0c0680140\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7b63b0a66948efeeb8afe2b17b5e2461b54aa7fcbd7eea11181fd3e077f878e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dabdb403ae3d1cad8d766a205299375905e6851f89a3022ec1468ba6ad7f463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c29c93e1f45a5b0592ac77d5f064cff563130da8019669a013ad65026ca46474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cb5215e2ab354a950cbd77ed11f48001aee890b171fd4f3ee9823f5fa4dcf37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://4cb5215e2ab354a950cbd77ed11f48001aee890b171fd4f3ee9823f5fa4dcf37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:34Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.888569 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:47:34 crc kubenswrapper[4715]: E1009 07:47:34.888708 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:38.888682892 +0000 UTC m=+149.581486940 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.889265 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:47:34 crc kubenswrapper[4715]: E1009 07:47:34.889492 4715 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 07:47:34 crc kubenswrapper[4715]: E1009 07:47:34.890063 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 07:48:38.890041941 +0000 UTC m=+149.582845989 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 09 07:47:34 crc kubenswrapper[4715]: E1009 07:47:34.890237 4715 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.889572 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:47:34 crc kubenswrapper[4715]: E1009 07:47:34.890515 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-09 07:48:38.89036746 +0000 UTC m=+149.583171498 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.895659 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.895695 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.895706 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.895724 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.895736 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:34Z","lastTransitionTime":"2025-10-09T07:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.899935 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://770b320ad49f63618e01bc73df4df10cb694b01d658727bb395ff59e6a609442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://958c52c695933700cd3b19f8c6539c5566827f57a22ed1fea9b6326e2261f673\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:34Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.921838 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76f34f31-285e-4f90-954d-888a59ad6080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4996d81a0257313b571696eae1c0c7a590b2282472852505b7f60ab07ae4e7fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d905d
da0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d905dda0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8gf4x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:34Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.953503 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6cb14a-7329-4a80-aff2-acd9142558d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b9653decfa58510f011f69cf54290119540ca7cad7a56eb6da5440c4ff5f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e5a47778ea88a7073dbc4a69df923fbb9b8c8e887a2ec5220cd3618633da7da\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T07:47:06Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1009 
07:47:06.977742 6429 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1009 07:47:06.977779 6429 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1009 07:47:06.977805 6429 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1009 07:47:06.977869 6429 factory.go:1336] Added *v1.Node event handler 7\\\\nI1009 07:47:06.977905 6429 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1009 07:47:06.978255 6429 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1009 07:47:06.978357 6429 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1009 07:47:06.978395 6429 ovnkube.go:599] Stopped ovnkube\\\\nI1009 07:47:06.978446 6429 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1009 07:47:06.978526 6429 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:47:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b9653decfa58510f011f69cf54290119540ca7cad7a56eb6da5440c4ff5f9c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T07:47:34Z\\\",\\\"message\\\":\\\"ent handler 4 for removal\\\\nI1009 07:47:34.558951 6790 handler.go:208] Removed *v1.Node event handler 7\\\\nI1009 07:47:34.558965 6790 handler.go:208] Removed *v1.Node event handler 2\\\\nI1009 07:47:34.558973 6790 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1009 07:47:34.559045 6790 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1009 07:47:34.559058 6790 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1009 
07:47:34.559066 6790 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1009 07:47:34.559098 6790 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1009 07:47:34.559117 6790 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1009 07:47:34.559264 6790 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1009 07:47:34.559356 6790 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1009 07:47:34.559391 6790 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1009 07:47:34.559407 6790 factory.go:656] Stopping watch factory\\\\nI1009 07:47:34.559432 6790 ovnkube.go:599] Stopped ovnkube\\\\nI1009 07:47:34.559460 6790 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1009 07:47:34.559473 6790 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1009 07:47:34.559555 6790 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"m
ountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z9ztn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:34Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.971655 4715 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a1a53d8-70da-4f6d-b92f-801a563952ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19465e3367078df139314e3b29a1b05d15c7ab22cb681c92e2a0394aaaaf887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b8a525d8b7ec3e08d688a4f5419e937a01e5dfa1de58caa9e3fad5ee5ed593f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8906a42b46d23c122035098bfd88203a6418fe2e0ef806e7babbc9670e2c89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d17b0d82be9febaeb884dea2cfb61c5f189c0fce2aff03c02bbf020d89828f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:34Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.989525 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8875bf33dca9b2d1d7bf66aaeb2fa239b455ea46d1e6790a9f6e1c5c2da2ec6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T07:47:34Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.991994 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.992068 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:47:34 crc kubenswrapper[4715]: E1009 07:47:34.992264 4715 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 07:47:34 crc kubenswrapper[4715]: E1009 07:47:34.992307 4715 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 07:47:34 crc kubenswrapper[4715]: E1009 07:47:34.992326 4715 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 07:47:34 crc kubenswrapper[4715]: E1009 07:47:34.992402 4715 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-09 07:48:38.992380091 +0000 UTC m=+149.685184129 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 07:47:34 crc kubenswrapper[4715]: E1009 07:47:34.992641 4715 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 09 07:47:34 crc kubenswrapper[4715]: E1009 07:47:34.992663 4715 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 09 07:47:34 crc kubenswrapper[4715]: E1009 07:47:34.992697 4715 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 07:47:34 crc kubenswrapper[4715]: E1009 07:47:34.992738 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-09 07:48:38.992725471 +0000 UTC m=+149.685529479 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.998533 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.998561 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.998570 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.998585 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:34 crc kubenswrapper[4715]: I1009 07:47:34.998595 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:34Z","lastTransitionTime":"2025-10-09T07:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.011106 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6vp75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e02a5b9a040e142c2a3f8ca488f0de0e42b0e01fff8a9987ea1ee5c354b1e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T07:47:19Z\\\",\\\"message\\\":\\\"2025-10-09T07:46:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_987c0460-fa68-459c-8f62-182794f36a65\\\\n2025-10-09T07:46:33+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_987c0460-fa68-459c-8f62-182794f36a65 to /host/opt/cni/bin/\\\\n2025-10-09T07:46:34Z [verbose] multus-daemon started\\\\n2025-10-09T07:46:34Z [verbose] Readiness Indicator file check\\\\n2025-10-09T07:47:19Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz46q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6vp75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.027406 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqt86" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c54c0f2-0671-4f29-a4b8-7ea32758200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a835e316a2f8a0cc8bf44d5edd66b376fd20a6f7bf6a467a611e04e5fcc9993f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkfzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqt86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.043559 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fm6s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8fb3b8-b254-4bc3-b105-990eac79c77b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbsl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbsl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fm6s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:35 crc 
kubenswrapper[4715]: I1009 07:47:35.060808 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.101162 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.101227 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.101247 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.101273 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.101291 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:35Z","lastTransitionTime":"2025-10-09T07:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.135942 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.135999 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.135957 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:47:35 crc kubenswrapper[4715]: E1009 07:47:35.136124 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.135943 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:47:35 crc kubenswrapper[4715]: E1009 07:47:35.136310 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:47:35 crc kubenswrapper[4715]: E1009 07:47:35.136364 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:47:35 crc kubenswrapper[4715]: E1009 07:47:35.136443 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.204410 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.204515 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.204534 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.204560 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.204579 4715 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:35Z","lastTransitionTime":"2025-10-09T07:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.307820 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.307861 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.307873 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.307903 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.307917 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:35Z","lastTransitionTime":"2025-10-09T07:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.410819 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.410876 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.410895 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.410919 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.410940 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:35Z","lastTransitionTime":"2025-10-09T07:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.513306 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.513356 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.513373 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.513394 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.513409 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:35Z","lastTransitionTime":"2025-10-09T07:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.616837 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.616907 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.616924 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.616949 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.616971 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:35Z","lastTransitionTime":"2025-10-09T07:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.689365 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z9ztn_1d6cb14a-7329-4a80-aff2-acd9142558d3/ovnkube-controller/3.log" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.695789 4715 scope.go:117] "RemoveContainer" containerID="e9b9653decfa58510f011f69cf54290119540ca7cad7a56eb6da5440c4ff5f9c" Oct 09 07:47:35 crc kubenswrapper[4715]: E1009 07:47:35.696188 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-z9ztn_openshift-ovn-kubernetes(1d6cb14a-7329-4a80-aff2-acd9142558d3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.714922 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5tfxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a186a549-1c86-4777-97e8-04df48fad842\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1312ab6651462ae52831c89894987a598b1623159dddca34a4848dfbc86191ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdktp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5tfxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.721162 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.721267 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.721293 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.721320 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.721338 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:35Z","lastTransitionTime":"2025-10-09T07:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.739378 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ksbvn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2978fac0aaadeb9ab4b6ecfc9249a28d011c2f6fe50e3528e008e08df338f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97crn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3111e48e9ab42467dbae06523e433e0f52ace4f6552d43674fa52010d57b409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97crn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ksbvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.757785 4715 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.775343 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.789654 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafd807-8875-4b4f-aba9-4f807ca336e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1277c6a868bcd62e2cfc7dda77ccba4f206f4216eec40ceb53ed8c09aebd5eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eab9be18db2c21136a797167f3282bba0639147
e04085d9c930fe113cd5bc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7vwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.804019 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f493763d-a027-430f-b652-84331bf8aa43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14e99604cffc35c659058d1363536aa5d067bbbb1c29b2b366c6aa8c1ed6bb72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4a006c2f6cf15ff04cabddf2c3b0707b29cc3552afa5abd3f9647ef06567695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4a006c2f6cf15ff04cabddf2c3b0707b29cc3552afa5abd3f9647ef06567695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.823840 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.824342 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.824353 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.824376 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.824389 4715 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:35Z","lastTransitionTime":"2025-10-09T07:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.834572 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8095fd96-32bb-459e-b524-6cf679b95b21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc320b6b98a82e720d488ce9958599e2f732919ac43ccb3834e5dd90042077ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7424a86e3801e7aea51cf175c8cbb65ae15a4df07426022cf9e4ba6b82c13924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://149ab2506eb7fd28879c9734c5189259cde574afb0a4f7708b0b84c5a514c996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a96e0c2dc207504189aac5f2822e4fc8fdc58a19388a3d081553ecec07f03bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0bc91552a8f6c9f83684aa851ef1b07fa4562c736427c3264762f4486b65c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3
a7da23fb09059aab53f5934e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.850071 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f4f451-5ba1-439c-9987-d2d8d37129e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab9492d73e1ced7e8b9dcfbf64ede97fb7c53def5e290efe2320d37d5f8a3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94dc3b7cc39c67b95708f5a4b7d2bcf103c565c5c868684fa838816e882c720\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86bd2df729ce7029714c942828cff7e13c738eb5d918fc7dfdefe16e5420fc98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f2c6cc41c3fcb7aa04475aef503dfa481735d7d591632251226133ffa9cfec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T07:46:26Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1009 07:46:26.195650 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 07:46:26.195886 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 07:46:26.197650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1771579011/tls.crt::/tmp/serving-cert-1771579011/tls.key\\\\\\\"\\\\nI1009 07:46:26.707018 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 07:46:26.710937 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 07:46:26.710964 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 07:46:26.710986 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 07:46:26.710992 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 07:46:26.721297 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 07:46:26.721350 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721363 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721375 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 07:46:26.721386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1009 07:46:26.721377 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 07:46:26.721396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 07:46:26.721462 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 07:46:26.723740 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14232d9805b9847774597840c84b29709285393122781fe95af059e50c285ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb1
6a355d3cfece6bae7f7121c5c5ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.866167 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1166d9eb763c499c126069c02d693a608549e5cbb8d4862551b7555100324b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.885855 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6cb14a-7329-4a80-aff2-acd9142558d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b9653decfa58510f011f69cf54290119540ca7cad7a56eb6da5440c4ff5f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b9653decfa58510f011f69cf54290119540ca7cad7a56eb6da5440c4ff5f9c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T07:47:34Z\\\",\\\"message\\\":\\\"ent handler 4 for removal\\\\nI1009 07:47:34.558951 6790 handler.go:208] Removed *v1.Node event handler 7\\\\nI1009 07:47:34.558965 6790 handler.go:208] Removed *v1.Node event handler 2\\\\nI1009 07:47:34.558973 6790 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1009 07:47:34.559045 6790 handler.go:208] Removed *v1.NetworkPolicy event 
handler 4\\\\nI1009 07:47:34.559058 6790 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1009 07:47:34.559066 6790 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1009 07:47:34.559098 6790 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1009 07:47:34.559117 6790 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1009 07:47:34.559264 6790 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1009 07:47:34.559356 6790 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1009 07:47:34.559391 6790 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1009 07:47:34.559407 6790 factory.go:656] Stopping watch factory\\\\nI1009 07:47:34.559432 6790 ovnkube.go:599] Stopped ovnkube\\\\nI1009 07:47:34.559460 6790 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1009 07:47:34.559473 6790 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1009 07:47:34.559555 6790 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:47:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z9ztn_openshift-ovn-kubernetes(1d6cb14a-7329-4a80-aff2-acd9142558d3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0ca
cf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z9ztn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.902622 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a1a53d8-70da-4f6d-b92f-801a563952ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19465e3367078df139314e3b29a1b05d15c7ab22cb681c92e2a0394aaaaf887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b8a525d8b7ec3e08d688a4f5419e937a01e5dfa1de58caa9e3fad5ee5ed593f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8906a42b46d23c122035098bfd88203a6418fe2e0ef806e7babbc9670e2c89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d17b0d82be9febaeb884dea2cfb61c5f189c0fce2aff03c02bbf020d89828f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.917071 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"601e8bbc-736f-4fd6-a5db-acf0c0680140\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7b63b0a66948efeeb8afe2b17b5e2461b54aa7fcbd7eea11181fd3e077f878e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dabdb403ae3d1cad8d766a205299375905e6851f89a3022ec1468ba6ad7f463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c29c93e1f45a5b0592ac77d5f064cff563130da8019669a013ad65026ca46474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cb5215e2ab354a950cbd77ed11f48001aee890b171fd4f3ee9823f5fa4dcf37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://4cb5215e2ab354a950cbd77ed11f48001aee890b171fd4f3ee9823f5fa4dcf37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.927490 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.927530 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.927545 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.927567 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.927581 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:35Z","lastTransitionTime":"2025-10-09T07:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.934022 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://770b320ad49f63618e01bc73df4df10cb694b01d658727bb395ff59e6a609442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://958c52c695933700cd3b19f8c6539c5566827f57a22ed1fea9b6326e2261f673\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.953852 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76f34f31-285e-4f90-954d-888a59ad6080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4996d81a0257313b571696eae1c0c7a590b2282472852505b7f60ab07ae4e7fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d905d
da0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d905dda0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8gf4x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.968536 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fm6s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8fb3b8-b254-4bc3-b105-990eac79c77b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbsl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbsl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fm6s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:35 crc 
kubenswrapper[4715]: I1009 07:47:35.981179 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:35 crc kubenswrapper[4715]: I1009 07:47:35.994960 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8875bf33dca9b2d1d7bf66aaeb2fa239b455ea46d1e6790a9f6e1c5c2da2ec6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T07:47:35Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.008021 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6vp75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e02a5b9a040e142c2a3f8ca488f0de0e42b0e01fff8a9987ea1ee5c354b1e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T07:47:19Z\\\",\\\"message\\\":\\\"2025-10-09T07:46:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_987c0460-fa68-459c-8f62-182794f36a65\\\\n2025-10-09T07:46:33+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_987c0460-fa68-459c-8f62-182794f36a65 to /host/opt/cni/bin/\\\\n2025-10-09T07:46:34Z [verbose] multus-daemon started\\\\n2025-10-09T07:46:34Z [verbose] Readiness Indicator file check\\\\n2025-10-09T07:47:19Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz46q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6vp75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:36Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.017849 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqt86" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c54c0f2-0671-4f29-a4b8-7ea32758200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a835e316a2f8a0cc8bf44d5edd66b376fd20a6f7bf6a467a611e04e5fcc9993f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkfzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqt86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:36Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.029920 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.029982 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.029995 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.030011 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.030021 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:36Z","lastTransitionTime":"2025-10-09T07:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.133201 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.133250 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.133260 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.133279 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.133295 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:36Z","lastTransitionTime":"2025-10-09T07:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.236235 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.236281 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.236295 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.236313 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.236326 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:36Z","lastTransitionTime":"2025-10-09T07:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.338859 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.338902 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.338912 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.338927 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.338936 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:36Z","lastTransitionTime":"2025-10-09T07:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.441364 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.441651 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.441790 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.441879 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.441954 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:36Z","lastTransitionTime":"2025-10-09T07:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.544527 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.544827 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.544888 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.544955 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.545026 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:36Z","lastTransitionTime":"2025-10-09T07:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.652703 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.652806 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.652834 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.652884 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.652914 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:36Z","lastTransitionTime":"2025-10-09T07:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.756390 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.756498 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.756512 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.756535 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.756552 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:36Z","lastTransitionTime":"2025-10-09T07:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.860170 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.860244 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.860270 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.860303 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.860328 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:36Z","lastTransitionTime":"2025-10-09T07:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.963283 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.963345 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.963367 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.963396 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:36 crc kubenswrapper[4715]: I1009 07:47:36.963415 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:36Z","lastTransitionTime":"2025-10-09T07:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.066164 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.066240 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.066264 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.066290 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.066311 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:37Z","lastTransitionTime":"2025-10-09T07:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.136237 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.136345 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.136411 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:47:37 crc kubenswrapper[4715]: E1009 07:47:37.136905 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:47:37 crc kubenswrapper[4715]: E1009 07:47:37.136737 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.136462 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:47:37 crc kubenswrapper[4715]: E1009 07:47:37.137037 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:47:37 crc kubenswrapper[4715]: E1009 07:47:37.137124 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.170091 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.170177 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.170205 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.170274 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.170298 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:37Z","lastTransitionTime":"2025-10-09T07:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.273298 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.273357 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.273372 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.273391 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.273402 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:37Z","lastTransitionTime":"2025-10-09T07:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.377260 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.377312 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.377323 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.377340 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.377350 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:37Z","lastTransitionTime":"2025-10-09T07:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.479780 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.479860 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.479883 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.479912 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.479931 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:37Z","lastTransitionTime":"2025-10-09T07:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.583696 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.583764 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.583795 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.583840 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.583881 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:37Z","lastTransitionTime":"2025-10-09T07:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.686403 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.686499 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.686522 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.686599 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.686625 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:37Z","lastTransitionTime":"2025-10-09T07:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.789705 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.789743 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.789757 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.789773 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.789785 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:37Z","lastTransitionTime":"2025-10-09T07:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.892864 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.892965 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.892993 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.893026 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.893048 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:37Z","lastTransitionTime":"2025-10-09T07:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.996788 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.996850 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.996872 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.996902 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:37 crc kubenswrapper[4715]: I1009 07:47:37.996922 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:37Z","lastTransitionTime":"2025-10-09T07:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.100555 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.100618 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.100653 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.100686 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.100709 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:38Z","lastTransitionTime":"2025-10-09T07:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.181005 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.181087 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.181105 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.181131 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.181150 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:38Z","lastTransitionTime":"2025-10-09T07:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:38 crc kubenswrapper[4715]: E1009 07:47:38.203140 4715 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"88c6bc2d-8227-4dff-bf57-494ec73b39f9\\\",\\\"systemUUID\\\":\\\"25873b5a-8b59-46be-9c14-6241a2c78490\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:38Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.208240 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.208301 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.208324 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.208354 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.208379 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:38Z","lastTransitionTime":"2025-10-09T07:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:38 crc kubenswrapper[4715]: E1009 07:47:38.228962 4715 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"88c6bc2d-8227-4dff-bf57-494ec73b39f9\\\",\\\"systemUUID\\\":\\\"25873b5a-8b59-46be-9c14-6241a2c78490\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:38Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.235024 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.235104 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.235123 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.235153 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.235172 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:38Z","lastTransitionTime":"2025-10-09T07:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:38 crc kubenswrapper[4715]: E1009 07:47:38.254987 4715 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"88c6bc2d-8227-4dff-bf57-494ec73b39f9\\\",\\\"systemUUID\\\":\\\"25873b5a-8b59-46be-9c14-6241a2c78490\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:38Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.259988 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.260044 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.260070 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.260099 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.260122 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:38Z","lastTransitionTime":"2025-10-09T07:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:38 crc kubenswrapper[4715]: E1009 07:47:38.280167 4715 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"88c6bc2d-8227-4dff-bf57-494ec73b39f9\\\",\\\"systemUUID\\\":\\\"25873b5a-8b59-46be-9c14-6241a2c78490\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:38Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.285365 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.285466 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.285495 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.285520 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.285537 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:38Z","lastTransitionTime":"2025-10-09T07:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:38 crc kubenswrapper[4715]: E1009 07:47:38.305272 4715 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-09T07:47:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"88c6bc2d-8227-4dff-bf57-494ec73b39f9\\\",\\\"systemUUID\\\":\\\"25873b5a-8b59-46be-9c14-6241a2c78490\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:38Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:38 crc kubenswrapper[4715]: E1009 07:47:38.305549 4715 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.307688 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.307763 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.307784 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.307817 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.307839 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:38Z","lastTransitionTime":"2025-10-09T07:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.411465 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.411537 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.411558 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.411585 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.411603 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:38Z","lastTransitionTime":"2025-10-09T07:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.516087 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.516167 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.516191 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.516224 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.516248 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:38Z","lastTransitionTime":"2025-10-09T07:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.619630 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.619688 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.619707 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.619732 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.619748 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:38Z","lastTransitionTime":"2025-10-09T07:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.723352 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.723412 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.723468 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.723494 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.723525 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:38Z","lastTransitionTime":"2025-10-09T07:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.826953 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.827017 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.827035 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.827063 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.827081 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:38Z","lastTransitionTime":"2025-10-09T07:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.930229 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.930278 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.930293 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.930316 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:38 crc kubenswrapper[4715]: I1009 07:47:38.930332 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:38Z","lastTransitionTime":"2025-10-09T07:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.034010 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.034069 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.034092 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.034124 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.034145 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:39Z","lastTransitionTime":"2025-10-09T07:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.136236 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.136318 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.136661 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.136855 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.136971 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.137111 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.137132 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.137156 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.137173 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:39Z","lastTransitionTime":"2025-10-09T07:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:39 crc kubenswrapper[4715]: E1009 07:47:39.137258 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:47:39 crc kubenswrapper[4715]: E1009 07:47:39.137174 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:47:39 crc kubenswrapper[4715]: E1009 07:47:39.137455 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:47:39 crc kubenswrapper[4715]: E1009 07:47:39.137570 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.240725 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.240824 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.240838 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.240870 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.240885 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:39Z","lastTransitionTime":"2025-10-09T07:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.344376 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.344677 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.344695 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.344718 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.344735 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:39Z","lastTransitionTime":"2025-10-09T07:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.448233 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.448688 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.448889 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.449281 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.449585 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:39Z","lastTransitionTime":"2025-10-09T07:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.552675 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.552985 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.553095 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.553206 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.553304 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:39Z","lastTransitionTime":"2025-10-09T07:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.655095 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.655152 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.655162 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.655175 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.655185 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:39Z","lastTransitionTime":"2025-10-09T07:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.758468 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.758524 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.758545 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.758575 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.758599 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:39Z","lastTransitionTime":"2025-10-09T07:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.861706 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.861769 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.861786 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.861810 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.861828 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:39Z","lastTransitionTime":"2025-10-09T07:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.964944 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.965323 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.965632 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.965819 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:39 crc kubenswrapper[4715]: I1009 07:47:39.966150 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:39Z","lastTransitionTime":"2025-10-09T07:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.069656 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.069699 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.069712 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.069730 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.069742 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:40Z","lastTransitionTime":"2025-10-09T07:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.158549 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"601e8bbc-736f-4fd6-a5db-acf0c0680140\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7b63b0a66948efeeb8afe2b17b5e2461b54aa7fcbd7eea11181fd3e077f878e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dabdb403ae3d1cad8d766a2052993
75905e6851f89a3022ec1468ba6ad7f463\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c29c93e1f45a5b0592ac77d5f064cff563130da8019669a013ad65026ca46474\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cb5215e2ab354a950cbd77ed11f48001aee890b171fd4f3ee9823f5fa4dcf37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cb5215e2ab354a950cbd77ed11f48001aee890b171fd4f3ee9823f5fa4dcf37\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.172285 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.172516 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.172605 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.172690 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.172765 4715 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:40Z","lastTransitionTime":"2025-10-09T07:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.176890 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://770b320ad49f63618e01bc73df4df10cb694b01d658727bb395ff59e6a609442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://958c52c695933700cd3b19f8c6539c5566827f57a22ed1fea9b6326e2261f673\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.200264 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76f34f31-285e-4f90-954d-888a59ad6080\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4996d81a0257313b571696eae1c0c7a590b2282472852505b7f60ab07ae4e7fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d94e6a6be6039fd4dc91a2cdad7e4171bffc8983844bdc3d3d012748057f0ba2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30935ac65beb91f804e67c2b92cb4862167c813e64d849714febf64981918a49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3df4472c90e8a28c1cffc90c2c6e9e5de09c43fcadb507f0d75dfe0c446c11d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d905d
da0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d905dda0751bb6f7bb3618877970c6a467c9786188e74806dcbf701fc510e35c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b73deb7a8f938d21e695c8dfeb855eb833459cb65948c3d001e4d3ced9dd2a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e705d1622a4dc500b5dd22241c1a68a4e50bf52fd124e3d2675a1007b9f6c51c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v2szk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8gf4x\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.233849 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1d6cb14a-7329-4a80-aff2-acd9142558d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9b9653decfa58510f011f69cf54290119540ca7cad7a56eb6da5440c4ff5f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9b9653decfa58510f011f69cf54290119540ca7cad7a56eb6da5440c4ff5f9c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T07:47:34Z\\\",\\\"message\\\":\\\"ent handler 4 for removal\\\\nI1009 07:47:34.558951 6790 handler.go:208] Removed *v1.Node event handler 7\\\\nI1009 07:47:34.558965 6790 handler.go:208] Removed *v1.Node event handler 2\\\\nI1009 07:47:34.558973 6790 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1009 07:47:34.559045 6790 handler.go:208] Removed *v1.NetworkPolicy event 
handler 4\\\\nI1009 07:47:34.559058 6790 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1009 07:47:34.559066 6790 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1009 07:47:34.559098 6790 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1009 07:47:34.559117 6790 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1009 07:47:34.559264 6790 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1009 07:47:34.559356 6790 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1009 07:47:34.559391 6790 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1009 07:47:34.559407 6790 factory.go:656] Stopping watch factory\\\\nI1009 07:47:34.559432 6790 ovnkube.go:599] Stopped ovnkube\\\\nI1009 07:47:34.559460 6790 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1009 07:47:34.559473 6790 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1009 07:47:34.559555 6790 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:47:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z9ztn_openshift-ovn-kubernetes(1d6cb14a-7329-4a80-aff2-acd9142558d3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddefe0c66097daf0ca
cf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4tdb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z9ztn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.252713 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a1a53d8-70da-4f6d-b92f-801a563952ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19465e3367078df139314e3b29a1b05d15c7ab22cb681c92e2a0394aaaaf887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b8a525d8b7ec3e08d688a4f5419e937a01e5dfa1de58caa9e3fad5ee5ed593f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff8906a42b46d23c122035098bfd88203a6418fe2e0ef806e7babbc9670e2c89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d17b0d82be9febaeb884dea2cfb61c5f189c0fce2aff03c02bbf020d89828f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.271293 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8875bf33dca9b2d1d7bf66aaeb2fa239b455ea46d1e6790a9f6e1c5c2da2ec6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-09T07:47:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.275753 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.275818 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.275842 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.275867 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.275885 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:40Z","lastTransitionTime":"2025-10-09T07:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.290407 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6vp75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e02a5b9a040e142c2a3f8ca488f0de0e42b0e01fff8a9987ea1ee5c354b1e31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-09T07:47:19Z\\\",\\\"message\\\":\\\"2025-10-09T07:46:33+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_987c0460-fa68-459c-8f62-182794f36a65\\\\n2025-10-09T07:46:33+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_987c0460-fa68-459c-8f62-182794f36a65 to /host/opt/cni/bin/\\\\n2025-10-09T07:46:34Z [verbose] multus-daemon started\\\\n2025-10-09T07:46:34Z [verbose] Readiness Indicator file check\\\\n2025-10-09T07:47:19Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:47:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zz46q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6vp75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.306135 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pqt86" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c54c0f2-0671-4f29-a4b8-7ea32758200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a835e316a2f8a0cc8bf44d5edd66b376fd20a6f7bf6a467a611e04e5fcc9993f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkfzn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:33Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pqt86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.319017 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fm6s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8fb3b8-b254-4bc3-b105-990eac79c77b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbsl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pbsl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fm6s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:40 crc 
kubenswrapper[4715]: I1009 07:47:40.339253 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.352639 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ksbvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd48d949-08f9-4a54-ae1c-fe0cfbbcf08f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2978fac0aaadeb9ab4b6ecfc9249a28d011c2f6fe50e3528e008e08df338f304\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97crn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3111e48e9ab42467dbae06523e433e0f52ac
e4f6552d43674fa52010d57b409\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-97crn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ksbvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.361848 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5tfxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a186a549-1c86-4777-97e8-04df48fad842\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1312ab6651462ae52831c89894987a598b1623159dddca34a4848dfbc86191ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mdktp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5tfxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.379037 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.379290 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.379462 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.379580 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.379681 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:40Z","lastTransitionTime":"2025-10-09T07:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.384605 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8095fd96-32bb-459e-b524-6cf679b95b21\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc320b6b98a82e720d488ce9958599e2f732919ac43ccb3834e5dd90042077ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7424a86e3801e7aea51cf175c8cbb65ae15a4df07426022cf9e4ba6b82c13924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://149ab2506eb7fd28879c9734c5189259cde574afb0a4f7708b0b84c5a514c996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a96e0c2dc207504189aac5f2822e4fc8fdc58a19388a3d081553ecec07f03bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0bc91552a8f6c9f83684aa851ef1b07fa4562c736427c3264762f4486b65c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3efd24f11c6d069843a8e55d0207e8d884f8f3a7da23fb09059aab53f5934e30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ec2c7781a79d9b4e99e58b0468e6c206a40d7dd6e2a37fc6fc4c2b9b6cd367\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35cb250058ad2a49694caa51721205de9f006db1d712c1c9677765f9ac94ae97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-09T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.405524 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94f4f451-5ba1-439c-9987-d2d8d37129e6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:47:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ab9492d73e1ced7e8b9dcfbf64ede97fb7c53def5e290efe2320d37d5f8a3f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e94dc3b7cc39c67b95708f5a4b7d2bcf103c565c5c868684fa838816e882c720\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86bd2df729ce7029714c942828cff7e13c738eb5d918fc7dfdefe16e5420fc98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f2c6cc41c3fcb7aa04475aef503dfa481735d7d591632251226133ffa9cfec9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9ae88746f64c0ccb8588c68463485f58618e793a118d15891fa8c061d631028\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-09T07:46:26Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1009 07:46:26.195650 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1009 07:46:26.195886 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1009 07:46:26.197650 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1771579011/tls.crt::/tmp/serving-cert-1771579011/tls.key\\\\\\\"\\\\nI1009 07:46:26.707018 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1009 07:46:26.710937 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1009 07:46:26.710964 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1009 07:46:26.710986 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1009 07:46:26.710992 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1009 07:46:26.721297 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1009 07:46:26.721350 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721363 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1009 07:46:26.721375 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1009 07:46:26.721386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1009 07:46:26.721377 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1009 07:46:26.721396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1009 07:46:26.721462 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1009 07:46:26.723740 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14232d9805b9847774597840c84b29709285393122781fe95af059e50c285ec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb16a355d3cfece6bae7f7121c5c5ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e671fbfaaba7821dbb52ac67d4ef95f9fb1
6a355d3cfece6bae7f7121c5c5ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.416646 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d1166d9eb763c499c126069c02d693a608549e5cbb8d4862551b7555100324b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.425987 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.435916 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.446093 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acafd807-8875-4b4f-aba9-4f807ca336e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1277c6a868bcd62e2cfc7dda77ccba4f206f4216eec40ceb53ed8c09aebd5eaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eab9be18db2c21136a797167f3282bba0639147
e04085d9c930fe113cd5bc94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w6mp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7vwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.456312 4715 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f493763d-a027-430f-b652-84331bf8aa43\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-09T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14e99604cffc35c659058d1363536aa5d067bbbb1c29b2b366c6aa8c1ed6bb72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-09T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4a006c2f6cf15ff04cabddf2c3b0707b29cc3552afa5abd3f9647ef06567695\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4a006c2f6cf15ff04cabddf2c3b0707b29cc3552afa5abd3f9647ef06567695\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-09T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-09T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-09T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-09T07:47:40Z is after 2025-08-24T17:21:41Z" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.483659 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.484031 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.484216 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.484389 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.484702 4715 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:40Z","lastTransitionTime":"2025-10-09T07:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.587318 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.587375 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.587393 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.587554 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.587577 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:40Z","lastTransitionTime":"2025-10-09T07:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.691191 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.691636 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.691853 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.692119 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.692320 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:40Z","lastTransitionTime":"2025-10-09T07:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.795262 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.795336 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.795357 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.795383 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.795404 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:40Z","lastTransitionTime":"2025-10-09T07:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.898319 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.898394 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.898413 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.898473 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:40 crc kubenswrapper[4715]: I1009 07:47:40.898492 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:40Z","lastTransitionTime":"2025-10-09T07:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.001073 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.001116 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.001129 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.001149 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.001161 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:41Z","lastTransitionTime":"2025-10-09T07:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.104752 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.104832 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.104856 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.104888 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.104915 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:41Z","lastTransitionTime":"2025-10-09T07:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.136173 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.136173 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.136297 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:47:41 crc kubenswrapper[4715]: E1009 07:47:41.136442 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.136466 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:47:41 crc kubenswrapper[4715]: E1009 07:47:41.136625 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:47:41 crc kubenswrapper[4715]: E1009 07:47:41.136721 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:47:41 crc kubenswrapper[4715]: E1009 07:47:41.136770 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.207809 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.207854 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.207870 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.207886 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.207897 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:41Z","lastTransitionTime":"2025-10-09T07:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.311189 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.311264 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.311291 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.311322 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.311344 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:41Z","lastTransitionTime":"2025-10-09T07:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.413802 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.413866 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.413883 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.413910 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.413940 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:41Z","lastTransitionTime":"2025-10-09T07:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.516495 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.516564 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.516586 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.516620 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.516643 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:41Z","lastTransitionTime":"2025-10-09T07:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.620029 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.620094 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.620111 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.620137 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.620155 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:41Z","lastTransitionTime":"2025-10-09T07:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.723264 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.723332 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.723356 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.723386 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.723407 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:41Z","lastTransitionTime":"2025-10-09T07:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.827297 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.827383 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.827407 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.827472 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.827503 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:41Z","lastTransitionTime":"2025-10-09T07:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.930739 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.930772 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.930779 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.930792 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:41 crc kubenswrapper[4715]: I1009 07:47:41.930801 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:41Z","lastTransitionTime":"2025-10-09T07:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.034087 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.034129 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.034140 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.034161 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.034177 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:42Z","lastTransitionTime":"2025-10-09T07:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.137693 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.137738 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.137746 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.137761 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.137771 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:42Z","lastTransitionTime":"2025-10-09T07:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.240870 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.240926 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.240938 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.240958 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.240970 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:42Z","lastTransitionTime":"2025-10-09T07:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.344079 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.344171 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.344197 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.344232 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.344258 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:42Z","lastTransitionTime":"2025-10-09T07:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.447115 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.447941 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.448073 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.448213 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.448404 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:42Z","lastTransitionTime":"2025-10-09T07:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.551088 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.551141 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.551157 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.551177 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.551191 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:42Z","lastTransitionTime":"2025-10-09T07:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.654933 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.655374 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.655590 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.655820 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.655992 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:42Z","lastTransitionTime":"2025-10-09T07:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.758649 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.759044 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.759238 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.759414 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.759613 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:42Z","lastTransitionTime":"2025-10-09T07:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.862067 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.862475 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.862623 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.862758 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.862898 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:42Z","lastTransitionTime":"2025-10-09T07:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.965496 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.965534 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.965544 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.965561 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:42 crc kubenswrapper[4715]: I1009 07:47:42.965573 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:42Z","lastTransitionTime":"2025-10-09T07:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.069586 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.069647 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.069665 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.069692 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.069710 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:43Z","lastTransitionTime":"2025-10-09T07:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.136406 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.136406 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.136453 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.136549 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:47:43 crc kubenswrapper[4715]: E1009 07:47:43.136952 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:47:43 crc kubenswrapper[4715]: E1009 07:47:43.137142 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:47:43 crc kubenswrapper[4715]: E1009 07:47:43.137194 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:47:43 crc kubenswrapper[4715]: E1009 07:47:43.137273 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.172466 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.172495 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.172504 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.172522 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.172532 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:43Z","lastTransitionTime":"2025-10-09T07:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.274755 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.274798 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.274807 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.274823 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.274836 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:43Z","lastTransitionTime":"2025-10-09T07:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.378084 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.378143 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.378161 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.378185 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.378203 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:43Z","lastTransitionTime":"2025-10-09T07:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.481705 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.481777 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.481796 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.481823 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.481840 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:43Z","lastTransitionTime":"2025-10-09T07:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.585309 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.585377 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.585396 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.585460 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.585490 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:43Z","lastTransitionTime":"2025-10-09T07:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.688188 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.688603 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.688911 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.689202 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.689387 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:43Z","lastTransitionTime":"2025-10-09T07:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.792342 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.792799 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.792979 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.793157 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.793317 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:43Z","lastTransitionTime":"2025-10-09T07:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.896783 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.896858 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.896883 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.896918 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:43 crc kubenswrapper[4715]: I1009 07:47:43.896941 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:43Z","lastTransitionTime":"2025-10-09T07:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:43.999945 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.000026 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.000043 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.000068 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.000085 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:44Z","lastTransitionTime":"2025-10-09T07:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.103191 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.103557 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.103673 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.103790 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.103878 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:44Z","lastTransitionTime":"2025-10-09T07:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.206218 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.206255 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.206266 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.206281 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.206291 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:44Z","lastTransitionTime":"2025-10-09T07:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.309238 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.309288 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.309305 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.309328 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.309346 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:44Z","lastTransitionTime":"2025-10-09T07:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.412658 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.413019 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.413248 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.413535 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.413757 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:44Z","lastTransitionTime":"2025-10-09T07:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.519280 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.519386 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.519406 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.519470 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.519490 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:44Z","lastTransitionTime":"2025-10-09T07:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.622753 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.622852 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.622880 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.622912 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.622932 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:44Z","lastTransitionTime":"2025-10-09T07:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.725287 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.725857 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.726085 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.726293 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.726651 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:44Z","lastTransitionTime":"2025-10-09T07:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.830356 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.830901 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.831069 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.831283 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.831476 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:44Z","lastTransitionTime":"2025-10-09T07:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.934922 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.934960 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.934994 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.935013 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:44 crc kubenswrapper[4715]: I1009 07:47:44.935025 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:44Z","lastTransitionTime":"2025-10-09T07:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.037609 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.037675 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.037688 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.037710 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.037724 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:45Z","lastTransitionTime":"2025-10-09T07:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.135970 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.136038 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.136056 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.135970 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:47:45 crc kubenswrapper[4715]: E1009 07:47:45.136176 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:47:45 crc kubenswrapper[4715]: E1009 07:47:45.136320 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:47:45 crc kubenswrapper[4715]: E1009 07:47:45.136487 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:47:45 crc kubenswrapper[4715]: E1009 07:47:45.136599 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.140067 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.140108 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.140124 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.140149 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.140166 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:45Z","lastTransitionTime":"2025-10-09T07:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.244217 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.244292 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.244314 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.244343 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.244364 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:45Z","lastTransitionTime":"2025-10-09T07:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.347335 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.347399 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.347447 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.347473 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.347492 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:45Z","lastTransitionTime":"2025-10-09T07:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.450724 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.451154 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.451385 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.451658 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.451875 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:45Z","lastTransitionTime":"2025-10-09T07:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.555576 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.555639 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.555654 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.555680 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.555696 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:45Z","lastTransitionTime":"2025-10-09T07:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.659858 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.659912 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.659923 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.659943 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.659956 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:45Z","lastTransitionTime":"2025-10-09T07:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.763169 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.763245 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.763268 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.763297 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.763320 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:45Z","lastTransitionTime":"2025-10-09T07:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.866389 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.866474 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.866498 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.866528 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.866769 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:45Z","lastTransitionTime":"2025-10-09T07:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.969845 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.970138 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.970227 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.970324 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:45 crc kubenswrapper[4715]: I1009 07:47:45.970457 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:45Z","lastTransitionTime":"2025-10-09T07:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.073265 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.073300 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.073311 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.073327 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.073339 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:46Z","lastTransitionTime":"2025-10-09T07:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.176085 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.176460 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.176634 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.176777 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.176911 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:46Z","lastTransitionTime":"2025-10-09T07:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.281190 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.281610 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.282014 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.282704 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.282904 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:46Z","lastTransitionTime":"2025-10-09T07:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.387776 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.387840 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.387866 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.387897 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.387919 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:46Z","lastTransitionTime":"2025-10-09T07:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.491598 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.491658 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.491676 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.491701 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.491720 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:46Z","lastTransitionTime":"2025-10-09T07:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.594534 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.594619 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.594641 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.594674 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.594696 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:46Z","lastTransitionTime":"2025-10-09T07:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.697843 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.697910 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.697931 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.697962 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.697979 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:46Z","lastTransitionTime":"2025-10-09T07:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.801828 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.801892 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.801902 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.801922 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.801934 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:46Z","lastTransitionTime":"2025-10-09T07:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.904390 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.904512 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.904541 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.904566 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:46 crc kubenswrapper[4715]: I1009 07:47:46.904585 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:46Z","lastTransitionTime":"2025-10-09T07:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.007301 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.007359 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.007384 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.007412 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.007468 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:47Z","lastTransitionTime":"2025-10-09T07:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.110047 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.110152 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.110175 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.110201 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.110221 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:47Z","lastTransitionTime":"2025-10-09T07:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.136303 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.136351 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.136398 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.136311 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:47:47 crc kubenswrapper[4715]: E1009 07:47:47.136582 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:47:47 crc kubenswrapper[4715]: E1009 07:47:47.136701 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:47:47 crc kubenswrapper[4715]: E1009 07:47:47.136780 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:47:47 crc kubenswrapper[4715]: E1009 07:47:47.137014 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.213529 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.213655 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.213680 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.213711 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.213737 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:47Z","lastTransitionTime":"2025-10-09T07:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.320748 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.320816 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.320834 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.320859 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.320896 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:47Z","lastTransitionTime":"2025-10-09T07:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.424713 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.424760 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.424777 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.424803 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.424820 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:47Z","lastTransitionTime":"2025-10-09T07:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.527656 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.527711 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.527721 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.527737 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.527747 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:47Z","lastTransitionTime":"2025-10-09T07:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.631143 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.631224 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.631251 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.631283 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.631306 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:47Z","lastTransitionTime":"2025-10-09T07:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.734037 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.734102 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.734124 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.734154 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.734177 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:47Z","lastTransitionTime":"2025-10-09T07:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.836544 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.836611 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.836630 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.836658 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.836681 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:47Z","lastTransitionTime":"2025-10-09T07:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.939769 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.939804 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.939822 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.939844 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:47 crc kubenswrapper[4715]: I1009 07:47:47.939860 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:47Z","lastTransitionTime":"2025-10-09T07:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.042358 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.042396 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.042407 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.042439 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.042451 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:48Z","lastTransitionTime":"2025-10-09T07:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.145270 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.145325 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.145342 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.145364 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.145383 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:48Z","lastTransitionTime":"2025-10-09T07:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.247967 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.248007 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.248017 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.248033 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.248045 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:48Z","lastTransitionTime":"2025-10-09T07:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.350201 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.350244 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.350255 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.350271 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.350283 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:48Z","lastTransitionTime":"2025-10-09T07:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.452912 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.452992 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.453016 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.453046 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.453071 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:48Z","lastTransitionTime":"2025-10-09T07:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.551964 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.552008 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.552019 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.552035 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.552046 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:48Z","lastTransitionTime":"2025-10-09T07:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.567815 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.567871 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.567887 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.567911 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.567929 4715 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-09T07:47:48Z","lastTransitionTime":"2025-10-09T07:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.606688 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-drllm"] Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.607157 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-drllm" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.609619 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.610147 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.610691 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.611097 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.648863 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/45f9faf7-fcd7-404e-9f96-0ac8e94d1a1b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-drllm\" (UID: \"45f9faf7-fcd7-404e-9f96-0ac8e94d1a1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-drllm" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.648990 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/45f9faf7-fcd7-404e-9f96-0ac8e94d1a1b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-drllm\" (UID: \"45f9faf7-fcd7-404e-9f96-0ac8e94d1a1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-drllm" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.649068 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/45f9faf7-fcd7-404e-9f96-0ac8e94d1a1b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-drllm\" (UID: \"45f9faf7-fcd7-404e-9f96-0ac8e94d1a1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-drllm" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.649118 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/45f9faf7-fcd7-404e-9f96-0ac8e94d1a1b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-drllm\" (UID: \"45f9faf7-fcd7-404e-9f96-0ac8e94d1a1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-drllm" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.649220 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/45f9faf7-fcd7-404e-9f96-0ac8e94d1a1b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-drllm\" (UID: \"45f9faf7-fcd7-404e-9f96-0ac8e94d1a1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-drllm" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.688620 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=49.688585175 podStartE2EDuration="49.688585175s" podCreationTimestamp="2025-10-09 07:46:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:47:48.688310908 +0000 UTC m=+99.381114926" watchObservedRunningTime="2025-10-09 07:47:48.688585175 +0000 UTC m=+99.381389183" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.688839 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=76.688833672 
podStartE2EDuration="1m16.688833672s" podCreationTimestamp="2025-10-09 07:46:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:47:48.671200313 +0000 UTC m=+99.364004351" watchObservedRunningTime="2025-10-09 07:47:48.688833672 +0000 UTC m=+99.381637681" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.747248 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-8gf4x" podStartSLOduration=77.747224068 podStartE2EDuration="1m17.747224068s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:47:48.729442354 +0000 UTC m=+99.422246372" watchObservedRunningTime="2025-10-09 07:47:48.747224068 +0000 UTC m=+99.440028076" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.750166 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/45f9faf7-fcd7-404e-9f96-0ac8e94d1a1b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-drllm\" (UID: \"45f9faf7-fcd7-404e-9f96-0ac8e94d1a1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-drllm" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.750217 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/45f9faf7-fcd7-404e-9f96-0ac8e94d1a1b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-drllm\" (UID: \"45f9faf7-fcd7-404e-9f96-0ac8e94d1a1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-drllm" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.750262 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/45f9faf7-fcd7-404e-9f96-0ac8e94d1a1b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-drllm\" (UID: \"45f9faf7-fcd7-404e-9f96-0ac8e94d1a1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-drllm" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.750282 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45f9faf7-fcd7-404e-9f96-0ac8e94d1a1b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-drllm\" (UID: \"45f9faf7-fcd7-404e-9f96-0ac8e94d1a1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-drllm" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.750311 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/45f9faf7-fcd7-404e-9f96-0ac8e94d1a1b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-drllm\" (UID: \"45f9faf7-fcd7-404e-9f96-0ac8e94d1a1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-drllm" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.750634 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/45f9faf7-fcd7-404e-9f96-0ac8e94d1a1b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-drllm\" (UID: \"45f9faf7-fcd7-404e-9f96-0ac8e94d1a1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-drllm" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.750714 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/45f9faf7-fcd7-404e-9f96-0ac8e94d1a1b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-drllm\" (UID: \"45f9faf7-fcd7-404e-9f96-0ac8e94d1a1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-drllm" Oct 09 
07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.751298 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/45f9faf7-fcd7-404e-9f96-0ac8e94d1a1b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-drllm\" (UID: \"45f9faf7-fcd7-404e-9f96-0ac8e94d1a1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-drllm" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.768645 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45f9faf7-fcd7-404e-9f96-0ac8e94d1a1b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-drllm\" (UID: \"45f9faf7-fcd7-404e-9f96-0ac8e94d1a1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-drllm" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.772538 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/45f9faf7-fcd7-404e-9f96-0ac8e94d1a1b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-drllm\" (UID: \"45f9faf7-fcd7-404e-9f96-0ac8e94d1a1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-drllm" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.795059 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-6vp75" podStartSLOduration=77.794989101 podStartE2EDuration="1m17.794989101s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:47:48.794143718 +0000 UTC m=+99.486947736" watchObservedRunningTime="2025-10-09 07:47:48.794989101 +0000 UTC m=+99.487793129" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.807640 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-pqt86" 
podStartSLOduration=77.807618419 podStartE2EDuration="1m17.807618419s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:47:48.807210228 +0000 UTC m=+99.500014246" watchObservedRunningTime="2025-10-09 07:47:48.807618419 +0000 UTC m=+99.500422447" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.822930 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-5tfxq" podStartSLOduration=77.822904533 podStartE2EDuration="1m17.822904533s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:47:48.822474851 +0000 UTC m=+99.515278879" watchObservedRunningTime="2025-10-09 07:47:48.822904533 +0000 UTC m=+99.515708541" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.835683 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ksbvn" podStartSLOduration=77.835657924 podStartE2EDuration="1m17.835657924s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:47:48.835557071 +0000 UTC m=+99.528361079" watchObservedRunningTime="2025-10-09 07:47:48.835657924 +0000 UTC m=+99.528461952" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.883197 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podStartSLOduration=77.883176731 podStartE2EDuration="1m17.883176731s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-09 07:47:48.882934164 +0000 UTC m=+99.575738172" watchObservedRunningTime="2025-10-09 07:47:48.883176731 +0000 UTC m=+99.575980739" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.908928 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=32.9089039 podStartE2EDuration="32.9089039s" podCreationTimestamp="2025-10-09 07:47:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:47:48.906192554 +0000 UTC m=+99.598996572" watchObservedRunningTime="2025-10-09 07:47:48.9089039 +0000 UTC m=+99.601707918" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.931807 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-drllm" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.933852 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=75.933823957 podStartE2EDuration="1m15.933823957s" podCreationTimestamp="2025-10-09 07:46:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:47:48.932786697 +0000 UTC m=+99.625590705" watchObservedRunningTime="2025-10-09 07:47:48.933823957 +0000 UTC m=+99.626627965" Oct 09 07:47:48 crc kubenswrapper[4715]: I1009 07:47:48.954538 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=73.954508713 podStartE2EDuration="1m13.954508713s" podCreationTimestamp="2025-10-09 07:46:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:47:48.951309412 +0000 UTC 
m=+99.644113420" watchObservedRunningTime="2025-10-09 07:47:48.954508713 +0000 UTC m=+99.647312721" Oct 09 07:47:49 crc kubenswrapper[4715]: I1009 07:47:49.136574 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:47:49 crc kubenswrapper[4715]: I1009 07:47:49.137003 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:47:49 crc kubenswrapper[4715]: I1009 07:47:49.137521 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:47:49 crc kubenswrapper[4715]: E1009 07:47:49.137708 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:47:49 crc kubenswrapper[4715]: I1009 07:47:49.137733 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:47:49 crc kubenswrapper[4715]: E1009 07:47:49.137926 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:47:49 crc kubenswrapper[4715]: I1009 07:47:49.138577 4715 scope.go:117] "RemoveContainer" containerID="e9b9653decfa58510f011f69cf54290119540ca7cad7a56eb6da5440c4ff5f9c" Oct 09 07:47:49 crc kubenswrapper[4715]: E1009 07:47:49.138724 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-z9ztn_openshift-ovn-kubernetes(1d6cb14a-7329-4a80-aff2-acd9142558d3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" Oct 09 07:47:49 crc kubenswrapper[4715]: E1009 07:47:49.138757 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:47:49 crc kubenswrapper[4715]: E1009 07:47:49.138835 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:47:49 crc kubenswrapper[4715]: I1009 07:47:49.460776 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a8fb3b8-b254-4bc3-b105-990eac79c77b-metrics-certs\") pod \"network-metrics-daemon-fm6s2\" (UID: \"9a8fb3b8-b254-4bc3-b105-990eac79c77b\") " pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:47:49 crc kubenswrapper[4715]: E1009 07:47:49.461072 4715 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 07:47:49 crc kubenswrapper[4715]: E1009 07:47:49.461218 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a8fb3b8-b254-4bc3-b105-990eac79c77b-metrics-certs podName:9a8fb3b8-b254-4bc3-b105-990eac79c77b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:53.461186375 +0000 UTC m=+164.153990413 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9a8fb3b8-b254-4bc3-b105-990eac79c77b-metrics-certs") pod "network-metrics-daemon-fm6s2" (UID: "9a8fb3b8-b254-4bc3-b105-990eac79c77b") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 09 07:47:49 crc kubenswrapper[4715]: I1009 07:47:49.743268 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-drllm" event={"ID":"45f9faf7-fcd7-404e-9f96-0ac8e94d1a1b","Type":"ContainerStarted","Data":"b4354a163bb270001b75d15a04e14194422bf0442bf5053c01e0a021bc31f438"} Oct 09 07:47:49 crc kubenswrapper[4715]: I1009 07:47:49.743325 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-drllm" event={"ID":"45f9faf7-fcd7-404e-9f96-0ac8e94d1a1b","Type":"ContainerStarted","Data":"e9b0526aec07477f3ed1f06a62a31b88b1d682b6f7d538d5392d9b17a2f641f3"} Oct 09 07:47:49 crc kubenswrapper[4715]: I1009 07:47:49.754779 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-drllm" podStartSLOduration=78.754761896 podStartE2EDuration="1m18.754761896s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:47:49.754682294 +0000 UTC m=+100.447486302" watchObservedRunningTime="2025-10-09 07:47:49.754761896 +0000 UTC m=+100.447565904" Oct 09 07:47:51 crc kubenswrapper[4715]: I1009 07:47:51.136672 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:47:51 crc kubenswrapper[4715]: I1009 07:47:51.136735 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:47:51 crc kubenswrapper[4715]: I1009 07:47:51.136708 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:47:51 crc kubenswrapper[4715]: I1009 07:47:51.136708 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:47:51 crc kubenswrapper[4715]: E1009 07:47:51.136940 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:47:51 crc kubenswrapper[4715]: E1009 07:47:51.137040 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:47:51 crc kubenswrapper[4715]: E1009 07:47:51.137148 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:47:51 crc kubenswrapper[4715]: E1009 07:47:51.137230 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:47:51 crc kubenswrapper[4715]: I1009 07:47:51.791771 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:47:51 crc kubenswrapper[4715]: I1009 07:47:51.793519 4715 scope.go:117] "RemoveContainer" containerID="e9b9653decfa58510f011f69cf54290119540ca7cad7a56eb6da5440c4ff5f9c" Oct 09 07:47:51 crc kubenswrapper[4715]: E1009 07:47:51.793816 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-z9ztn_openshift-ovn-kubernetes(1d6cb14a-7329-4a80-aff2-acd9142558d3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" Oct 09 07:47:53 crc kubenswrapper[4715]: I1009 07:47:53.136681 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:47:53 crc kubenswrapper[4715]: I1009 07:47:53.137615 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:47:53 crc kubenswrapper[4715]: E1009 07:47:53.137804 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:47:53 crc kubenswrapper[4715]: I1009 07:47:53.137900 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:47:53 crc kubenswrapper[4715]: I1009 07:47:53.137865 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:47:53 crc kubenswrapper[4715]: E1009 07:47:53.137975 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:47:53 crc kubenswrapper[4715]: E1009 07:47:53.138189 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:47:53 crc kubenswrapper[4715]: E1009 07:47:53.138303 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:47:55 crc kubenswrapper[4715]: I1009 07:47:55.136703 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:47:55 crc kubenswrapper[4715]: I1009 07:47:55.136749 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:47:55 crc kubenswrapper[4715]: I1009 07:47:55.136836 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:47:55 crc kubenswrapper[4715]: E1009 07:47:55.137034 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:47:55 crc kubenswrapper[4715]: I1009 07:47:55.137063 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:47:55 crc kubenswrapper[4715]: E1009 07:47:55.137210 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:47:55 crc kubenswrapper[4715]: E1009 07:47:55.137346 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:47:55 crc kubenswrapper[4715]: E1009 07:47:55.137508 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:47:57 crc kubenswrapper[4715]: I1009 07:47:57.136270 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:47:57 crc kubenswrapper[4715]: I1009 07:47:57.136367 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:47:57 crc kubenswrapper[4715]: E1009 07:47:57.136555 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:47:57 crc kubenswrapper[4715]: I1009 07:47:57.136621 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:47:57 crc kubenswrapper[4715]: I1009 07:47:57.136379 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:47:57 crc kubenswrapper[4715]: E1009 07:47:57.136750 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:47:57 crc kubenswrapper[4715]: E1009 07:47:57.136852 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:47:57 crc kubenswrapper[4715]: E1009 07:47:57.136980 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:47:59 crc kubenswrapper[4715]: I1009 07:47:59.136555 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:47:59 crc kubenswrapper[4715]: I1009 07:47:59.136595 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:47:59 crc kubenswrapper[4715]: E1009 07:47:59.136792 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:47:59 crc kubenswrapper[4715]: I1009 07:47:59.136641 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:47:59 crc kubenswrapper[4715]: I1009 07:47:59.136599 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:47:59 crc kubenswrapper[4715]: E1009 07:47:59.137088 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:47:59 crc kubenswrapper[4715]: E1009 07:47:59.137215 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:47:59 crc kubenswrapper[4715]: E1009 07:47:59.137323 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:48:01 crc kubenswrapper[4715]: I1009 07:48:01.135837 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:48:01 crc kubenswrapper[4715]: I1009 07:48:01.135902 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:48:01 crc kubenswrapper[4715]: I1009 07:48:01.135863 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:48:01 crc kubenswrapper[4715]: I1009 07:48:01.135849 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:48:01 crc kubenswrapper[4715]: E1009 07:48:01.136068 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:48:01 crc kubenswrapper[4715]: E1009 07:48:01.136204 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:48:01 crc kubenswrapper[4715]: E1009 07:48:01.136337 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:48:01 crc kubenswrapper[4715]: E1009 07:48:01.136473 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:48:03 crc kubenswrapper[4715]: I1009 07:48:03.136014 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:48:03 crc kubenswrapper[4715]: I1009 07:48:03.136124 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:48:03 crc kubenswrapper[4715]: I1009 07:48:03.136127 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:48:03 crc kubenswrapper[4715]: I1009 07:48:03.136031 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:48:03 crc kubenswrapper[4715]: E1009 07:48:03.136237 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:48:03 crc kubenswrapper[4715]: E1009 07:48:03.136559 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:48:03 crc kubenswrapper[4715]: E1009 07:48:03.136588 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:48:03 crc kubenswrapper[4715]: E1009 07:48:03.136734 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:48:05 crc kubenswrapper[4715]: I1009 07:48:05.135970 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:48:05 crc kubenswrapper[4715]: I1009 07:48:05.136101 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:48:05 crc kubenswrapper[4715]: E1009 07:48:05.136139 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:48:05 crc kubenswrapper[4715]: I1009 07:48:05.136223 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:48:05 crc kubenswrapper[4715]: E1009 07:48:05.136479 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:48:05 crc kubenswrapper[4715]: E1009 07:48:05.136686 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:48:05 crc kubenswrapper[4715]: I1009 07:48:05.136003 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:48:05 crc kubenswrapper[4715]: E1009 07:48:05.137180 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:48:05 crc kubenswrapper[4715]: I1009 07:48:05.804284 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6vp75_6e61f2cb-cd6d-46d6-bbb6-dd99919b893d/kube-multus/1.log" Oct 09 07:48:05 crc kubenswrapper[4715]: I1009 07:48:05.804898 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6vp75_6e61f2cb-cd6d-46d6-bbb6-dd99919b893d/kube-multus/0.log" Oct 09 07:48:05 crc kubenswrapper[4715]: I1009 07:48:05.804937 4715 generic.go:334] "Generic (PLEG): container finished" podID="6e61f2cb-cd6d-46d6-bbb6-dd99919b893d" containerID="4e02a5b9a040e142c2a3f8ca488f0de0e42b0e01fff8a9987ea1ee5c354b1e31" exitCode=1 Oct 09 07:48:05 crc kubenswrapper[4715]: I1009 07:48:05.804970 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6vp75" event={"ID":"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d","Type":"ContainerDied","Data":"4e02a5b9a040e142c2a3f8ca488f0de0e42b0e01fff8a9987ea1ee5c354b1e31"} Oct 09 07:48:05 crc kubenswrapper[4715]: I1009 07:48:05.805009 4715 scope.go:117] "RemoveContainer" containerID="d171b3d3faf9677e74d3e03a801accdc34d690d6db4b03bf63b95f7565afe8b9" Oct 09 07:48:05 crc kubenswrapper[4715]: I1009 07:48:05.805393 4715 scope.go:117] "RemoveContainer" containerID="4e02a5b9a040e142c2a3f8ca488f0de0e42b0e01fff8a9987ea1ee5c354b1e31" Oct 09 07:48:05 crc kubenswrapper[4715]: E1009 07:48:05.805558 4715 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-6vp75_openshift-multus(6e61f2cb-cd6d-46d6-bbb6-dd99919b893d)\"" pod="openshift-multus/multus-6vp75" podUID="6e61f2cb-cd6d-46d6-bbb6-dd99919b893d" Oct 09 07:48:06 crc kubenswrapper[4715]: I1009 07:48:06.137665 4715 scope.go:117] "RemoveContainer" containerID="e9b9653decfa58510f011f69cf54290119540ca7cad7a56eb6da5440c4ff5f9c" Oct 09 07:48:06 crc kubenswrapper[4715]: E1009 07:48:06.138088 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-z9ztn_openshift-ovn-kubernetes(1d6cb14a-7329-4a80-aff2-acd9142558d3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" Oct 09 07:48:06 crc kubenswrapper[4715]: I1009 07:48:06.811458 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6vp75_6e61f2cb-cd6d-46d6-bbb6-dd99919b893d/kube-multus/1.log" Oct 09 07:48:07 crc kubenswrapper[4715]: I1009 07:48:07.136524 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:48:07 crc kubenswrapper[4715]: I1009 07:48:07.136521 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:48:07 crc kubenswrapper[4715]: E1009 07:48:07.136692 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:48:07 crc kubenswrapper[4715]: E1009 07:48:07.136722 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:48:07 crc kubenswrapper[4715]: I1009 07:48:07.136521 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:48:07 crc kubenswrapper[4715]: E1009 07:48:07.136819 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:48:07 crc kubenswrapper[4715]: I1009 07:48:07.137112 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:48:07 crc kubenswrapper[4715]: E1009 07:48:07.137188 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:48:09 crc kubenswrapper[4715]: I1009 07:48:09.136802 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:48:09 crc kubenswrapper[4715]: I1009 07:48:09.136863 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:48:09 crc kubenswrapper[4715]: E1009 07:48:09.137688 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:48:09 crc kubenswrapper[4715]: I1009 07:48:09.136889 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:48:09 crc kubenswrapper[4715]: E1009 07:48:09.137499 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:48:09 crc kubenswrapper[4715]: E1009 07:48:09.137813 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:48:09 crc kubenswrapper[4715]: I1009 07:48:09.136889 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:48:09 crc kubenswrapper[4715]: E1009 07:48:09.137944 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:48:10 crc kubenswrapper[4715]: E1009 07:48:10.076683 4715 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 09 07:48:10 crc kubenswrapper[4715]: E1009 07:48:10.223730 4715 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 09 07:48:11 crc kubenswrapper[4715]: I1009 07:48:11.159207 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:48:11 crc kubenswrapper[4715]: E1009 07:48:11.159402 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:48:11 crc kubenswrapper[4715]: I1009 07:48:11.159518 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:48:11 crc kubenswrapper[4715]: I1009 07:48:11.159558 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:48:11 crc kubenswrapper[4715]: E1009 07:48:11.159701 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:48:11 crc kubenswrapper[4715]: I1009 07:48:11.159795 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:48:11 crc kubenswrapper[4715]: E1009 07:48:11.159919 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:48:11 crc kubenswrapper[4715]: E1009 07:48:11.160056 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:48:13 crc kubenswrapper[4715]: I1009 07:48:13.136257 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:48:13 crc kubenswrapper[4715]: I1009 07:48:13.136284 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:48:13 crc kubenswrapper[4715]: I1009 07:48:13.136564 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:48:13 crc kubenswrapper[4715]: E1009 07:48:13.136476 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:48:13 crc kubenswrapper[4715]: E1009 07:48:13.136743 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:48:13 crc kubenswrapper[4715]: E1009 07:48:13.136863 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:48:13 crc kubenswrapper[4715]: I1009 07:48:13.136987 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:48:13 crc kubenswrapper[4715]: E1009 07:48:13.137069 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:48:15 crc kubenswrapper[4715]: I1009 07:48:15.136594 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:48:15 crc kubenswrapper[4715]: I1009 07:48:15.136731 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:48:15 crc kubenswrapper[4715]: I1009 07:48:15.136744 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:48:15 crc kubenswrapper[4715]: I1009 07:48:15.136790 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:48:15 crc kubenswrapper[4715]: E1009 07:48:15.138012 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:48:15 crc kubenswrapper[4715]: E1009 07:48:15.138176 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:48:15 crc kubenswrapper[4715]: E1009 07:48:15.138374 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:48:15 crc kubenswrapper[4715]: E1009 07:48:15.138591 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:48:15 crc kubenswrapper[4715]: E1009 07:48:15.225089 4715 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 09 07:48:17 crc kubenswrapper[4715]: I1009 07:48:17.135983 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:48:17 crc kubenswrapper[4715]: I1009 07:48:17.136052 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:48:17 crc kubenswrapper[4715]: I1009 07:48:17.136091 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:48:17 crc kubenswrapper[4715]: I1009 07:48:17.136115 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:48:17 crc kubenswrapper[4715]: I1009 07:48:17.136807 4715 scope.go:117] "RemoveContainer" containerID="4e02a5b9a040e142c2a3f8ca488f0de0e42b0e01fff8a9987ea1ee5c354b1e31" Oct 09 07:48:17 crc kubenswrapper[4715]: E1009 07:48:17.139957 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:48:17 crc kubenswrapper[4715]: E1009 07:48:17.140151 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:48:17 crc kubenswrapper[4715]: E1009 07:48:17.140457 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:48:17 crc kubenswrapper[4715]: E1009 07:48:17.140568 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:48:17 crc kubenswrapper[4715]: I1009 07:48:17.857220 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6vp75_6e61f2cb-cd6d-46d6-bbb6-dd99919b893d/kube-multus/1.log" Oct 09 07:48:17 crc kubenswrapper[4715]: I1009 07:48:17.857730 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6vp75" event={"ID":"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d","Type":"ContainerStarted","Data":"e5bd879138998ee05f837dc613e67592e7abe570a8c55d71b8fa18446d82d746"} Oct 09 07:48:19 crc kubenswrapper[4715]: I1009 07:48:19.136514 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:48:19 crc kubenswrapper[4715]: I1009 07:48:19.136630 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:48:19 crc kubenswrapper[4715]: I1009 07:48:19.136676 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:48:19 crc kubenswrapper[4715]: I1009 07:48:19.137244 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:48:19 crc kubenswrapper[4715]: E1009 07:48:19.137511 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:48:19 crc kubenswrapper[4715]: E1009 07:48:19.137631 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:48:19 crc kubenswrapper[4715]: E1009 07:48:19.137955 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:48:19 crc kubenswrapper[4715]: E1009 07:48:19.138057 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:48:19 crc kubenswrapper[4715]: I1009 07:48:19.138176 4715 scope.go:117] "RemoveContainer" containerID="e9b9653decfa58510f011f69cf54290119540ca7cad7a56eb6da5440c4ff5f9c" Oct 09 07:48:19 crc kubenswrapper[4715]: I1009 07:48:19.865912 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z9ztn_1d6cb14a-7329-4a80-aff2-acd9142558d3/ovnkube-controller/3.log" Oct 09 07:48:19 crc kubenswrapper[4715]: I1009 07:48:19.868252 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" event={"ID":"1d6cb14a-7329-4a80-aff2-acd9142558d3","Type":"ContainerStarted","Data":"85d731fb7590b113c847a40e343c9b81d2da112b32c0bf11cfef3b06302ba95e"} Oct 09 07:48:19 crc kubenswrapper[4715]: I1009 07:48:19.868912 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:48:19 crc kubenswrapper[4715]: I1009 07:48:19.902928 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" podStartSLOduration=108.902884795 podStartE2EDuration="1m48.902884795s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:48:19.902291228 +0000 UTC m=+130.595095296" watchObservedRunningTime="2025-10-09 07:48:19.902884795 +0000 UTC m=+130.595688853" Oct 09 07:48:20 crc kubenswrapper[4715]: I1009 07:48:20.057592 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fm6s2"] Oct 09 07:48:20 crc kubenswrapper[4715]: I1009 07:48:20.057698 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:48:20 crc kubenswrapper[4715]: E1009 07:48:20.057787 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:48:20 crc kubenswrapper[4715]: E1009 07:48:20.225919 4715 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 09 07:48:21 crc kubenswrapper[4715]: I1009 07:48:21.136464 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:48:21 crc kubenswrapper[4715]: I1009 07:48:21.136550 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:48:21 crc kubenswrapper[4715]: I1009 07:48:21.136662 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:48:21 crc kubenswrapper[4715]: E1009 07:48:21.136865 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:48:21 crc kubenswrapper[4715]: E1009 07:48:21.137055 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:48:21 crc kubenswrapper[4715]: E1009 07:48:21.137191 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:48:22 crc kubenswrapper[4715]: I1009 07:48:22.136367 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:48:22 crc kubenswrapper[4715]: E1009 07:48:22.136875 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:48:23 crc kubenswrapper[4715]: I1009 07:48:23.139321 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:48:23 crc kubenswrapper[4715]: I1009 07:48:23.139408 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:48:23 crc kubenswrapper[4715]: E1009 07:48:23.139620 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:48:23 crc kubenswrapper[4715]: I1009 07:48:23.139774 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:48:23 crc kubenswrapper[4715]: E1009 07:48:23.139845 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:48:23 crc kubenswrapper[4715]: E1009 07:48:23.139933 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:48:24 crc kubenswrapper[4715]: I1009 07:48:24.136563 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:48:24 crc kubenswrapper[4715]: E1009 07:48:24.136751 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fm6s2" podUID="9a8fb3b8-b254-4bc3-b105-990eac79c77b" Oct 09 07:48:25 crc kubenswrapper[4715]: I1009 07:48:25.136798 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:48:25 crc kubenswrapper[4715]: I1009 07:48:25.136835 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:48:25 crc kubenswrapper[4715]: E1009 07:48:25.137027 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 09 07:48:25 crc kubenswrapper[4715]: I1009 07:48:25.136835 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:48:25 crc kubenswrapper[4715]: E1009 07:48:25.137206 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 09 07:48:25 crc kubenswrapper[4715]: E1009 07:48:25.137341 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 09 07:48:26 crc kubenswrapper[4715]: I1009 07:48:26.136271 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:48:26 crc kubenswrapper[4715]: I1009 07:48:26.139630 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 09 07:48:26 crc kubenswrapper[4715]: I1009 07:48:26.144071 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 09 07:48:27 crc kubenswrapper[4715]: I1009 07:48:27.136669 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:48:27 crc kubenswrapper[4715]: I1009 07:48:27.136728 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:48:27 crc kubenswrapper[4715]: I1009 07:48:27.136737 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:48:27 crc kubenswrapper[4715]: I1009 07:48:27.139601 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 09 07:48:27 crc kubenswrapper[4715]: I1009 07:48:27.140992 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 09 07:48:27 crc kubenswrapper[4715]: I1009 07:48:27.141414 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 09 07:48:27 crc kubenswrapper[4715]: I1009 07:48:27.141964 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.251191 4715 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.300152 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-5tfh2"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.300961 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-5tfh2" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.301988 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-6t7zt"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.302906 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6t7zt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.304318 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cjrbk"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.304967 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cjrbk" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.305370 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6gwtn"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.306040 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6gwtn" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.310875 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.311286 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.311726 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.312034 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.312304 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.312894 4715 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.314630 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5v4sd"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.315798 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5v4sd" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.323829 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-z9cgk"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.324946 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z9cgk" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.327004 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xf4mc"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.327745 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xf4mc" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.329244 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.329634 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.329647 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.329736 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.329806 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.329820 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.329909 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.330013 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.330143 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.330254 4715 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"audit-1" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.330445 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.330495 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.330605 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.330708 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.330716 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.330804 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.330930 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.331024 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.331129 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.331240 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.331328 4715 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.331367 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.331453 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mn46"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.331964 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mn46" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.332017 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.332190 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.340720 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.339814 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-hbzph"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.354177 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.372668 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.374568 4715 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.375399 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.375592 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.375780 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.375892 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.376035 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.376316 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/67298510-55be-44bd-a0d7-0988939fdf66-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-5v4sd\" (UID: \"67298510-55be-44bd-a0d7-0988939fdf66\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5v4sd" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.376346 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lr8w\" (UniqueName: \"kubernetes.io/projected/bae5cd41-0015-4df3-bfe7-c2937a5938b6-kube-api-access-7lr8w\") pod \"machine-approver-56656f9798-6t7zt\" (UID: \"bae5cd41-0015-4df3-bfe7-c2937a5938b6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6t7zt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.376380 4715 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6e519e5-cb0b-40a4-a419-546ac0a3de69-serving-cert\") pod \"controller-manager-879f6c89f-xf4mc\" (UID: \"d6e519e5-cb0b-40a4-a419-546ac0a3de69\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xf4mc" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.376400 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d6e519e5-cb0b-40a4-a419-546ac0a3de69-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-xf4mc\" (UID: \"d6e519e5-cb0b-40a4-a419-546ac0a3de69\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xf4mc" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.376414 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-hbzph" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.376318 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-dfctz"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.376843 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.376433 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3985b442-52af-4652-a129-de4aa904321f-serving-cert\") pod \"apiserver-7bbb656c7d-6gwtn\" (UID: \"3985b442-52af-4652-a129-de4aa904321f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6gwtn" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.376957 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/bae5cd41-0015-4df3-bfe7-c2937a5938b6-config\") pod \"machine-approver-56656f9798-6t7zt\" (UID: \"bae5cd41-0015-4df3-bfe7-c2937a5938b6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6t7zt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.376998 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3985b442-52af-4652-a129-de4aa904321f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6gwtn\" (UID: \"3985b442-52af-4652-a129-de4aa904321f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6gwtn" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.377058 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3985b442-52af-4652-a129-de4aa904321f-encryption-config\") pod \"apiserver-7bbb656c7d-6gwtn\" (UID: \"3985b442-52af-4652-a129-de4aa904321f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6gwtn" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.377084 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/bae5cd41-0015-4df3-bfe7-c2937a5938b6-machine-approver-tls\") pod \"machine-approver-56656f9798-6t7zt\" (UID: \"bae5cd41-0015-4df3-bfe7-c2937a5938b6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6t7zt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.377113 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.377222 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.377398 4715 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.377532 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.377627 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-5fdhg"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.377674 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.377113 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfbvh\" (UniqueName: \"kubernetes.io/projected/67298510-55be-44bd-a0d7-0988939fdf66-kube-api-access-jfbvh\") pod \"cluster-samples-operator-665b6dd947-5v4sd\" (UID: \"67298510-55be-44bd-a0d7-0988939fdf66\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5v4sd" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.377775 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f7b7054-dbf2-4878-9ded-127719d0afb3-serving-cert\") pod \"openshift-config-operator-7777fb866f-z9cgk\" (UID: \"3f7b7054-dbf2-4878-9ded-127719d0afb3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-z9cgk" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.377800 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsd9j\" (UniqueName: \"kubernetes.io/projected/d6e519e5-cb0b-40a4-a419-546ac0a3de69-kube-api-access-bsd9j\") pod \"controller-manager-879f6c89f-xf4mc\" (UID: \"d6e519e5-cb0b-40a4-a419-546ac0a3de69\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-xf4mc" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.377817 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3985b442-52af-4652-a129-de4aa904321f-audit-dir\") pod \"apiserver-7bbb656c7d-6gwtn\" (UID: \"3985b442-52af-4652-a129-de4aa904321f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6gwtn" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.377845 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tph28\" (UniqueName: \"kubernetes.io/projected/3f7b7054-dbf2-4878-9ded-127719d0afb3-kube-api-access-tph28\") pod \"openshift-config-operator-7777fb866f-z9cgk\" (UID: \"3f7b7054-dbf2-4878-9ded-127719d0afb3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-z9cgk" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.377864 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6e519e5-cb0b-40a4-a419-546ac0a3de69-config\") pod \"controller-manager-879f6c89f-xf4mc\" (UID: \"d6e519e5-cb0b-40a4-a419-546ac0a3de69\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xf4mc" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.377883 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/749e25cc-b0d4-42b6-831c-d6af247de9f9-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-cjrbk\" (UID: \"749e25cc-b0d4-42b6-831c-d6af247de9f9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cjrbk" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.377900 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3985b442-52af-4652-a129-de4aa904321f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6gwtn\" (UID: \"3985b442-52af-4652-a129-de4aa904321f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6gwtn" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.377928 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/749e25cc-b0d4-42b6-831c-d6af247de9f9-config\") pod \"openshift-apiserver-operator-796bbdcf4f-cjrbk\" (UID: \"749e25cc-b0d4-42b6-831c-d6af247de9f9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cjrbk" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.377949 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93259cc2-6847-41dc-a61d-83e7b9e67f3a-config\") pod \"machine-api-operator-5694c8668f-5tfh2\" (UID: \"93259cc2-6847-41dc-a61d-83e7b9e67f3a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5tfh2" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.377972 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bae5cd41-0015-4df3-bfe7-c2937a5938b6-auth-proxy-config\") pod \"machine-approver-56656f9798-6t7zt\" (UID: \"bae5cd41-0015-4df3-bfe7-c2937a5938b6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6t7zt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.377993 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57xbw\" (UniqueName: \"kubernetes.io/projected/749e25cc-b0d4-42b6-831c-d6af247de9f9-kube-api-access-57xbw\") pod \"openshift-apiserver-operator-796bbdcf4f-cjrbk\" (UID: \"749e25cc-b0d4-42b6-831c-d6af247de9f9\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cjrbk" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.378014 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3985b442-52af-4652-a129-de4aa904321f-etcd-client\") pod \"apiserver-7bbb656c7d-6gwtn\" (UID: \"3985b442-52af-4652-a129-de4aa904321f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6gwtn" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.378025 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-5fdhg" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.378059 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3f7b7054-dbf2-4878-9ded-127719d0afb3-available-featuregates\") pod \"openshift-config-operator-7777fb866f-z9cgk\" (UID: \"3f7b7054-dbf2-4878-9ded-127719d0afb3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-z9cgk" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.378083 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6e519e5-cb0b-40a4-a419-546ac0a3de69-client-ca\") pod \"controller-manager-879f6c89f-xf4mc\" (UID: \"d6e519e5-cb0b-40a4-a419-546ac0a3de69\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xf4mc" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.378107 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmq75\" (UniqueName: \"kubernetes.io/projected/93259cc2-6847-41dc-a61d-83e7b9e67f3a-kube-api-access-pmq75\") pod \"machine-api-operator-5694c8668f-5tfh2\" (UID: \"93259cc2-6847-41dc-a61d-83e7b9e67f3a\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-5tfh2" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.378131 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/93259cc2-6847-41dc-a61d-83e7b9e67f3a-images\") pod \"machine-api-operator-5694c8668f-5tfh2\" (UID: \"93259cc2-6847-41dc-a61d-83e7b9e67f3a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5tfh2" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.378158 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3985b442-52af-4652-a129-de4aa904321f-audit-policies\") pod \"apiserver-7bbb656c7d-6gwtn\" (UID: \"3985b442-52af-4652-a129-de4aa904321f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6gwtn" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.378181 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/93259cc2-6847-41dc-a61d-83e7b9e67f3a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-5tfh2\" (UID: \"93259cc2-6847-41dc-a61d-83e7b9e67f3a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5tfh2" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.378198 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz6qx\" (UniqueName: \"kubernetes.io/projected/3985b442-52af-4652-a129-de4aa904321f-kube-api-access-sz6qx\") pod \"apiserver-7bbb656c7d-6gwtn\" (UID: \"3985b442-52af-4652-a129-de4aa904321f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6gwtn" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.379221 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-dfctz" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.379897 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-q5ck7"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.380319 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.380522 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.381460 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xh68m"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.382601 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-xh68m" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.383340 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-sq956"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.384088 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-sq956" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.386391 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-trqg4"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.387209 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-w58s7"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.387851 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-lr9f9"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.388380 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-lr9f9" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.388972 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-trqg4" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.389351 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-w58s7" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.392039 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.394533 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l6l2m"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.395081 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l6l2m" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.396304 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.396633 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.397866 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-prhvm"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.398538 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-btd5r"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.398953 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-btd5r" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.399240 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-prhvm" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.399427 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-b4lqp"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.399994 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.400438 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-b4lqp" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.416140 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.416849 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.417147 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.417331 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.417549 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.417757 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.417807 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.417962 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.417995 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.418041 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.418115 4715 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.418143 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.418251 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.418365 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.418481 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.418117 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.418592 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.418701 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.418830 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.418989 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.418254 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.419217 4715 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.419261 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.419337 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.419489 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.419607 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.419701 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.423227 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.423313 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.423390 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.423513 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.423521 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.431591 4715 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication-operator"/"serving-cert" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.433285 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.435100 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.435236 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.435380 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.439412 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.442004 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.442368 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.444284 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.451032 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-txj6v"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.451199 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 09 07:48:29 crc kubenswrapper[4715]: 
I1009 07:48:29.451787 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.452077 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.452232 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.453745 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-5tfh2"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.453786 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9grfz"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.454159 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.454308 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.454352 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9grfz" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.454468 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.454602 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.456003 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.458532 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-94mjv"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.459275 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-qxqln"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.461564 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-94mjv" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.467661 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.468482 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.468515 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.469533 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nz9jb"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.470217 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nz9jb" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.470299 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.470474 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qxqln" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.470714 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sxdhp"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.471408 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sxdhp" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.471882 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-hbn7p"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.472478 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hbn7p" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.472920 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-brr9q"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.472931 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.473364 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-brr9q" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.475362 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.475563 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-4gncq"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.476292 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ttk8w"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.477448 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-4gncq" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.478696 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfbvh\" (UniqueName: \"kubernetes.io/projected/67298510-55be-44bd-a0d7-0988939fdf66-kube-api-access-jfbvh\") pod \"cluster-samples-operator-665b6dd947-5v4sd\" (UID: \"67298510-55be-44bd-a0d7-0988939fdf66\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5v4sd" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.478726 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f7b7054-dbf2-4878-9ded-127719d0afb3-serving-cert\") pod \"openshift-config-operator-7777fb866f-z9cgk\" (UID: \"3f7b7054-dbf2-4878-9ded-127719d0afb3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-z9cgk" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.478753 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/7d543303-0d6d-4c3d-bb4a-bb216d9def25-etcd-ca\") pod \"etcd-operator-b45778765-lr9f9\" (UID: \"7d543303-0d6d-4c3d-bb4a-bb216d9def25\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lr9f9" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.478770 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3c1c9983-60a8-4db2-866c-15deb7220cb9-service-ca\") pod \"console-f9d7485db-5fdhg\" (UID: \"3c1c9983-60a8-4db2-866c-15deb7220cb9\") " pod="openshift-console/console-f9d7485db-5fdhg" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.478789 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsd9j\" (UniqueName: 
\"kubernetes.io/projected/d6e519e5-cb0b-40a4-a419-546ac0a3de69-kube-api-access-bsd9j\") pod \"controller-manager-879f6c89f-xf4mc\" (UID: \"d6e519e5-cb0b-40a4-a419-546ac0a3de69\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xf4mc" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.478808 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3985b442-52af-4652-a129-de4aa904321f-audit-dir\") pod \"apiserver-7bbb656c7d-6gwtn\" (UID: \"3985b442-52af-4652-a129-de4aa904321f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6gwtn" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.478873 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tph28\" (UniqueName: \"kubernetes.io/projected/3f7b7054-dbf2-4878-9ded-127719d0afb3-kube-api-access-tph28\") pod \"openshift-config-operator-7777fb866f-z9cgk\" (UID: \"3f7b7054-dbf2-4878-9ded-127719d0afb3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-z9cgk" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.478954 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6e519e5-cb0b-40a4-a419-546ac0a3de69-config\") pod \"controller-manager-879f6c89f-xf4mc\" (UID: \"d6e519e5-cb0b-40a4-a419-546ac0a3de69\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xf4mc" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.478996 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/749e25cc-b0d4-42b6-831c-d6af247de9f9-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-cjrbk\" (UID: \"749e25cc-b0d4-42b6-831c-d6af247de9f9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cjrbk" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.479023 4715 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3985b442-52af-4652-a129-de4aa904321f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6gwtn\" (UID: \"3985b442-52af-4652-a129-de4aa904321f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6gwtn" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.479034 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vwlz"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.479063 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/749e25cc-b0d4-42b6-831c-d6af247de9f9-config\") pod \"openshift-apiserver-operator-796bbdcf4f-cjrbk\" (UID: \"749e25cc-b0d4-42b6-831c-d6af247de9f9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cjrbk" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.479086 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93259cc2-6847-41dc-a61d-83e7b9e67f3a-config\") pod \"machine-api-operator-5694c8668f-5tfh2\" (UID: \"93259cc2-6847-41dc-a61d-83e7b9e67f3a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5tfh2" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.479108 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3c1c9983-60a8-4db2-866c-15deb7220cb9-console-oauth-config\") pod \"console-f9d7485db-5fdhg\" (UID: \"3c1c9983-60a8-4db2-866c-15deb7220cb9\") " pod="openshift-console/console-f9d7485db-5fdhg" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.479130 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/bae5cd41-0015-4df3-bfe7-c2937a5938b6-auth-proxy-config\") pod \"machine-approver-56656f9798-6t7zt\" (UID: \"bae5cd41-0015-4df3-bfe7-c2937a5938b6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6t7zt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.479149 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57xbw\" (UniqueName: \"kubernetes.io/projected/749e25cc-b0d4-42b6-831c-d6af247de9f9-kube-api-access-57xbw\") pod \"openshift-apiserver-operator-796bbdcf4f-cjrbk\" (UID: \"749e25cc-b0d4-42b6-831c-d6af247de9f9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cjrbk" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.479168 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3985b442-52af-4652-a129-de4aa904321f-etcd-client\") pod \"apiserver-7bbb656c7d-6gwtn\" (UID: \"3985b442-52af-4652-a129-de4aa904321f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6gwtn" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.479179 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ttk8w" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.479199 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d543303-0d6d-4c3d-bb4a-bb216d9def25-serving-cert\") pod \"etcd-operator-b45778765-lr9f9\" (UID: \"7d543303-0d6d-4c3d-bb4a-bb216d9def25\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lr9f9" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.479224 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3f7b7054-dbf2-4878-9ded-127719d0afb3-available-featuregates\") pod \"openshift-config-operator-7777fb866f-z9cgk\" (UID: \"3f7b7054-dbf2-4878-9ded-127719d0afb3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-z9cgk" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.479240 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6e519e5-cb0b-40a4-a419-546ac0a3de69-client-ca\") pod \"controller-manager-879f6c89f-xf4mc\" (UID: \"d6e519e5-cb0b-40a4-a419-546ac0a3de69\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xf4mc" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.479259 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmq75\" (UniqueName: \"kubernetes.io/projected/93259cc2-6847-41dc-a61d-83e7b9e67f3a-kube-api-access-pmq75\") pod \"machine-api-operator-5694c8668f-5tfh2\" (UID: \"93259cc2-6847-41dc-a61d-83e7b9e67f3a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5tfh2" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.479276 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" 
(UniqueName: \"kubernetes.io/secret/7d543303-0d6d-4c3d-bb4a-bb216d9def25-etcd-client\") pod \"etcd-operator-b45778765-lr9f9\" (UID: \"7d543303-0d6d-4c3d-bb4a-bb216d9def25\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lr9f9" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.479291 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/93259cc2-6847-41dc-a61d-83e7b9e67f3a-images\") pod \"machine-api-operator-5694c8668f-5tfh2\" (UID: \"93259cc2-6847-41dc-a61d-83e7b9e67f3a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5tfh2" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.479307 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3c1c9983-60a8-4db2-866c-15deb7220cb9-oauth-serving-cert\") pod \"console-f9d7485db-5fdhg\" (UID: \"3c1c9983-60a8-4db2-866c-15deb7220cb9\") " pod="openshift-console/console-f9d7485db-5fdhg" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.479325 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b34aa1fd-b226-4e6d-8854-786cb7f5dc67-metrics-tls\") pod \"dns-operator-744455d44c-w58s7\" (UID: \"b34aa1fd-b226-4e6d-8854-786cb7f5dc67\") " pod="openshift-dns-operator/dns-operator-744455d44c-w58s7" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.479355 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3985b442-52af-4652-a129-de4aa904321f-audit-policies\") pod \"apiserver-7bbb656c7d-6gwtn\" (UID: \"3985b442-52af-4652-a129-de4aa904321f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6gwtn" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.479375 4715 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d543303-0d6d-4c3d-bb4a-bb216d9def25-config\") pod \"etcd-operator-b45778765-lr9f9\" (UID: \"7d543303-0d6d-4c3d-bb4a-bb216d9def25\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lr9f9" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.479395 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/7d543303-0d6d-4c3d-bb4a-bb216d9def25-etcd-service-ca\") pod \"etcd-operator-b45778765-lr9f9\" (UID: \"7d543303-0d6d-4c3d-bb4a-bb216d9def25\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lr9f9" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.479435 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sz6qx\" (UniqueName: \"kubernetes.io/projected/3985b442-52af-4652-a129-de4aa904321f-kube-api-access-sz6qx\") pod \"apiserver-7bbb656c7d-6gwtn\" (UID: \"3985b442-52af-4652-a129-de4aa904321f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6gwtn" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.479470 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/93259cc2-6847-41dc-a61d-83e7b9e67f3a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-5tfh2\" (UID: \"93259cc2-6847-41dc-a61d-83e7b9e67f3a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5tfh2" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.479488 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c1c9983-60a8-4db2-866c-15deb7220cb9-trusted-ca-bundle\") pod \"console-f9d7485db-5fdhg\" (UID: \"3c1c9983-60a8-4db2-866c-15deb7220cb9\") " 
pod="openshift-console/console-f9d7485db-5fdhg" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.479505 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb2bg\" (UniqueName: \"kubernetes.io/projected/b34aa1fd-b226-4e6d-8854-786cb7f5dc67-kube-api-access-wb2bg\") pod \"dns-operator-744455d44c-w58s7\" (UID: \"b34aa1fd-b226-4e6d-8854-786cb7f5dc67\") " pod="openshift-dns-operator/dns-operator-744455d44c-w58s7" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.479529 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/67298510-55be-44bd-a0d7-0988939fdf66-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-5v4sd\" (UID: \"67298510-55be-44bd-a0d7-0988939fdf66\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5v4sd" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.479549 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3c1c9983-60a8-4db2-866c-15deb7220cb9-console-serving-cert\") pod \"console-f9d7485db-5fdhg\" (UID: \"3c1c9983-60a8-4db2-866c-15deb7220cb9\") " pod="openshift-console/console-f9d7485db-5fdhg" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.479572 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lr8w\" (UniqueName: \"kubernetes.io/projected/bae5cd41-0015-4df3-bfe7-c2937a5938b6-kube-api-access-7lr8w\") pod \"machine-approver-56656f9798-6t7zt\" (UID: \"bae5cd41-0015-4df3-bfe7-c2937a5938b6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6t7zt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.479590 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/3c1c9983-60a8-4db2-866c-15deb7220cb9-console-config\") pod \"console-f9d7485db-5fdhg\" (UID: \"3c1c9983-60a8-4db2-866c-15deb7220cb9\") " pod="openshift-console/console-f9d7485db-5fdhg" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.479610 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6e519e5-cb0b-40a4-a419-546ac0a3de69-serving-cert\") pod \"controller-manager-879f6c89f-xf4mc\" (UID: \"d6e519e5-cb0b-40a4-a419-546ac0a3de69\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xf4mc" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.479626 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qm46\" (UniqueName: \"kubernetes.io/projected/3c1c9983-60a8-4db2-866c-15deb7220cb9-kube-api-access-6qm46\") pod \"console-f9d7485db-5fdhg\" (UID: \"3c1c9983-60a8-4db2-866c-15deb7220cb9\") " pod="openshift-console/console-f9d7485db-5fdhg" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.479647 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d6e519e5-cb0b-40a4-a419-546ac0a3de69-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-xf4mc\" (UID: \"d6e519e5-cb0b-40a4-a419-546ac0a3de69\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xf4mc" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.479663 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3985b442-52af-4652-a129-de4aa904321f-serving-cert\") pod \"apiserver-7bbb656c7d-6gwtn\" (UID: \"3985b442-52af-4652-a129-de4aa904321f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6gwtn" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.479691 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/bae5cd41-0015-4df3-bfe7-c2937a5938b6-config\") pod \"machine-approver-56656f9798-6t7zt\" (UID: \"bae5cd41-0015-4df3-bfe7-c2937a5938b6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6t7zt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.479707 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3985b442-52af-4652-a129-de4aa904321f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6gwtn\" (UID: \"3985b442-52af-4652-a129-de4aa904321f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6gwtn" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.479725 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3985b442-52af-4652-a129-de4aa904321f-encryption-config\") pod \"apiserver-7bbb656c7d-6gwtn\" (UID: \"3985b442-52af-4652-a129-de4aa904321f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6gwtn" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.479742 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wphrc\" (UniqueName: \"kubernetes.io/projected/7d543303-0d6d-4c3d-bb4a-bb216d9def25-kube-api-access-wphrc\") pod \"etcd-operator-b45778765-lr9f9\" (UID: \"7d543303-0d6d-4c3d-bb4a-bb216d9def25\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lr9f9" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.479764 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/bae5cd41-0015-4df3-bfe7-c2937a5938b6-machine-approver-tls\") pod \"machine-approver-56656f9798-6t7zt\" (UID: \"bae5cd41-0015-4df3-bfe7-c2937a5938b6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6t7zt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 
07:48:29.479853 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3985b442-52af-4652-a129-de4aa904321f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6gwtn\" (UID: \"3985b442-52af-4652-a129-de4aa904321f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6gwtn" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.479899 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vwlz" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.480286 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.480368 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6e519e5-cb0b-40a4-a419-546ac0a3de69-config\") pod \"controller-manager-879f6c89f-xf4mc\" (UID: \"d6e519e5-cb0b-40a4-a419-546ac0a3de69\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xf4mc" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.479045 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3985b442-52af-4652-a129-de4aa904321f-audit-dir\") pod \"apiserver-7bbb656c7d-6gwtn\" (UID: \"3985b442-52af-4652-a129-de4aa904321f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6gwtn" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.481127 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/749e25cc-b0d4-42b6-831c-d6af247de9f9-config\") pod \"openshift-apiserver-operator-796bbdcf4f-cjrbk\" (UID: \"749e25cc-b0d4-42b6-831c-d6af247de9f9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cjrbk" Oct 09 07:48:29 crc 
kubenswrapper[4715]: I1009 07:48:29.481305 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3f7b7054-dbf2-4878-9ded-127719d0afb3-available-featuregates\") pod \"openshift-config-operator-7777fb866f-z9cgk\" (UID: \"3f7b7054-dbf2-4878-9ded-127719d0afb3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-z9cgk" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.481772 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93259cc2-6847-41dc-a61d-83e7b9e67f3a-config\") pod \"machine-api-operator-5694c8668f-5tfh2\" (UID: \"93259cc2-6847-41dc-a61d-83e7b9e67f3a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5tfh2" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.483504 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bae5cd41-0015-4df3-bfe7-c2937a5938b6-auth-proxy-config\") pod \"machine-approver-56656f9798-6t7zt\" (UID: \"bae5cd41-0015-4df3-bfe7-c2937a5938b6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6t7zt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.483401 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/93259cc2-6847-41dc-a61d-83e7b9e67f3a-images\") pod \"machine-api-operator-5694c8668f-5tfh2\" (UID: \"93259cc2-6847-41dc-a61d-83e7b9e67f3a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5tfh2" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.483944 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3985b442-52af-4652-a129-de4aa904321f-audit-policies\") pod \"apiserver-7bbb656c7d-6gwtn\" (UID: \"3985b442-52af-4652-a129-de4aa904321f\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6gwtn" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.484980 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6e519e5-cb0b-40a4-a419-546ac0a3de69-client-ca\") pod \"controller-manager-879f6c89f-xf4mc\" (UID: \"d6e519e5-cb0b-40a4-a419-546ac0a3de69\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xf4mc" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.484702 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bae5cd41-0015-4df3-bfe7-c2937a5938b6-config\") pod \"machine-approver-56656f9798-6t7zt\" (UID: \"bae5cd41-0015-4df3-bfe7-c2937a5938b6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6t7zt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.486690 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3985b442-52af-4652-a129-de4aa904321f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6gwtn\" (UID: \"3985b442-52af-4652-a129-de4aa904321f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6gwtn" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.488782 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3985b442-52af-4652-a129-de4aa904321f-etcd-client\") pod \"apiserver-7bbb656c7d-6gwtn\" (UID: \"3985b442-52af-4652-a129-de4aa904321f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6gwtn" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.489285 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f7b7054-dbf2-4878-9ded-127719d0afb3-serving-cert\") pod \"openshift-config-operator-7777fb866f-z9cgk\" (UID: \"3f7b7054-dbf2-4878-9ded-127719d0afb3\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-z9cgk" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.489397 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/67298510-55be-44bd-a0d7-0988939fdf66-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-5v4sd\" (UID: \"67298510-55be-44bd-a0d7-0988939fdf66\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5v4sd" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.490221 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/93259cc2-6847-41dc-a61d-83e7b9e67f3a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-5tfh2\" (UID: \"93259cc2-6847-41dc-a61d-83e7b9e67f3a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5tfh2" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.490565 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d6e519e5-cb0b-40a4-a419-546ac0a3de69-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-xf4mc\" (UID: \"d6e519e5-cb0b-40a4-a419-546ac0a3de69\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xf4mc" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.492228 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.493301 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3985b442-52af-4652-a129-de4aa904321f-encryption-config\") pod \"apiserver-7bbb656c7d-6gwtn\" (UID: \"3985b442-52af-4652-a129-de4aa904321f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6gwtn" Oct 09 07:48:29 crc kubenswrapper[4715]: 
I1009 07:48:29.493344 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/749e25cc-b0d4-42b6-831c-d6af247de9f9-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-cjrbk\" (UID: \"749e25cc-b0d4-42b6-831c-d6af247de9f9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cjrbk" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.494639 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sw6f9"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.495286 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6e519e5-cb0b-40a4-a419-546ac0a3de69-serving-cert\") pod \"controller-manager-879f6c89f-xf4mc\" (UID: \"d6e519e5-cb0b-40a4-a419-546ac0a3de69\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xf4mc" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.495406 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sw6f9" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.501287 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3985b442-52af-4652-a129-de4aa904321f-serving-cert\") pod \"apiserver-7bbb656c7d-6gwtn\" (UID: \"3985b442-52af-4652-a129-de4aa904321f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6gwtn" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.504310 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-w79tf"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.505522 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333265-qxg85"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.505815 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/bae5cd41-0015-4df3-bfe7-c2937a5938b6-machine-approver-tls\") pod \"machine-approver-56656f9798-6t7zt\" (UID: \"bae5cd41-0015-4df3-bfe7-c2937a5938b6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6t7zt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.506450 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-qf4bm"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.507351 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-qf4bm" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.508106 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w79tf" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.508356 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333265-qxg85" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.508667 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-zrjcb"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.509271 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.509729 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zrjcb" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.517469 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hzbn9"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.518639 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hzbn9" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.521648 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6gwtn"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.526578 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.526758 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-cjqqc"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.527517 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-cjqqc" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.530511 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-575dw"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.531486 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-575dw" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.531612 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-z9cgk"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.533155 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5v4sd"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.535941 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-trqg4"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.537784 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xf4mc"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.539581 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9grfz"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.542031 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-5fdhg"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.543199 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.547274 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l6l2m"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.548644 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mn46"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.549691 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-qxqln"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.550882 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cjrbk"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.551846 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-brr9q"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.553950 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nz9jb"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.556319 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-q5ck7"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.557310 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-w58s7"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.558525 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-hbzph"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.559975 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-94mjv"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.561325 4715 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-console/downloads-7954f5f757-sq956"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.562571 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-dfctz"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.563710 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-prhvm"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.563831 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.565239 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333265-qxg85"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.566744 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-lr9f9"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.568160 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-7wthp"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.569195 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-7wthp" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.569787 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xh68m"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.571592 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sxdhp"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.573073 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-btd5r"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.574785 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ttk8w"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.577492 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-hbn7p"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.578763 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7wthp"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.578837 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-txj6v"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.579737 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-4gncq"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.580295 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3c1c9983-60a8-4db2-866c-15deb7220cb9-console-serving-cert\") pod \"console-f9d7485db-5fdhg\" (UID: \"3c1c9983-60a8-4db2-866c-15deb7220cb9\") " pod="openshift-console/console-f9d7485db-5fdhg" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 
07:48:29.580433 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3c1c9983-60a8-4db2-866c-15deb7220cb9-console-config\") pod \"console-f9d7485db-5fdhg\" (UID: \"3c1c9983-60a8-4db2-866c-15deb7220cb9\") " pod="openshift-console/console-f9d7485db-5fdhg" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.580557 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qm46\" (UniqueName: \"kubernetes.io/projected/3c1c9983-60a8-4db2-866c-15deb7220cb9-kube-api-access-6qm46\") pod \"console-f9d7485db-5fdhg\" (UID: \"3c1c9983-60a8-4db2-866c-15deb7220cb9\") " pod="openshift-console/console-f9d7485db-5fdhg" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.580688 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wphrc\" (UniqueName: \"kubernetes.io/projected/7d543303-0d6d-4c3d-bb4a-bb216d9def25-kube-api-access-wphrc\") pod \"etcd-operator-b45778765-lr9f9\" (UID: \"7d543303-0d6d-4c3d-bb4a-bb216d9def25\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lr9f9" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.580790 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/7d543303-0d6d-4c3d-bb4a-bb216d9def25-etcd-ca\") pod \"etcd-operator-b45778765-lr9f9\" (UID: \"7d543303-0d6d-4c3d-bb4a-bb216d9def25\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lr9f9" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.580888 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3c1c9983-60a8-4db2-866c-15deb7220cb9-service-ca\") pod \"console-f9d7485db-5fdhg\" (UID: \"3c1c9983-60a8-4db2-866c-15deb7220cb9\") " pod="openshift-console/console-f9d7485db-5fdhg" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.581020 4715 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3c1c9983-60a8-4db2-866c-15deb7220cb9-console-oauth-config\") pod \"console-f9d7485db-5fdhg\" (UID: \"3c1c9983-60a8-4db2-866c-15deb7220cb9\") " pod="openshift-console/console-f9d7485db-5fdhg" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.581148 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d543303-0d6d-4c3d-bb4a-bb216d9def25-serving-cert\") pod \"etcd-operator-b45778765-lr9f9\" (UID: \"7d543303-0d6d-4c3d-bb4a-bb216d9def25\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lr9f9" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.581316 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7d543303-0d6d-4c3d-bb4a-bb216d9def25-etcd-client\") pod \"etcd-operator-b45778765-lr9f9\" (UID: \"7d543303-0d6d-4c3d-bb4a-bb216d9def25\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lr9f9" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.581389 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3c1c9983-60a8-4db2-866c-15deb7220cb9-console-config\") pod \"console-f9d7485db-5fdhg\" (UID: \"3c1c9983-60a8-4db2-866c-15deb7220cb9\") " pod="openshift-console/console-f9d7485db-5fdhg" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.581479 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/7d543303-0d6d-4c3d-bb4a-bb216d9def25-etcd-ca\") pod \"etcd-operator-b45778765-lr9f9\" (UID: \"7d543303-0d6d-4c3d-bb4a-bb216d9def25\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lr9f9" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.581407 4715 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3c1c9983-60a8-4db2-866c-15deb7220cb9-oauth-serving-cert\") pod \"console-f9d7485db-5fdhg\" (UID: \"3c1c9983-60a8-4db2-866c-15deb7220cb9\") " pod="openshift-console/console-f9d7485db-5fdhg" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.581622 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b34aa1fd-b226-4e6d-8854-786cb7f5dc67-metrics-tls\") pod \"dns-operator-744455d44c-w58s7\" (UID: \"b34aa1fd-b226-4e6d-8854-786cb7f5dc67\") " pod="openshift-dns-operator/dns-operator-744455d44c-w58s7" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.581706 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d543303-0d6d-4c3d-bb4a-bb216d9def25-config\") pod \"etcd-operator-b45778765-lr9f9\" (UID: \"7d543303-0d6d-4c3d-bb4a-bb216d9def25\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lr9f9" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.581790 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/7d543303-0d6d-4c3d-bb4a-bb216d9def25-etcd-service-ca\") pod \"etcd-operator-b45778765-lr9f9\" (UID: \"7d543303-0d6d-4c3d-bb4a-bb216d9def25\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lr9f9" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.580804 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-575dw"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.581891 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c1c9983-60a8-4db2-866c-15deb7220cb9-trusted-ca-bundle\") pod \"console-f9d7485db-5fdhg\" (UID: 
\"3c1c9983-60a8-4db2-866c-15deb7220cb9\") " pod="openshift-console/console-f9d7485db-5fdhg" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.582055 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wb2bg\" (UniqueName: \"kubernetes.io/projected/b34aa1fd-b226-4e6d-8854-786cb7f5dc67-kube-api-access-wb2bg\") pod \"dns-operator-744455d44c-w58s7\" (UID: \"b34aa1fd-b226-4e6d-8854-786cb7f5dc67\") " pod="openshift-dns-operator/dns-operator-744455d44c-w58s7" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.582178 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3c1c9983-60a8-4db2-866c-15deb7220cb9-service-ca\") pod \"console-f9d7485db-5fdhg\" (UID: \"3c1c9983-60a8-4db2-866c-15deb7220cb9\") " pod="openshift-console/console-f9d7485db-5fdhg" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.582471 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vwlz"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.583523 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hzbn9"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.583708 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3c1c9983-60a8-4db2-866c-15deb7220cb9-oauth-serving-cert\") pod \"console-f9d7485db-5fdhg\" (UID: \"3c1c9983-60a8-4db2-866c-15deb7220cb9\") " pod="openshift-console/console-f9d7485db-5fdhg" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.583768 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d543303-0d6d-4c3d-bb4a-bb216d9def25-config\") pod \"etcd-operator-b45778765-lr9f9\" (UID: \"7d543303-0d6d-4c3d-bb4a-bb216d9def25\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-lr9f9" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.584111 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3c1c9983-60a8-4db2-866c-15deb7220cb9-console-serving-cert\") pod \"console-f9d7485db-5fdhg\" (UID: \"3c1c9983-60a8-4db2-866c-15deb7220cb9\") " pod="openshift-console/console-f9d7485db-5fdhg" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.585220 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-w79tf"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.585403 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c1c9983-60a8-4db2-866c-15deb7220cb9-trusted-ca-bundle\") pod \"console-f9d7485db-5fdhg\" (UID: \"3c1c9983-60a8-4db2-866c-15deb7220cb9\") " pod="openshift-console/console-f9d7485db-5fdhg" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.585459 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/7d543303-0d6d-4c3d-bb4a-bb216d9def25-etcd-service-ca\") pod \"etcd-operator-b45778765-lr9f9\" (UID: \"7d543303-0d6d-4c3d-bb4a-bb216d9def25\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lr9f9" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.585814 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7d543303-0d6d-4c3d-bb4a-bb216d9def25-etcd-client\") pod \"etcd-operator-b45778765-lr9f9\" (UID: \"7d543303-0d6d-4c3d-bb4a-bb216d9def25\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lr9f9" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.586048 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7d543303-0d6d-4c3d-bb4a-bb216d9def25-serving-cert\") pod \"etcd-operator-b45778765-lr9f9\" (UID: \"7d543303-0d6d-4c3d-bb4a-bb216d9def25\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lr9f9" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.586128 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.586916 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-zrjcb"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.587477 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3c1c9983-60a8-4db2-866c-15deb7220cb9-console-oauth-config\") pod \"console-f9d7485db-5fdhg\" (UID: \"3c1c9983-60a8-4db2-866c-15deb7220cb9\") " pod="openshift-console/console-f9d7485db-5fdhg" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.588186 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-qf4bm"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.589501 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sw6f9"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.589903 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-m526v"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.590732 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-m526v" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.591053 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-m526v"] Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.595826 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b34aa1fd-b226-4e6d-8854-786cb7f5dc67-metrics-tls\") pod \"dns-operator-744455d44c-w58s7\" (UID: \"b34aa1fd-b226-4e6d-8854-786cb7f5dc67\") " pod="openshift-dns-operator/dns-operator-744455d44c-w58s7" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.604330 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.636998 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.644511 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.663961 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.684870 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.709317 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.723688 4715 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.742943 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.763774 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.785979 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.804388 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.822849 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.843080 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.863276 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.884563 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.903545 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.924491 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.964113 4715 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 09 07:48:29 crc kubenswrapper[4715]: I1009 07:48:29.984686 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.004769 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.023704 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.043676 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.064828 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.085195 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.104400 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.128235 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.144896 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.163724 4715 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.184972 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.203901 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.224377 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.244782 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.265033 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.284366 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.304676 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.344131 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.364687 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.383711 4715 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.404708 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.424316 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.444011 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.463855 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.482963 4715 request.go:700] Waited for 1.008653191s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/secrets?fieldSelector=metadata.name%3Dserving-cert&limit=500&resourceVersion=0 Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.485283 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.505181 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.526901 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.566105 4715 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tph28\" (UniqueName: \"kubernetes.io/projected/3f7b7054-dbf2-4878-9ded-127719d0afb3-kube-api-access-tph28\") pod \"openshift-config-operator-7777fb866f-z9cgk\" (UID: \"3f7b7054-dbf2-4878-9ded-127719d0afb3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-z9cgk" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.586544 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfbvh\" (UniqueName: \"kubernetes.io/projected/67298510-55be-44bd-a0d7-0988939fdf66-kube-api-access-jfbvh\") pod \"cluster-samples-operator-665b6dd947-5v4sd\" (UID: \"67298510-55be-44bd-a0d7-0988939fdf66\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5v4sd" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.603754 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.605329 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsd9j\" (UniqueName: \"kubernetes.io/projected/d6e519e5-cb0b-40a4-a419-546ac0a3de69-kube-api-access-bsd9j\") pod \"controller-manager-879f6c89f-xf4mc\" (UID: \"d6e519e5-cb0b-40a4-a419-546ac0a3de69\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xf4mc" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.622238 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5v4sd" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.624901 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.642680 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z9cgk" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.667328 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xf4mc" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.670609 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57xbw\" (UniqueName: \"kubernetes.io/projected/749e25cc-b0d4-42b6-831c-d6af247de9f9-kube-api-access-57xbw\") pod \"openshift-apiserver-operator-796bbdcf4f-cjrbk\" (UID: \"749e25cc-b0d4-42b6-831c-d6af247de9f9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cjrbk" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.692156 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmq75\" (UniqueName: \"kubernetes.io/projected/93259cc2-6847-41dc-a61d-83e7b9e67f3a-kube-api-access-pmq75\") pod \"machine-api-operator-5694c8668f-5tfh2\" (UID: \"93259cc2-6847-41dc-a61d-83e7b9e67f3a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-5tfh2" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.706486 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz6qx\" (UniqueName: \"kubernetes.io/projected/3985b442-52af-4652-a129-de4aa904321f-kube-api-access-sz6qx\") pod \"apiserver-7bbb656c7d-6gwtn\" (UID: \"3985b442-52af-4652-a129-de4aa904321f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6gwtn" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.721592 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lr8w\" (UniqueName: \"kubernetes.io/projected/bae5cd41-0015-4df3-bfe7-c2937a5938b6-kube-api-access-7lr8w\") pod \"machine-approver-56656f9798-6t7zt\" (UID: \"bae5cd41-0015-4df3-bfe7-c2937a5938b6\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6t7zt" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.724461 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.744610 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.763812 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.786891 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.806104 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.824754 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.829913 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-5tfh2" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.845713 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.853642 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6t7zt" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.864096 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.884666 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.893202 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5v4sd"] Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.900599 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cjrbk" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.903304 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.909538 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6gwtn" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.910822 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-z9cgk"] Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.914905 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xf4mc"] Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.918514 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6t7zt" event={"ID":"bae5cd41-0015-4df3-bfe7-c2937a5938b6","Type":"ContainerStarted","Data":"1006aba501aab27176ef7dc5d30362f99763c77525bfdc6f7976a360aeb3dde2"} Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.926966 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 09 07:48:30 crc kubenswrapper[4715]: W1009 07:48:30.936346 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f7b7054_dbf2_4878_9ded_127719d0afb3.slice/crio-a5ae036a60afc9177a0ff7693a5f98cb262a91450a1fe9f2682d3ab583d557fb WatchSource:0}: Error finding container a5ae036a60afc9177a0ff7693a5f98cb262a91450a1fe9f2682d3ab583d557fb: Status 404 returned error can't find the container with id a5ae036a60afc9177a0ff7693a5f98cb262a91450a1fe9f2682d3ab583d557fb Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.980556 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.985880 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 09 07:48:30 crc kubenswrapper[4715]: I1009 07:48:30.990160 4715 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 09 07:48:30 crc kubenswrapper[4715]: W1009 07:48:30.991430 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6e519e5_cb0b_40a4_a419_546ac0a3de69.slice/crio-0a6adfec2af2a3af63d4ff62fe19b60f2df022734841a166b3c60e5fbb3f2e78 WatchSource:0}: Error finding container 0a6adfec2af2a3af63d4ff62fe19b60f2df022734841a166b3c60e5fbb3f2e78: Status 404 returned error can't find the container with id 0a6adfec2af2a3af63d4ff62fe19b60f2df022734841a166b3c60e5fbb3f2e78 Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.021813 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.023656 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.047065 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.071479 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.086720 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.108406 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-5tfh2"] Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.118966 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 
07:48:31.124384 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.143970 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.164748 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 09 07:48:31 crc kubenswrapper[4715]: W1009 07:48:31.169630 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93259cc2_6847_41dc_a61d_83e7b9e67f3a.slice/crio-8384d15f2a59e3306405f41dda2d8cdf954e85364dc60699d1f5bee2e431e194 WatchSource:0}: Error finding container 8384d15f2a59e3306405f41dda2d8cdf954e85364dc60699d1f5bee2e431e194: Status 404 returned error can't find the container with id 8384d15f2a59e3306405f41dda2d8cdf954e85364dc60699d1f5bee2e431e194 Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.184748 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.203339 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cjrbk"] Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.205748 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.215983 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6gwtn"] Oct 09 07:48:31 crc kubenswrapper[4715]: W1009 07:48:31.217130 4715 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod749e25cc_b0d4_42b6_831c_d6af247de9f9.slice/crio-81b95c004ebeddd2aa1feb02d33865f6f768b6cfe91f1acc505e5cdd1dc62137 WatchSource:0}: Error finding container 81b95c004ebeddd2aa1feb02d33865f6f768b6cfe91f1acc505e5cdd1dc62137: Status 404 returned error can't find the container with id 81b95c004ebeddd2aa1feb02d33865f6f768b6cfe91f1acc505e5cdd1dc62137 Oct 09 07:48:31 crc kubenswrapper[4715]: W1009 07:48:31.232010 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3985b442_52af_4652_a129_de4aa904321f.slice/crio-5ae925b5c186b9fd382aba3d2ccc4f95d8b7e512f8d3278c178b772efe745c83 WatchSource:0}: Error finding container 5ae925b5c186b9fd382aba3d2ccc4f95d8b7e512f8d3278c178b772efe745c83: Status 404 returned error can't find the container with id 5ae925b5c186b9fd382aba3d2ccc4f95d8b7e512f8d3278c178b772efe745c83 Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.236827 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.243475 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.264499 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.284105 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.304163 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.323910 4715 reflector.go:368] Caches 
populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.344875 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.363987 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.384388 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.406797 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.446903 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qm46\" (UniqueName: \"kubernetes.io/projected/3c1c9983-60a8-4db2-866c-15deb7220cb9-kube-api-access-6qm46\") pod \"console-f9d7485db-5fdhg\" (UID: \"3c1c9983-60a8-4db2-866c-15deb7220cb9\") " pod="openshift-console/console-f9d7485db-5fdhg" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.458285 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wphrc\" (UniqueName: \"kubernetes.io/projected/7d543303-0d6d-4c3d-bb4a-bb216d9def25-kube-api-access-wphrc\") pod \"etcd-operator-b45778765-lr9f9\" (UID: \"7d543303-0d6d-4c3d-bb4a-bb216d9def25\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lr9f9" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.479737 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb2bg\" (UniqueName: \"kubernetes.io/projected/b34aa1fd-b226-4e6d-8854-786cb7f5dc67-kube-api-access-wb2bg\") pod \"dns-operator-744455d44c-w58s7\" (UID: \"b34aa1fd-b226-4e6d-8854-786cb7f5dc67\") " pod="openshift-dns-operator/dns-operator-744455d44c-w58s7" 
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.482645 4715 request.go:700] Waited for 1.891701363s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/secrets?fieldSelector=metadata.name%3Dcanary-serving-cert&limit=500&resourceVersion=0 Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.484598 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.503963 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.523863 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.544297 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.613595 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/18a93eea-f768-41ac-ae21-1d29a90f5f66-metrics-certs\") pod \"router-default-5444994796-b4lqp\" (UID: \"18a93eea-f768-41ac-ae21-1d29a90f5f66\") " pod="openshift-ingress/router-default-5444994796-b4lqp" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.613875 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/915f6370-d5b2-4c9e-a1b1-c3146612b3ce-registry-certificates\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.614070 4715 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/731cd25c-cf3d-4428-a8bd-7aa00385de1e-images\") pod \"machine-config-operator-74547568cd-qxqln\" (UID: \"731cd25c-cf3d-4428-a8bd-7aa00385de1e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qxqln" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.614239 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bdcbb990-46a3-4a26-a68c-ee9758ef1631-profile-collector-cert\") pod \"olm-operator-6b444d44fb-nz9jb\" (UID: \"bdcbb990-46a3-4a26-a68c-ee9758ef1631\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nz9jb" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.614398 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrlrc\" (UniqueName: \"kubernetes.io/projected/bdcbb990-46a3-4a26-a68c-ee9758ef1631-kube-api-access-xrlrc\") pod \"olm-operator-6b444d44fb-nz9jb\" (UID: \"bdcbb990-46a3-4a26-a68c-ee9758ef1631\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nz9jb" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.614576 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-q5ck7\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.614761 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-q5ck7\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.615117 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wkk9\" (UniqueName: \"kubernetes.io/projected/915f6370-d5b2-4c9e-a1b1-c3146612b3ce-kube-api-access-4wkk9\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.615631 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3647d392-87e6-4708-a6f5-060e250a71ad-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-sxdhp\" (UID: \"3647d392-87e6-4708-a6f5-060e250a71ad\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sxdhp" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.615894 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f34e53e6-d25e-4619-8b73-8b9486c531eb-config\") pod \"route-controller-manager-6576b87f9c-8mn46\" (UID: \"f34e53e6-d25e-4619-8b73-8b9486c531eb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mn46" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.616104 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/731cd25c-cf3d-4428-a8bd-7aa00385de1e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-qxqln\" (UID: 
\"731cd25c-cf3d-4428-a8bd-7aa00385de1e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qxqln" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.616360 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5n5d\" (UniqueName: \"kubernetes.io/projected/b75a6e94-9a8f-4789-be04-be1dabfc37c7-kube-api-access-q5n5d\") pod \"downloads-7954f5f757-sq956\" (UID: \"b75a6e94-9a8f-4789-be04-be1dabfc37c7\") " pod="openshift-console/downloads-7954f5f757-sq956" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.616596 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp8m2\" (UniqueName: \"kubernetes.io/projected/0d58f8de-5bf0-4b20-937a-1dbd52ed512e-kube-api-access-rp8m2\") pod \"authentication-operator-69f744f599-hbzph\" (UID: \"0d58f8de-5bf0-4b20-937a-1dbd52ed512e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hbzph" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.616854 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58qzx\" (UniqueName: \"kubernetes.io/projected/3da117f6-b889-480f-b74b-5841bc551658-kube-api-access-58qzx\") pod \"apiserver-76f77b778f-xh68m\" (UID: \"3da117f6-b889-480f-b74b-5841bc551658\") " pod="openshift-apiserver/apiserver-76f77b778f-xh68m" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.617104 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.617289 4715 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/915f6370-d5b2-4c9e-a1b1-c3146612b3ce-installation-pull-secrets\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.617541 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bdcbb990-46a3-4a26-a68c-ee9758ef1631-srv-cert\") pod \"olm-operator-6b444d44fb-nz9jb\" (UID: \"bdcbb990-46a3-4a26-a68c-ee9758ef1631\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nz9jb" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.617746 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4n7q\" (UniqueName: \"kubernetes.io/projected/f34e53e6-d25e-4619-8b73-8b9486c531eb-kube-api-access-z4n7q\") pod \"route-controller-manager-6576b87f9c-8mn46\" (UID: \"f34e53e6-d25e-4619-8b73-8b9486c531eb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mn46" Oct 09 07:48:31 crc kubenswrapper[4715]: E1009 07:48:31.617892 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:32.117879163 +0000 UTC m=+142.810683171 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.618112 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/915f6370-d5b2-4c9e-a1b1-c3146612b3ce-registry-tls\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.618371 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f34e53e6-d25e-4619-8b73-8b9486c531eb-client-ca\") pod \"route-controller-manager-6576b87f9c-8mn46\" (UID: \"f34e53e6-d25e-4619-8b73-8b9486c531eb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mn46" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.618673 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/18a93eea-f768-41ac-ae21-1d29a90f5f66-stats-auth\") pod \"router-default-5444994796-b4lqp\" (UID: \"18a93eea-f768-41ac-ae21-1d29a90f5f66\") " pod="openshift-ingress/router-default-5444994796-b4lqp" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.618862 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/3b913217-bda5-4526-903c-9cc2df2a4815-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-trqg4\" (UID: \"3b913217-bda5-4526-903c-9cc2df2a4815\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-trqg4" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.619078 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f34e53e6-d25e-4619-8b73-8b9486c531eb-serving-cert\") pod \"route-controller-manager-6576b87f9c-8mn46\" (UID: \"f34e53e6-d25e-4619-8b73-8b9486c531eb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mn46" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.619285 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f04071f-b72a-4234-92e3-1cd5e6987a58-config\") pod \"console-operator-58897d9998-dfctz\" (UID: \"2f04071f-b72a-4234-92e3-1cd5e6987a58\") " pod="openshift-console-operator/console-operator-58897d9998-dfctz" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.619779 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3da117f6-b889-480f-b74b-5841bc551658-audit\") pod \"apiserver-76f77b778f-xh68m\" (UID: \"3da117f6-b889-480f-b74b-5841bc551658\") " pod="openshift-apiserver/apiserver-76f77b778f-xh68m" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.619978 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-q5ck7\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" Oct 09 
07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.620158 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b24ee722-8046-4655-a354-4a25a9b16b6a-audit-policies\") pod \"oauth-openshift-558db77b4-q5ck7\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.620321 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-q5ck7\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.620868 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3da117f6-b889-480f-b74b-5841bc551658-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xh68m\" (UID: \"3da117f6-b889-480f-b74b-5841bc551658\") " pod="openshift-apiserver/apiserver-76f77b778f-xh68m" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.621125 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whfwk\" (UniqueName: \"kubernetes.io/projected/3647d392-87e6-4708-a6f5-060e250a71ad-kube-api-access-whfwk\") pod \"package-server-manager-789f6589d5-sxdhp\" (UID: \"3647d392-87e6-4708-a6f5-060e250a71ad\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sxdhp" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.621364 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/3da117f6-b889-480f-b74b-5841bc551658-audit-dir\") pod \"apiserver-76f77b778f-xh68m\" (UID: \"3da117f6-b889-480f-b74b-5841bc551658\") " pod="openshift-apiserver/apiserver-76f77b778f-xh68m" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.621658 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-q5ck7\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.621728 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-5fdhg" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.621912 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3da117f6-b889-480f-b74b-5841bc551658-config\") pod \"apiserver-76f77b778f-xh68m\" (UID: \"3da117f6-b889-480f-b74b-5841bc551658\") " pod="openshift-apiserver/apiserver-76f77b778f-xh68m" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.622283 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-q5ck7\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.622561 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/915f6370-d5b2-4c9e-a1b1-c3146612b3ce-ca-trust-extracted\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.622827 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/915f6370-d5b2-4c9e-a1b1-c3146612b3ce-trusted-ca\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.623501 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-q5ck7\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.624183 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6483da2-54be-4754-9da9-7ad2af3788b3-config\") pod \"kube-controller-manager-operator-78b949d7b-9grfz\" (UID: \"f6483da2-54be-4754-9da9-7ad2af3788b3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9grfz" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.624506 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad01aea7-211a-4ff5-b15b-fb696917dc52-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-prhvm\" (UID: 
\"ad01aea7-211a-4ff5-b15b-fb696917dc52\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-prhvm" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.624689 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcx5j\" (UniqueName: \"kubernetes.io/projected/5eb07619-4575-4662-afd0-58a658ebac12-kube-api-access-jcx5j\") pod \"machine-config-controller-84d6567774-94mjv\" (UID: \"5eb07619-4575-4662-afd0-58a658ebac12\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-94mjv" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.624893 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f04071f-b72a-4234-92e3-1cd5e6987a58-serving-cert\") pod \"console-operator-58897d9998-dfctz\" (UID: \"2f04071f-b72a-4234-92e3-1cd5e6987a58\") " pod="openshift-console-operator/console-operator-58897d9998-dfctz" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.625218 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7kxm\" (UniqueName: \"kubernetes.io/projected/731cd25c-cf3d-4428-a8bd-7aa00385de1e-kube-api-access-m7kxm\") pod \"machine-config-operator-74547568cd-qxqln\" (UID: \"731cd25c-cf3d-4428-a8bd-7aa00385de1e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qxqln" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.625415 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/51a9931f-92cf-4ccd-a7c4-618ed079cb5b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-btd5r\" (UID: \"51a9931f-92cf-4ccd-a7c4-618ed079cb5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-btd5r" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 
07:48:31.625825 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8nqw\" (UniqueName: \"kubernetes.io/projected/18a93eea-f768-41ac-ae21-1d29a90f5f66-kube-api-access-k8nqw\") pod \"router-default-5444994796-b4lqp\" (UID: \"18a93eea-f768-41ac-ae21-1d29a90f5f66\") " pod="openshift-ingress/router-default-5444994796-b4lqp" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.626101 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkfcp\" (UniqueName: \"kubernetes.io/projected/b24ee722-8046-4655-a354-4a25a9b16b6a-kube-api-access-tkfcp\") pod \"oauth-openshift-558db77b4-q5ck7\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.627146 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5eb07619-4575-4662-afd0-58a658ebac12-proxy-tls\") pod \"machine-config-controller-84d6567774-94mjv\" (UID: \"5eb07619-4575-4662-afd0-58a658ebac12\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-94mjv" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.627464 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vgfv\" (UniqueName: \"kubernetes.io/projected/2f04071f-b72a-4234-92e3-1cd5e6987a58-kube-api-access-8vgfv\") pod \"console-operator-58897d9998-dfctz\" (UID: \"2f04071f-b72a-4234-92e3-1cd5e6987a58\") " pod="openshift-console-operator/console-operator-58897d9998-dfctz" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.627680 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b913217-bda5-4526-903c-9cc2df2a4815-serving-cert\") 
pod \"kube-apiserver-operator-766d6c64bb-trqg4\" (UID: \"3b913217-bda5-4526-903c-9cc2df2a4815\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-trqg4" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.627920 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d58f8de-5bf0-4b20-937a-1dbd52ed512e-service-ca-bundle\") pod \"authentication-operator-69f744f599-hbzph\" (UID: \"0d58f8de-5bf0-4b20-937a-1dbd52ed512e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hbzph" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.628414 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/18a93eea-f768-41ac-ae21-1d29a90f5f66-default-certificate\") pod \"router-default-5444994796-b4lqp\" (UID: \"18a93eea-f768-41ac-ae21-1d29a90f5f66\") " pod="openshift-ingress/router-default-5444994796-b4lqp" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.628752 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d58f8de-5bf0-4b20-937a-1dbd52ed512e-serving-cert\") pod \"authentication-operator-69f744f599-hbzph\" (UID: \"0d58f8de-5bf0-4b20-937a-1dbd52ed512e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hbzph" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.628932 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51a9931f-92cf-4ccd-a7c4-618ed079cb5b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-btd5r\" (UID: \"51a9931f-92cf-4ccd-a7c4-618ed079cb5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-btd5r" Oct 09 07:48:31 
crc kubenswrapper[4715]: I1009 07:48:31.629318 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3da117f6-b889-480f-b74b-5841bc551658-image-import-ca\") pod \"apiserver-76f77b778f-xh68m\" (UID: \"3da117f6-b889-480f-b74b-5841bc551658\") " pod="openshift-apiserver/apiserver-76f77b778f-xh68m" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.629514 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3da117f6-b889-480f-b74b-5841bc551658-etcd-serving-ca\") pod \"apiserver-76f77b778f-xh68m\" (UID: \"3da117f6-b889-480f-b74b-5841bc551658\") " pod="openshift-apiserver/apiserver-76f77b778f-xh68m" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.629721 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b913217-bda5-4526-903c-9cc2df2a4815-config\") pod \"kube-apiserver-operator-766d6c64bb-trqg4\" (UID: \"3b913217-bda5-4526-903c-9cc2df2a4815\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-trqg4" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.630164 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b24ee722-8046-4655-a354-4a25a9b16b6a-audit-dir\") pod \"oauth-openshift-558db77b4-q5ck7\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.630336 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-system-ocp-branding-template\") 
pod \"oauth-openshift-558db77b4-q5ck7\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.630790 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3da117f6-b889-480f-b74b-5841bc551658-node-pullsecrets\") pod \"apiserver-76f77b778f-xh68m\" (UID: \"3da117f6-b889-480f-b74b-5841bc551658\") " pod="openshift-apiserver/apiserver-76f77b778f-xh68m" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.631080 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/731cd25c-cf3d-4428-a8bd-7aa00385de1e-proxy-tls\") pod \"machine-config-operator-74547568cd-qxqln\" (UID: \"731cd25c-cf3d-4428-a8bd-7aa00385de1e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qxqln" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.631271 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d58f8de-5bf0-4b20-937a-1dbd52ed512e-config\") pod \"authentication-operator-69f744f599-hbzph\" (UID: \"0d58f8de-5bf0-4b20-937a-1dbd52ed512e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hbzph" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.631777 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-q5ck7\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.632031 4715 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-974wk\" (UniqueName: \"kubernetes.io/projected/ad01aea7-211a-4ff5-b15b-fb696917dc52-kube-api-access-974wk\") pod \"control-plane-machine-set-operator-78cbb6b69f-prhvm\" (UID: \"ad01aea7-211a-4ff5-b15b-fb696917dc52\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-prhvm" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.633318 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plgs5\" (UniqueName: \"kubernetes.io/projected/0e32fe67-b32a-4fe6-869a-5fe2d4877352-kube-api-access-plgs5\") pod \"openshift-controller-manager-operator-756b6f6bc6-l6l2m\" (UID: \"0e32fe67-b32a-4fe6-869a-5fe2d4877352\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l6l2m" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.633887 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e32fe67-b32a-4fe6-869a-5fe2d4877352-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-l6l2m\" (UID: \"0e32fe67-b32a-4fe6-869a-5fe2d4877352\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l6l2m" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.635285 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18a93eea-f768-41ac-ae21-1d29a90f5f66-service-ca-bundle\") pod \"router-default-5444994796-b4lqp\" (UID: \"18a93eea-f768-41ac-ae21-1d29a90f5f66\") " pod="openshift-ingress/router-default-5444994796-b4lqp" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.635772 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-8slg5\" (UniqueName: \"kubernetes.io/projected/51a9931f-92cf-4ccd-a7c4-618ed079cb5b-kube-api-access-8slg5\") pod \"cluster-image-registry-operator-dc59b4c8b-btd5r\" (UID: \"51a9931f-92cf-4ccd-a7c4-618ed079cb5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-btd5r" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.635933 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f04071f-b72a-4234-92e3-1cd5e6987a58-trusted-ca\") pod \"console-operator-58897d9998-dfctz\" (UID: \"2f04071f-b72a-4234-92e3-1cd5e6987a58\") " pod="openshift-console-operator/console-operator-58897d9998-dfctz" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.639638 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/51a9931f-92cf-4ccd-a7c4-618ed079cb5b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-btd5r\" (UID: \"51a9931f-92cf-4ccd-a7c4-618ed079cb5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-btd5r" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.639703 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e32fe67-b32a-4fe6-869a-5fe2d4877352-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-l6l2m\" (UID: \"0e32fe67-b32a-4fe6-869a-5fe2d4877352\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l6l2m" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.639752 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6483da2-54be-4754-9da9-7ad2af3788b3-serving-cert\") pod 
\"kube-controller-manager-operator-78b949d7b-9grfz\" (UID: \"f6483da2-54be-4754-9da9-7ad2af3788b3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9grfz" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.640059 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f6483da2-54be-4754-9da9-7ad2af3788b3-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-9grfz\" (UID: \"f6483da2-54be-4754-9da9-7ad2af3788b3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9grfz" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.640158 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5eb07619-4575-4662-afd0-58a658ebac12-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-94mjv\" (UID: \"5eb07619-4575-4662-afd0-58a658ebac12\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-94mjv" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.640200 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d58f8de-5bf0-4b20-937a-1dbd52ed512e-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-hbzph\" (UID: \"0d58f8de-5bf0-4b20-937a-1dbd52ed512e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hbzph" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.640243 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3da117f6-b889-480f-b74b-5841bc551658-etcd-client\") pod \"apiserver-76f77b778f-xh68m\" (UID: \"3da117f6-b889-480f-b74b-5841bc551658\") " 
pod="openshift-apiserver/apiserver-76f77b778f-xh68m" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.640287 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3da117f6-b889-480f-b74b-5841bc551658-encryption-config\") pod \"apiserver-76f77b778f-xh68m\" (UID: \"3da117f6-b889-480f-b74b-5841bc551658\") " pod="openshift-apiserver/apiserver-76f77b778f-xh68m" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.640687 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/915f6370-d5b2-4c9e-a1b1-c3146612b3ce-bound-sa-token\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.640716 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3da117f6-b889-480f-b74b-5841bc551658-serving-cert\") pod \"apiserver-76f77b778f-xh68m\" (UID: \"3da117f6-b889-480f-b74b-5841bc551658\") " pod="openshift-apiserver/apiserver-76f77b778f-xh68m" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.640763 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-q5ck7\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.640787 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-q5ck7\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.670100 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-lr9f9" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.730987 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-w58s7" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.741996 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:31 crc kubenswrapper[4715]: E1009 07:48:31.742146 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:32.242119564 +0000 UTC m=+142.934923572 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.742215 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/915f6370-d5b2-4c9e-a1b1-c3146612b3ce-trusted-ca\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.742242 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-q5ck7\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.742261 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad01aea7-211a-4ff5-b15b-fb696917dc52-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-prhvm\" (UID: \"ad01aea7-211a-4ff5-b15b-fb696917dc52\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-prhvm" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.742281 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f6483da2-54be-4754-9da9-7ad2af3788b3-config\") pod \"kube-controller-manager-operator-78b949d7b-9grfz\" (UID: \"f6483da2-54be-4754-9da9-7ad2af3788b3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9grfz" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.742302 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcx5j\" (UniqueName: \"kubernetes.io/projected/5eb07619-4575-4662-afd0-58a658ebac12-kube-api-access-jcx5j\") pod \"machine-config-controller-84d6567774-94mjv\" (UID: \"5eb07619-4575-4662-afd0-58a658ebac12\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-94mjv" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.742319 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f04071f-b72a-4234-92e3-1cd5e6987a58-serving-cert\") pod \"console-operator-58897d9998-dfctz\" (UID: \"2f04071f-b72a-4234-92e3-1cd5e6987a58\") " pod="openshift-console-operator/console-operator-58897d9998-dfctz" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.742349 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6tj5\" (UniqueName: \"kubernetes.io/projected/03c8f335-ee7a-4f93-9a1f-47247090dffd-kube-api-access-h6tj5\") pod \"service-ca-9c57cc56f-qf4bm\" (UID: \"03c8f335-ee7a-4f93-9a1f-47247090dffd\") " pod="openshift-service-ca/service-ca-9c57cc56f-qf4bm" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.742365 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abd885f8-e479-48ec-9341-0acbbc3c3ea7-serving-cert\") pod \"service-ca-operator-777779d784-zrjcb\" (UID: \"abd885f8-e479-48ec-9341-0acbbc3c3ea7\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-zrjcb" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.742384 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7kxm\" (UniqueName: \"kubernetes.io/projected/731cd25c-cf3d-4428-a8bd-7aa00385de1e-kube-api-access-m7kxm\") pod \"machine-config-operator-74547568cd-qxqln\" (UID: \"731cd25c-cf3d-4428-a8bd-7aa00385de1e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qxqln" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.742428 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4382581-9ebd-4fef-b530-1a6d32c65d61-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-brr9q\" (UID: \"d4382581-9ebd-4fef-b530-1a6d32c65d61\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-brr9q" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.742457 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/51a9931f-92cf-4ccd-a7c4-618ed079cb5b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-btd5r\" (UID: \"51a9931f-92cf-4ccd-a7c4-618ed079cb5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-btd5r" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.742476 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8nqw\" (UniqueName: \"kubernetes.io/projected/18a93eea-f768-41ac-ae21-1d29a90f5f66-kube-api-access-k8nqw\") pod \"router-default-5444994796-b4lqp\" (UID: \"18a93eea-f768-41ac-ae21-1d29a90f5f66\") " pod="openshift-ingress/router-default-5444994796-b4lqp" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.742521 4715 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4zjq\" (UniqueName: \"kubernetes.io/projected/d4382581-9ebd-4fef-b530-1a6d32c65d61-kube-api-access-t4zjq\") pod \"kube-storage-version-migrator-operator-b67b599dd-brr9q\" (UID: \"d4382581-9ebd-4fef-b530-1a6d32c65d61\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-brr9q" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.742560 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkfcp\" (UniqueName: \"kubernetes.io/projected/b24ee722-8046-4655-a354-4a25a9b16b6a-kube-api-access-tkfcp\") pod \"oauth-openshift-558db77b4-q5ck7\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.742592 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5eb07619-4575-4662-afd0-58a658ebac12-proxy-tls\") pod \"machine-config-controller-84d6567774-94mjv\" (UID: \"5eb07619-4575-4662-afd0-58a658ebac12\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-94mjv" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.742611 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vgfv\" (UniqueName: \"kubernetes.io/projected/2f04071f-b72a-4234-92e3-1cd5e6987a58-kube-api-access-8vgfv\") pod \"console-operator-58897d9998-dfctz\" (UID: \"2f04071f-b72a-4234-92e3-1cd5e6987a58\") " pod="openshift-console-operator/console-operator-58897d9998-dfctz" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.742632 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b913217-bda5-4526-903c-9cc2df2a4815-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-trqg4\" 
(UID: \"3b913217-bda5-4526-903c-9cc2df2a4815\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-trqg4" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.742648 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d58f8de-5bf0-4b20-937a-1dbd52ed512e-service-ca-bundle\") pod \"authentication-operator-69f744f599-hbzph\" (UID: \"0d58f8de-5bf0-4b20-937a-1dbd52ed512e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hbzph" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.742663 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/41f83b9f-ebe1-42c4-ae44-36775e449efe-tmpfs\") pod \"packageserver-d55dfcdfc-5vwlz\" (UID: \"41f83b9f-ebe1-42c4-ae44-36775e449efe\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vwlz" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.742683 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5f316b42-23b5-4041-9dc3-3b95676339e5-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-4gncq\" (UID: \"5f316b42-23b5-4041-9dc3-3b95676339e5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4gncq" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.742698 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9040c858-1f6b-4900-b34e-c8b0b0c4c1ec-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hzbn9\" (UID: \"9040c858-1f6b-4900-b34e-c8b0b0c4c1ec\") " pod="openshift-marketplace/marketplace-operator-79b997595-hzbn9" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.742715 4715 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lq7p\" (UniqueName: \"kubernetes.io/projected/4005d046-0643-40a6-a748-8ecacb0f1541-kube-api-access-7lq7p\") pod \"catalog-operator-68c6474976-ttk8w\" (UID: \"4005d046-0643-40a6-a748-8ecacb0f1541\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ttk8w" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.742747 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51a9931f-92cf-4ccd-a7c4-618ed079cb5b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-btd5r\" (UID: \"51a9931f-92cf-4ccd-a7c4-618ed079cb5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-btd5r" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.742764 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/18a93eea-f768-41ac-ae21-1d29a90f5f66-default-certificate\") pod \"router-default-5444994796-b4lqp\" (UID: \"18a93eea-f768-41ac-ae21-1d29a90f5f66\") " pod="openshift-ingress/router-default-5444994796-b4lqp" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.742780 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d58f8de-5bf0-4b20-937a-1dbd52ed512e-serving-cert\") pod \"authentication-operator-69f744f599-hbzph\" (UID: \"0d58f8de-5bf0-4b20-937a-1dbd52ed512e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hbzph" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.742796 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/03c8f335-ee7a-4f93-9a1f-47247090dffd-signing-key\") pod \"service-ca-9c57cc56f-qf4bm\" (UID: \"03c8f335-ee7a-4f93-9a1f-47247090dffd\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-qf4bm" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.742839 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3da117f6-b889-480f-b74b-5841bc551658-image-import-ca\") pod \"apiserver-76f77b778f-xh68m\" (UID: \"3da117f6-b889-480f-b74b-5841bc551658\") " pod="openshift-apiserver/apiserver-76f77b778f-xh68m" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.742857 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3da117f6-b889-480f-b74b-5841bc551658-etcd-serving-ca\") pod \"apiserver-76f77b778f-xh68m\" (UID: \"3da117f6-b889-480f-b74b-5841bc551658\") " pod="openshift-apiserver/apiserver-76f77b778f-xh68m" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.742873 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b913217-bda5-4526-903c-9cc2df2a4815-config\") pod \"kube-apiserver-operator-766d6c64bb-trqg4\" (UID: \"3b913217-bda5-4526-903c-9cc2df2a4815\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-trqg4" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.742888 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8130c7b7-7b74-461e-8348-59345d86aa6b-plugins-dir\") pod \"csi-hostpathplugin-575dw\" (UID: \"8130c7b7-7b74-461e-8348-59345d86aa6b\") " pod="hostpath-provisioner/csi-hostpathplugin-575dw" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.742907 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b24ee722-8046-4655-a354-4a25a9b16b6a-audit-dir\") pod \"oauth-openshift-558db77b4-q5ck7\" (UID: 
\"b24ee722-8046-4655-a354-4a25a9b16b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.742950 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b5488b64-893e-49f7-9de1-99905faf0d3b-secret-volume\") pod \"collect-profiles-29333265-qxg85\" (UID: \"b5488b64-893e-49f7-9de1-99905faf0d3b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333265-qxg85" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.742978 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3da117f6-b889-480f-b74b-5841bc551658-node-pullsecrets\") pod \"apiserver-76f77b778f-xh68m\" (UID: \"3da117f6-b889-480f-b74b-5841bc551658\") " pod="openshift-apiserver/apiserver-76f77b778f-xh68m" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.742997 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-q5ck7\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.743012 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72t4q\" (UniqueName: \"kubernetes.io/projected/9040c858-1f6b-4900-b34e-c8b0b0c4c1ec-kube-api-access-72t4q\") pod \"marketplace-operator-79b997595-hzbn9\" (UID: \"9040c858-1f6b-4900-b34e-c8b0b0c4c1ec\") " pod="openshift-marketplace/marketplace-operator-79b997595-hzbn9" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.743030 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8130c7b7-7b74-461e-8348-59345d86aa6b-csi-data-dir\") pod \"csi-hostpathplugin-575dw\" (UID: \"8130c7b7-7b74-461e-8348-59345d86aa6b\") " pod="hostpath-provisioner/csi-hostpathplugin-575dw" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.743046 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/731cd25c-cf3d-4428-a8bd-7aa00385de1e-proxy-tls\") pod \"machine-config-operator-74547568cd-qxqln\" (UID: \"731cd25c-cf3d-4428-a8bd-7aa00385de1e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qxqln" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.743061 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b5488b64-893e-49f7-9de1-99905faf0d3b-config-volume\") pod \"collect-profiles-29333265-qxg85\" (UID: \"b5488b64-893e-49f7-9de1-99905faf0d3b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333265-qxg85" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.743076 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dll6k\" (UniqueName: \"kubernetes.io/projected/45129fb6-ace0-4181-b4ce-c5a7e6787606-kube-api-access-dll6k\") pod \"ingress-canary-m526v\" (UID: \"45129fb6-ace0-4181-b4ce-c5a7e6787606\") " pod="openshift-ingress-canary/ingress-canary-m526v" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.743091 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9040c858-1f6b-4900-b34e-c8b0b0c4c1ec-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hzbn9\" (UID: \"9040c858-1f6b-4900-b34e-c8b0b0c4c1ec\") " pod="openshift-marketplace/marketplace-operator-79b997595-hzbn9" 
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.743108 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbk92\" (UniqueName: \"kubernetes.io/projected/8130c7b7-7b74-461e-8348-59345d86aa6b-kube-api-access-sbk92\") pod \"csi-hostpathplugin-575dw\" (UID: \"8130c7b7-7b74-461e-8348-59345d86aa6b\") " pod="hostpath-provisioner/csi-hostpathplugin-575dw" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.743130 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d58f8de-5bf0-4b20-937a-1dbd52ed512e-config\") pod \"authentication-operator-69f744f599-hbzph\" (UID: \"0d58f8de-5bf0-4b20-937a-1dbd52ed512e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hbzph" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.743147 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slfps\" (UniqueName: \"kubernetes.io/projected/41f83b9f-ebe1-42c4-ae44-36775e449efe-kube-api-access-slfps\") pod \"packageserver-d55dfcdfc-5vwlz\" (UID: \"41f83b9f-ebe1-42c4-ae44-36775e449efe\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vwlz" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.743164 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2ca5fc0f-ee77-4a25-985e-8b50fbe3ddf0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-w79tf\" (UID: \"2ca5fc0f-ee77-4a25-985e-8b50fbe3ddf0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w79tf" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.743182 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-q5ck7\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.743198 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/41f83b9f-ebe1-42c4-ae44-36775e449efe-apiservice-cert\") pod \"packageserver-d55dfcdfc-5vwlz\" (UID: \"41f83b9f-ebe1-42c4-ae44-36775e449efe\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vwlz" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.743213 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4382581-9ebd-4fef-b530-1a6d32c65d61-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-brr9q\" (UID: \"d4382581-9ebd-4fef-b530-1a6d32c65d61\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-brr9q" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.743235 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-974wk\" (UniqueName: \"kubernetes.io/projected/ad01aea7-211a-4ff5-b15b-fb696917dc52-kube-api-access-974wk\") pod \"control-plane-machine-set-operator-78cbb6b69f-prhvm\" (UID: \"ad01aea7-211a-4ff5-b15b-fb696917dc52\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-prhvm" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.743252 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plgs5\" (UniqueName: \"kubernetes.io/projected/0e32fe67-b32a-4fe6-869a-5fe2d4877352-kube-api-access-plgs5\") pod \"openshift-controller-manager-operator-756b6f6bc6-l6l2m\" (UID: 
\"0e32fe67-b32a-4fe6-869a-5fe2d4877352\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l6l2m" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.743272 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e32fe67-b32a-4fe6-869a-5fe2d4877352-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-l6l2m\" (UID: \"0e32fe67-b32a-4fe6-869a-5fe2d4877352\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l6l2m" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.743291 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18a93eea-f768-41ac-ae21-1d29a90f5f66-service-ca-bundle\") pod \"router-default-5444994796-b4lqp\" (UID: \"18a93eea-f768-41ac-ae21-1d29a90f5f66\") " pod="openshift-ingress/router-default-5444994796-b4lqp" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.743326 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b24ee722-8046-4655-a354-4a25a9b16b6a-audit-dir\") pod \"oauth-openshift-558db77b4-q5ck7\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.743534 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3da117f6-b889-480f-b74b-5841bc551658-node-pullsecrets\") pod \"apiserver-76f77b778f-xh68m\" (UID: \"3da117f6-b889-480f-b74b-5841bc551658\") " pod="openshift-apiserver/apiserver-76f77b778f-xh68m" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.744722 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f6483da2-54be-4754-9da9-7ad2af3788b3-config\") pod \"kube-controller-manager-operator-78b949d7b-9grfz\" (UID: \"f6483da2-54be-4754-9da9-7ad2af3788b3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9grfz" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.744933 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/915f6370-d5b2-4c9e-a1b1-c3146612b3ce-trusted-ca\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.747323 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3da117f6-b889-480f-b74b-5841bc551658-image-import-ca\") pod \"apiserver-76f77b778f-xh68m\" (UID: \"3da117f6-b889-480f-b74b-5841bc551658\") " pod="openshift-apiserver/apiserver-76f77b778f-xh68m" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.747695 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/51a9931f-92cf-4ccd-a7c4-618ed079cb5b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-btd5r\" (UID: \"51a9931f-92cf-4ccd-a7c4-618ed079cb5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-btd5r" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.748235 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b913217-bda5-4526-903c-9cc2df2a4815-config\") pod \"kube-apiserver-operator-766d6c64bb-trqg4\" (UID: \"3b913217-bda5-4526-903c-9cc2df2a4815\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-trqg4" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.748446 4715 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45129fb6-ace0-4181-b4ce-c5a7e6787606-cert\") pod \"ingress-canary-m526v\" (UID: \"45129fb6-ace0-4181-b4ce-c5a7e6787606\") " pod="openshift-ingress-canary/ingress-canary-m526v" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.748478 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf57q\" (UniqueName: \"kubernetes.io/projected/dec5cf82-ad38-452a-9330-cc685017bb8d-kube-api-access-vf57q\") pod \"machine-config-server-cjqqc\" (UID: \"dec5cf82-ad38-452a-9330-cc685017bb8d\") " pod="openshift-machine-config-operator/machine-config-server-cjqqc" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.748506 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8slg5\" (UniqueName: \"kubernetes.io/projected/51a9931f-92cf-4ccd-a7c4-618ed079cb5b-kube-api-access-8slg5\") pod \"cluster-image-registry-operator-dc59b4c8b-btd5r\" (UID: \"51a9931f-92cf-4ccd-a7c4-618ed079cb5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-btd5r" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.748527 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/03c8f335-ee7a-4f93-9a1f-47247090dffd-signing-cabundle\") pod \"service-ca-9c57cc56f-qf4bm\" (UID: \"03c8f335-ee7a-4f93-9a1f-47247090dffd\") " pod="openshift-service-ca/service-ca-9c57cc56f-qf4bm" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.748549 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/51a9931f-92cf-4ccd-a7c4-618ed079cb5b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-btd5r\" (UID: \"51a9931f-92cf-4ccd-a7c4-618ed079cb5b\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-btd5r" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.748569 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f04071f-b72a-4234-92e3-1cd5e6987a58-trusted-ca\") pod \"console-operator-58897d9998-dfctz\" (UID: \"2f04071f-b72a-4234-92e3-1cd5e6987a58\") " pod="openshift-console-operator/console-operator-58897d9998-dfctz" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.748590 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8wd8\" (UniqueName: \"kubernetes.io/projected/19912eec-ab9f-4e07-8458-6867269f1a42-kube-api-access-f8wd8\") pod \"migrator-59844c95c7-hbn7p\" (UID: \"19912eec-ab9f-4e07-8458-6867269f1a42\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hbn7p" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.748608 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4005d046-0643-40a6-a748-8ecacb0f1541-srv-cert\") pod \"catalog-operator-68c6474976-ttk8w\" (UID: \"4005d046-0643-40a6-a748-8ecacb0f1541\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ttk8w" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.748640 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e32fe67-b32a-4fe6-869a-5fe2d4877352-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-l6l2m\" (UID: \"0e32fe67-b32a-4fe6-869a-5fe2d4877352\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l6l2m" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.748662 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f6483da2-54be-4754-9da9-7ad2af3788b3-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-9grfz\" (UID: \"f6483da2-54be-4754-9da9-7ad2af3788b3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9grfz" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.748681 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f6483da2-54be-4754-9da9-7ad2af3788b3-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-9grfz\" (UID: \"f6483da2-54be-4754-9da9-7ad2af3788b3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9grfz" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.748700 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5eb07619-4575-4662-afd0-58a658ebac12-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-94mjv\" (UID: \"5eb07619-4575-4662-afd0-58a658ebac12\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-94mjv" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.748719 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d58f8de-5bf0-4b20-937a-1dbd52ed512e-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-hbzph\" (UID: \"0d58f8de-5bf0-4b20-937a-1dbd52ed512e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hbzph" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.748748 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3da117f6-b889-480f-b74b-5841bc551658-etcd-client\") pod \"apiserver-76f77b778f-xh68m\" (UID: \"3da117f6-b889-480f-b74b-5841bc551658\") " 
pod="openshift-apiserver/apiserver-76f77b778f-xh68m" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.748769 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3da117f6-b889-480f-b74b-5841bc551658-encryption-config\") pod \"apiserver-76f77b778f-xh68m\" (UID: \"3da117f6-b889-480f-b74b-5841bc551658\") " pod="openshift-apiserver/apiserver-76f77b778f-xh68m" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.748798 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/76a49d30-4e29-483e-8837-f4cbcb919e06-metrics-tls\") pod \"dns-default-7wthp\" (UID: \"76a49d30-4e29-483e-8837-f4cbcb919e06\") " pod="openshift-dns/dns-default-7wthp" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.748828 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-q5ck7\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.749013 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-q5ck7\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.749091 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18a93eea-f768-41ac-ae21-1d29a90f5f66-service-ca-bundle\") pod 
\"router-default-5444994796-b4lqp\" (UID: \"18a93eea-f768-41ac-ae21-1d29a90f5f66\") " pod="openshift-ingress/router-default-5444994796-b4lqp" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.749799 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad01aea7-211a-4ff5-b15b-fb696917dc52-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-prhvm\" (UID: \"ad01aea7-211a-4ff5-b15b-fb696917dc52\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-prhvm" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.749871 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5eb07619-4575-4662-afd0-58a658ebac12-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-94mjv\" (UID: \"5eb07619-4575-4662-afd0-58a658ebac12\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-94mjv" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.749873 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2f04071f-b72a-4234-92e3-1cd5e6987a58-trusted-ca\") pod \"console-operator-58897d9998-dfctz\" (UID: \"2f04071f-b72a-4234-92e3-1cd5e6987a58\") " pod="openshift-console-operator/console-operator-58897d9998-dfctz" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.750504 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3da117f6-b889-480f-b74b-5841bc551658-etcd-serving-ca\") pod \"apiserver-76f77b778f-xh68m\" (UID: \"3da117f6-b889-480f-b74b-5841bc551658\") " pod="openshift-apiserver/apiserver-76f77b778f-xh68m" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.752987 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d58f8de-5bf0-4b20-937a-1dbd52ed512e-service-ca-bundle\") pod \"authentication-operator-69f744f599-hbzph\" (UID: \"0d58f8de-5bf0-4b20-937a-1dbd52ed512e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hbzph" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.753022 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d58f8de-5bf0-4b20-937a-1dbd52ed512e-config\") pod \"authentication-operator-69f744f599-hbzph\" (UID: \"0d58f8de-5bf0-4b20-937a-1dbd52ed512e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hbzph" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.753237 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b913217-bda5-4526-903c-9cc2df2a4815-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-trqg4\" (UID: \"3b913217-bda5-4526-903c-9cc2df2a4815\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-trqg4" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.753514 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5eb07619-4575-4662-afd0-58a658ebac12-proxy-tls\") pod \"machine-config-controller-84d6567774-94mjv\" (UID: \"5eb07619-4575-4662-afd0-58a658ebac12\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-94mjv" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.753926 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f04071f-b72a-4234-92e3-1cd5e6987a58-serving-cert\") pod \"console-operator-58897d9998-dfctz\" (UID: \"2f04071f-b72a-4234-92e3-1cd5e6987a58\") " pod="openshift-console-operator/console-operator-58897d9998-dfctz" Oct 09 07:48:31 crc kubenswrapper[4715]: 
I1009 07:48:31.754089 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d58f8de-5bf0-4b20-937a-1dbd52ed512e-serving-cert\") pod \"authentication-operator-69f744f599-hbzph\" (UID: \"0d58f8de-5bf0-4b20-937a-1dbd52ed512e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hbzph" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.754546 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/915f6370-d5b2-4c9e-a1b1-c3146612b3ce-bound-sa-token\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.754604 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3da117f6-b889-480f-b74b-5841bc551658-serving-cert\") pod \"apiserver-76f77b778f-xh68m\" (UID: \"3da117f6-b889-480f-b74b-5841bc551658\") " pod="openshift-apiserver/apiserver-76f77b778f-xh68m" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.754628 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-q5ck7\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.754642 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-q5ck7\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") 
" pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.754671 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/18a93eea-f768-41ac-ae21-1d29a90f5f66-metrics-certs\") pod \"router-default-5444994796-b4lqp\" (UID: \"18a93eea-f768-41ac-ae21-1d29a90f5f66\") " pod="openshift-ingress/router-default-5444994796-b4lqp" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.754699 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2ca5fc0f-ee77-4a25-985e-8b50fbe3ddf0-trusted-ca\") pod \"ingress-operator-5b745b69d9-w79tf\" (UID: \"2ca5fc0f-ee77-4a25-985e-8b50fbe3ddf0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w79tf" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.754719 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/dec5cf82-ad38-452a-9330-cc685017bb8d-node-bootstrap-token\") pod \"machine-config-server-cjqqc\" (UID: \"dec5cf82-ad38-452a-9330-cc685017bb8d\") " pod="openshift-machine-config-operator/machine-config-server-cjqqc" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.754742 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/915f6370-d5b2-4c9e-a1b1-c3146612b3ce-registry-certificates\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.754766 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/731cd25c-cf3d-4428-a8bd-7aa00385de1e-images\") pod 
\"machine-config-operator-74547568cd-qxqln\" (UID: \"731cd25c-cf3d-4428-a8bd-7aa00385de1e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qxqln" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.754785 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bdcbb990-46a3-4a26-a68c-ee9758ef1631-profile-collector-cert\") pod \"olm-operator-6b444d44fb-nz9jb\" (UID: \"bdcbb990-46a3-4a26-a68c-ee9758ef1631\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nz9jb" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.754808 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrlrc\" (UniqueName: \"kubernetes.io/projected/bdcbb990-46a3-4a26-a68c-ee9758ef1631-kube-api-access-xrlrc\") pod \"olm-operator-6b444d44fb-nz9jb\" (UID: \"bdcbb990-46a3-4a26-a68c-ee9758ef1631\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nz9jb" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.754827 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tt8j\" (UniqueName: \"kubernetes.io/projected/5f316b42-23b5-4041-9dc3-3b95676339e5-kube-api-access-6tt8j\") pod \"multus-admission-controller-857f4d67dd-4gncq\" (UID: \"5f316b42-23b5-4041-9dc3-3b95676339e5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4gncq" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.754863 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-q5ck7\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 
07:48:31.754884 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-q5ck7\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.754942 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wkk9\" (UniqueName: \"kubernetes.io/projected/915f6370-d5b2-4c9e-a1b1-c3146612b3ce-kube-api-access-4wkk9\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.754966 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bcd21f4-d2ef-4f93-8381-c85574a627e8-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sw6f9\" (UID: \"0bcd21f4-d2ef-4f93-8381-c85574a627e8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sw6f9" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.754992 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3647d392-87e6-4708-a6f5-060e250a71ad-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-sxdhp\" (UID: \"3647d392-87e6-4708-a6f5-060e250a71ad\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sxdhp" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.755013 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f34e53e6-d25e-4619-8b73-8b9486c531eb-config\") pod \"route-controller-manager-6576b87f9c-8mn46\" (UID: \"f34e53e6-d25e-4619-8b73-8b9486c531eb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mn46" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.755034 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2ca5fc0f-ee77-4a25-985e-8b50fbe3ddf0-metrics-tls\") pod \"ingress-operator-5b745b69d9-w79tf\" (UID: \"2ca5fc0f-ee77-4a25-985e-8b50fbe3ddf0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w79tf" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.755070 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/731cd25c-cf3d-4428-a8bd-7aa00385de1e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-qxqln\" (UID: \"731cd25c-cf3d-4428-a8bd-7aa00385de1e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qxqln" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.755094 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h879d\" (UniqueName: \"kubernetes.io/projected/76a49d30-4e29-483e-8837-f4cbcb919e06-kube-api-access-h879d\") pod \"dns-default-7wthp\" (UID: \"76a49d30-4e29-483e-8837-f4cbcb919e06\") " pod="openshift-dns/dns-default-7wthp" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.755120 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-q5ck7\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" Oct 09 07:48:31 crc 
kubenswrapper[4715]: I1009 07:48:31.755377 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-q5ck7\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.755392 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e32fe67-b32a-4fe6-869a-5fe2d4877352-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-l6l2m\" (UID: \"0e32fe67-b32a-4fe6-869a-5fe2d4877352\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l6l2m" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.755119 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/41f83b9f-ebe1-42c4-ae44-36775e449efe-webhook-cert\") pod \"packageserver-d55dfcdfc-5vwlz\" (UID: \"41f83b9f-ebe1-42c4-ae44-36775e449efe\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vwlz" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.755473 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8130c7b7-7b74-461e-8348-59345d86aa6b-registration-dir\") pod \"csi-hostpathplugin-575dw\" (UID: \"8130c7b7-7b74-461e-8348-59345d86aa6b\") " pod="hostpath-provisioner/csi-hostpathplugin-575dw" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.755497 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/4005d046-0643-40a6-a748-8ecacb0f1541-profile-collector-cert\") pod \"catalog-operator-68c6474976-ttk8w\" (UID: \"4005d046-0643-40a6-a748-8ecacb0f1541\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ttk8w" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.755548 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5n5d\" (UniqueName: \"kubernetes.io/projected/b75a6e94-9a8f-4789-be04-be1dabfc37c7-kube-api-access-q5n5d\") pod \"downloads-7954f5f757-sq956\" (UID: \"b75a6e94-9a8f-4789-be04-be1dabfc37c7\") " pod="openshift-console/downloads-7954f5f757-sq956" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.755578 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58qzx\" (UniqueName: \"kubernetes.io/projected/3da117f6-b889-480f-b74b-5841bc551658-kube-api-access-58qzx\") pod \"apiserver-76f77b778f-xh68m\" (UID: \"3da117f6-b889-480f-b74b-5841bc551658\") " pod="openshift-apiserver/apiserver-76f77b778f-xh68m" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.755600 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp8m2\" (UniqueName: \"kubernetes.io/projected/0d58f8de-5bf0-4b20-937a-1dbd52ed512e-kube-api-access-rp8m2\") pod \"authentication-operator-69f744f599-hbzph\" (UID: \"0d58f8de-5bf0-4b20-937a-1dbd52ed512e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hbzph" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.755652 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:31 crc 
kubenswrapper[4715]: I1009 07:48:31.755677 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/915f6370-d5b2-4c9e-a1b1-c3146612b3ce-installation-pull-secrets\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.755697 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8130c7b7-7b74-461e-8348-59345d86aa6b-mountpoint-dir\") pod \"csi-hostpathplugin-575dw\" (UID: \"8130c7b7-7b74-461e-8348-59345d86aa6b\") " pod="hostpath-provisioner/csi-hostpathplugin-575dw" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.755728 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bdcbb990-46a3-4a26-a68c-ee9758ef1631-srv-cert\") pod \"olm-operator-6b444d44fb-nz9jb\" (UID: \"bdcbb990-46a3-4a26-a68c-ee9758ef1631\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nz9jb" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.755748 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4n7q\" (UniqueName: \"kubernetes.io/projected/f34e53e6-d25e-4619-8b73-8b9486c531eb-kube-api-access-z4n7q\") pod \"route-controller-manager-6576b87f9c-8mn46\" (UID: \"f34e53e6-d25e-4619-8b73-8b9486c531eb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mn46" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.755766 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/915f6370-d5b2-4c9e-a1b1-c3146612b3ce-registry-tls\") pod \"image-registry-697d97f7c8-txj6v\" (UID: 
\"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.755785 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76a49d30-4e29-483e-8837-f4cbcb919e06-config-volume\") pod \"dns-default-7wthp\" (UID: \"76a49d30-4e29-483e-8837-f4cbcb919e06\") " pod="openshift-dns/dns-default-7wthp" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.755822 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f34e53e6-d25e-4619-8b73-8b9486c531eb-client-ca\") pod \"route-controller-manager-6576b87f9c-8mn46\" (UID: \"f34e53e6-d25e-4619-8b73-8b9486c531eb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mn46" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.755841 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plnnz\" (UniqueName: \"kubernetes.io/projected/abd885f8-e479-48ec-9341-0acbbc3c3ea7-kube-api-access-plnnz\") pod \"service-ca-operator-777779d784-zrjcb\" (UID: \"abd885f8-e479-48ec-9341-0acbbc3c3ea7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zrjcb" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.755860 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/dec5cf82-ad38-452a-9330-cc685017bb8d-certs\") pod \"machine-config-server-cjqqc\" (UID: \"dec5cf82-ad38-452a-9330-cc685017bb8d\") " pod="openshift-machine-config-operator/machine-config-server-cjqqc" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.755882 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/18a93eea-f768-41ac-ae21-1d29a90f5f66-stats-auth\") pod \"router-default-5444994796-b4lqp\" (UID: \"18a93eea-f768-41ac-ae21-1d29a90f5f66\") " pod="openshift-ingress/router-default-5444994796-b4lqp" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.755902 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0bcd21f4-d2ef-4f93-8381-c85574a627e8-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sw6f9\" (UID: \"0bcd21f4-d2ef-4f93-8381-c85574a627e8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sw6f9" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.755920 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3da117f6-b889-480f-b74b-5841bc551658-audit\") pod \"apiserver-76f77b778f-xh68m\" (UID: \"3da117f6-b889-480f-b74b-5841bc551658\") " pod="openshift-apiserver/apiserver-76f77b778f-xh68m" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.755938 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-q5ck7\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.755956 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b913217-bda5-4526-903c-9cc2df2a4815-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-trqg4\" (UID: \"3b913217-bda5-4526-903c-9cc2df2a4815\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-trqg4" Oct 09 07:48:31 crc kubenswrapper[4715]: 
I1009 07:48:31.755973 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f34e53e6-d25e-4619-8b73-8b9486c531eb-serving-cert\") pod \"route-controller-manager-6576b87f9c-8mn46\" (UID: \"f34e53e6-d25e-4619-8b73-8b9486c531eb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mn46" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.755989 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f04071f-b72a-4234-92e3-1cd5e6987a58-config\") pod \"console-operator-58897d9998-dfctz\" (UID: \"2f04071f-b72a-4234-92e3-1cd5e6987a58\") " pod="openshift-console-operator/console-operator-58897d9998-dfctz" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.756008 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbr4p\" (UniqueName: \"kubernetes.io/projected/b5488b64-893e-49f7-9de1-99905faf0d3b-kube-api-access-rbr4p\") pod \"collect-profiles-29333265-qxg85\" (UID: \"b5488b64-893e-49f7-9de1-99905faf0d3b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333265-qxg85" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.756025 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abd885f8-e479-48ec-9341-0acbbc3c3ea7-config\") pod \"service-ca-operator-777779d784-zrjcb\" (UID: \"abd885f8-e479-48ec-9341-0acbbc3c3ea7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zrjcb" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.756045 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b24ee722-8046-4655-a354-4a25a9b16b6a-audit-policies\") pod \"oauth-openshift-558db77b4-q5ck7\" (UID: 
\"b24ee722-8046-4655-a354-4a25a9b16b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.756061 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-q5ck7\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.756080 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3da117f6-b889-480f-b74b-5841bc551658-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xh68m\" (UID: \"3da117f6-b889-480f-b74b-5841bc551658\") " pod="openshift-apiserver/apiserver-76f77b778f-xh68m" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.756097 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whfwk\" (UniqueName: \"kubernetes.io/projected/3647d392-87e6-4708-a6f5-060e250a71ad-kube-api-access-whfwk\") pod \"package-server-manager-789f6589d5-sxdhp\" (UID: \"3647d392-87e6-4708-a6f5-060e250a71ad\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sxdhp" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.756123 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3da117f6-b889-480f-b74b-5841bc551658-config\") pod \"apiserver-76f77b778f-xh68m\" (UID: \"3da117f6-b889-480f-b74b-5841bc551658\") " pod="openshift-apiserver/apiserver-76f77b778f-xh68m" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.756139 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/3da117f6-b889-480f-b74b-5841bc551658-audit-dir\") pod \"apiserver-76f77b778f-xh68m\" (UID: \"3da117f6-b889-480f-b74b-5841bc551658\") " pod="openshift-apiserver/apiserver-76f77b778f-xh68m" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.756155 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-q5ck7\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.756173 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bcd21f4-d2ef-4f93-8381-c85574a627e8-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sw6f9\" (UID: \"0bcd21f4-d2ef-4f93-8381-c85574a627e8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sw6f9" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.756192 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-q5ck7\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.756208 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8130c7b7-7b74-461e-8348-59345d86aa6b-socket-dir\") pod \"csi-hostpathplugin-575dw\" (UID: \"8130c7b7-7b74-461e-8348-59345d86aa6b\") " pod="hostpath-provisioner/csi-hostpathplugin-575dw" Oct 09 07:48:31 
crc kubenswrapper[4715]: I1009 07:48:31.756251 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngfck\" (UniqueName: \"kubernetes.io/projected/2ca5fc0f-ee77-4a25-985e-8b50fbe3ddf0-kube-api-access-ngfck\") pod \"ingress-operator-5b745b69d9-w79tf\" (UID: \"2ca5fc0f-ee77-4a25-985e-8b50fbe3ddf0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w79tf"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.756269 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/915f6370-d5b2-4c9e-a1b1-c3146612b3ce-ca-trust-extracted\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.756600 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/915f6370-d5b2-4c9e-a1b1-c3146612b3ce-ca-trust-extracted\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.756926 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d58f8de-5bf0-4b20-937a-1dbd52ed512e-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-hbzph\" (UID: \"0d58f8de-5bf0-4b20-937a-1dbd52ed512e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hbzph"
Oct 09 07:48:31 crc kubenswrapper[4715]: E1009 07:48:31.757214 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:32.257202144 +0000 UTC m=+142.950006152 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.757259 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/51a9931f-92cf-4ccd-a7c4-618ed079cb5b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-btd5r\" (UID: \"51a9931f-92cf-4ccd-a7c4-618ed079cb5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-btd5r"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.757954 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/731cd25c-cf3d-4428-a8bd-7aa00385de1e-proxy-tls\") pod \"machine-config-operator-74547568cd-qxqln\" (UID: \"731cd25c-cf3d-4428-a8bd-7aa00385de1e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qxqln"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.757994 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/731cd25c-cf3d-4428-a8bd-7aa00385de1e-images\") pod \"machine-config-operator-74547568cd-qxqln\" (UID: \"731cd25c-cf3d-4428-a8bd-7aa00385de1e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qxqln"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.758015 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/731cd25c-cf3d-4428-a8bd-7aa00385de1e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-qxqln\" (UID: \"731cd25c-cf3d-4428-a8bd-7aa00385de1e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qxqln"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.758050 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e32fe67-b32a-4fe6-869a-5fe2d4877352-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-l6l2m\" (UID: \"0e32fe67-b32a-4fe6-869a-5fe2d4877352\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l6l2m"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.758932 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b24ee722-8046-4655-a354-4a25a9b16b6a-audit-policies\") pod \"oauth-openshift-558db77b4-q5ck7\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.760493 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3da117f6-b889-480f-b74b-5841bc551658-config\") pod \"apiserver-76f77b778f-xh68m\" (UID: \"3da117f6-b889-480f-b74b-5841bc551658\") " pod="openshift-apiserver/apiserver-76f77b778f-xh68m"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.760524 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f34e53e6-d25e-4619-8b73-8b9486c531eb-config\") pod \"route-controller-manager-6576b87f9c-8mn46\" (UID: \"f34e53e6-d25e-4619-8b73-8b9486c531eb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mn46"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.761212 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f04071f-b72a-4234-92e3-1cd5e6987a58-config\") pod \"console-operator-58897d9998-dfctz\" (UID: \"2f04071f-b72a-4234-92e3-1cd5e6987a58\") " pod="openshift-console-operator/console-operator-58897d9998-dfctz"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.761497 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3da117f6-b889-480f-b74b-5841bc551658-audit-dir\") pod \"apiserver-76f77b778f-xh68m\" (UID: \"3da117f6-b889-480f-b74b-5841bc551658\") " pod="openshift-apiserver/apiserver-76f77b778f-xh68m"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.762013 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-q5ck7\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.762449 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f34e53e6-d25e-4619-8b73-8b9486c531eb-client-ca\") pod \"route-controller-manager-6576b87f9c-8mn46\" (UID: \"f34e53e6-d25e-4619-8b73-8b9486c531eb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mn46"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.762558 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/915f6370-d5b2-4c9e-a1b1-c3146612b3ce-registry-certificates\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.762659 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-q5ck7\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.763302 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3da117f6-b889-480f-b74b-5841bc551658-audit\") pod \"apiserver-76f77b778f-xh68m\" (UID: \"3da117f6-b889-480f-b74b-5841bc551658\") " pod="openshift-apiserver/apiserver-76f77b778f-xh68m"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.763399 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3da117f6-b889-480f-b74b-5841bc551658-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xh68m\" (UID: \"3da117f6-b889-480f-b74b-5841bc551658\") " pod="openshift-apiserver/apiserver-76f77b778f-xh68m"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.763723 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/915f6370-d5b2-4c9e-a1b1-c3146612b3ce-registry-tls\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.765021 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-q5ck7\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.765959 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-q5ck7\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.766686 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f34e53e6-d25e-4619-8b73-8b9486c531eb-serving-cert\") pod \"route-controller-manager-6576b87f9c-8mn46\" (UID: \"f34e53e6-d25e-4619-8b73-8b9486c531eb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mn46"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.767387 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-q5ck7\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.768267 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bdcbb990-46a3-4a26-a68c-ee9758ef1631-profile-collector-cert\") pod \"olm-operator-6b444d44fb-nz9jb\" (UID: \"bdcbb990-46a3-4a26-a68c-ee9758ef1631\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nz9jb"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.768477 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bdcbb990-46a3-4a26-a68c-ee9758ef1631-srv-cert\") pod \"olm-operator-6b444d44fb-nz9jb\" (UID: \"bdcbb990-46a3-4a26-a68c-ee9758ef1631\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nz9jb"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.768777 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/915f6370-d5b2-4c9e-a1b1-c3146612b3ce-installation-pull-secrets\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.769495 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/18a93eea-f768-41ac-ae21-1d29a90f5f66-metrics-certs\") pod \"router-default-5444994796-b4lqp\" (UID: \"18a93eea-f768-41ac-ae21-1d29a90f5f66\") " pod="openshift-ingress/router-default-5444994796-b4lqp"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.769990 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/18a93eea-f768-41ac-ae21-1d29a90f5f66-default-certificate\") pod \"router-default-5444994796-b4lqp\" (UID: \"18a93eea-f768-41ac-ae21-1d29a90f5f66\") " pod="openshift-ingress/router-default-5444994796-b4lqp"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.770390 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3da117f6-b889-480f-b74b-5841bc551658-encryption-config\") pod \"apiserver-76f77b778f-xh68m\" (UID: \"3da117f6-b889-480f-b74b-5841bc551658\") " pod="openshift-apiserver/apiserver-76f77b778f-xh68m"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.770736 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3da117f6-b889-480f-b74b-5841bc551658-serving-cert\") pod \"apiserver-76f77b778f-xh68m\" (UID: \"3da117f6-b889-480f-b74b-5841bc551658\") " pod="openshift-apiserver/apiserver-76f77b778f-xh68m"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.774879 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/18a93eea-f768-41ac-ae21-1d29a90f5f66-stats-auth\") pod \"router-default-5444994796-b4lqp\" (UID: \"18a93eea-f768-41ac-ae21-1d29a90f5f66\") " pod="openshift-ingress/router-default-5444994796-b4lqp"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.779165 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8nqw\" (UniqueName: \"kubernetes.io/projected/18a93eea-f768-41ac-ae21-1d29a90f5f66-kube-api-access-k8nqw\") pod \"router-default-5444994796-b4lqp\" (UID: \"18a93eea-f768-41ac-ae21-1d29a90f5f66\") " pod="openshift-ingress/router-default-5444994796-b4lqp"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.779176 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6483da2-54be-4754-9da9-7ad2af3788b3-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-9grfz\" (UID: \"f6483da2-54be-4754-9da9-7ad2af3788b3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9grfz"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.779604 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-q5ck7\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.780303 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3da117f6-b889-480f-b74b-5841bc551658-etcd-client\") pod \"apiserver-76f77b778f-xh68m\" (UID: \"3da117f6-b889-480f-b74b-5841bc551658\") " pod="openshift-apiserver/apiserver-76f77b778f-xh68m"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.780691 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-q5ck7\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.789689 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3647d392-87e6-4708-a6f5-060e250a71ad-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-sxdhp\" (UID: \"3647d392-87e6-4708-a6f5-060e250a71ad\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sxdhp"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.801930 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7kxm\" (UniqueName: \"kubernetes.io/projected/731cd25c-cf3d-4428-a8bd-7aa00385de1e-kube-api-access-m7kxm\") pod \"machine-config-operator-74547568cd-qxqln\" (UID: \"731cd25c-cf3d-4428-a8bd-7aa00385de1e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qxqln"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.815111 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qxqln"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.823167 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51a9931f-92cf-4ccd-a7c4-618ed079cb5b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-btd5r\" (UID: \"51a9931f-92cf-4ccd-a7c4-618ed079cb5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-btd5r"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.844292 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcx5j\" (UniqueName: \"kubernetes.io/projected/5eb07619-4575-4662-afd0-58a658ebac12-kube-api-access-jcx5j\") pod \"machine-config-controller-84d6567774-94mjv\" (UID: \"5eb07619-4575-4662-afd0-58a658ebac12\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-94mjv"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.851502 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-5fdhg"]
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.857174 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 09 07:48:31 crc kubenswrapper[4715]: E1009 07:48:31.858130 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:32.358093705 +0000 UTC m=+143.050897723 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.859749 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/dec5cf82-ad38-452a-9330-cc685017bb8d-certs\") pod \"machine-config-server-cjqqc\" (UID: \"dec5cf82-ad38-452a-9330-cc685017bb8d\") " pod="openshift-machine-config-operator/machine-config-server-cjqqc"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.859788 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0bcd21f4-d2ef-4f93-8381-c85574a627e8-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sw6f9\" (UID: \"0bcd21f4-d2ef-4f93-8381-c85574a627e8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sw6f9"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.859842 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbr4p\" (UniqueName: \"kubernetes.io/projected/b5488b64-893e-49f7-9de1-99905faf0d3b-kube-api-access-rbr4p\") pod \"collect-profiles-29333265-qxg85\" (UID: \"b5488b64-893e-49f7-9de1-99905faf0d3b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333265-qxg85"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.859872 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abd885f8-e479-48ec-9341-0acbbc3c3ea7-config\") pod \"service-ca-operator-777779d784-zrjcb\" (UID: \"abd885f8-e479-48ec-9341-0acbbc3c3ea7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zrjcb"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.859924 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bcd21f4-d2ef-4f93-8381-c85574a627e8-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sw6f9\" (UID: \"0bcd21f4-d2ef-4f93-8381-c85574a627e8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sw6f9"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.859948 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8130c7b7-7b74-461e-8348-59345d86aa6b-socket-dir\") pod \"csi-hostpathplugin-575dw\" (UID: \"8130c7b7-7b74-461e-8348-59345d86aa6b\") " pod="hostpath-provisioner/csi-hostpathplugin-575dw"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.860020 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngfck\" (UniqueName: \"kubernetes.io/projected/2ca5fc0f-ee77-4a25-985e-8b50fbe3ddf0-kube-api-access-ngfck\") pod \"ingress-operator-5b745b69d9-w79tf\" (UID: \"2ca5fc0f-ee77-4a25-985e-8b50fbe3ddf0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w79tf"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.860047 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6tj5\" (UniqueName: \"kubernetes.io/projected/03c8f335-ee7a-4f93-9a1f-47247090dffd-kube-api-access-h6tj5\") pod \"service-ca-9c57cc56f-qf4bm\" (UID: \"03c8f335-ee7a-4f93-9a1f-47247090dffd\") " pod="openshift-service-ca/service-ca-9c57cc56f-qf4bm"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.860067 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abd885f8-e479-48ec-9341-0acbbc3c3ea7-serving-cert\") pod \"service-ca-operator-777779d784-zrjcb\" (UID: \"abd885f8-e479-48ec-9341-0acbbc3c3ea7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zrjcb"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.860089 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4382581-9ebd-4fef-b530-1a6d32c65d61-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-brr9q\" (UID: \"d4382581-9ebd-4fef-b530-1a6d32c65d61\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-brr9q"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.860113 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4zjq\" (UniqueName: \"kubernetes.io/projected/d4382581-9ebd-4fef-b530-1a6d32c65d61-kube-api-access-t4zjq\") pod \"kube-storage-version-migrator-operator-b67b599dd-brr9q\" (UID: \"d4382581-9ebd-4fef-b530-1a6d32c65d61\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-brr9q"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.860149 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/41f83b9f-ebe1-42c4-ae44-36775e449efe-tmpfs\") pod \"packageserver-d55dfcdfc-5vwlz\" (UID: \"41f83b9f-ebe1-42c4-ae44-36775e449efe\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vwlz"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.860168 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5f316b42-23b5-4041-9dc3-3b95676339e5-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-4gncq\" (UID: \"5f316b42-23b5-4041-9dc3-3b95676339e5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4gncq"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.860188 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lq7p\" (UniqueName: \"kubernetes.io/projected/4005d046-0643-40a6-a748-8ecacb0f1541-kube-api-access-7lq7p\") pod \"catalog-operator-68c6474976-ttk8w\" (UID: \"4005d046-0643-40a6-a748-8ecacb0f1541\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ttk8w"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.860210 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/03c8f335-ee7a-4f93-9a1f-47247090dffd-signing-key\") pod \"service-ca-9c57cc56f-qf4bm\" (UID: \"03c8f335-ee7a-4f93-9a1f-47247090dffd\") " pod="openshift-service-ca/service-ca-9c57cc56f-qf4bm"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.860234 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9040c858-1f6b-4900-b34e-c8b0b0c4c1ec-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hzbn9\" (UID: \"9040c858-1f6b-4900-b34e-c8b0b0c4c1ec\") " pod="openshift-marketplace/marketplace-operator-79b997595-hzbn9"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.860690 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8130c7b7-7b74-461e-8348-59345d86aa6b-plugins-dir\") pod \"csi-hostpathplugin-575dw\" (UID: \"8130c7b7-7b74-461e-8348-59345d86aa6b\") " pod="hostpath-provisioner/csi-hostpathplugin-575dw"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.860713 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b5488b64-893e-49f7-9de1-99905faf0d3b-secret-volume\") pod \"collect-profiles-29333265-qxg85\" (UID: \"b5488b64-893e-49f7-9de1-99905faf0d3b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333265-qxg85"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.860733 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72t4q\" (UniqueName: \"kubernetes.io/projected/9040c858-1f6b-4900-b34e-c8b0b0c4c1ec-kube-api-access-72t4q\") pod \"marketplace-operator-79b997595-hzbn9\" (UID: \"9040c858-1f6b-4900-b34e-c8b0b0c4c1ec\") " pod="openshift-marketplace/marketplace-operator-79b997595-hzbn9"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.860761 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8130c7b7-7b74-461e-8348-59345d86aa6b-csi-data-dir\") pod \"csi-hostpathplugin-575dw\" (UID: \"8130c7b7-7b74-461e-8348-59345d86aa6b\") " pod="hostpath-provisioner/csi-hostpathplugin-575dw"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.860778 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b5488b64-893e-49f7-9de1-99905faf0d3b-config-volume\") pod \"collect-profiles-29333265-qxg85\" (UID: \"b5488b64-893e-49f7-9de1-99905faf0d3b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333265-qxg85"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.860829 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dll6k\" (UniqueName: \"kubernetes.io/projected/45129fb6-ace0-4181-b4ce-c5a7e6787606-kube-api-access-dll6k\") pod \"ingress-canary-m526v\" (UID: \"45129fb6-ace0-4181-b4ce-c5a7e6787606\") " pod="openshift-ingress-canary/ingress-canary-m526v"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.860847 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9040c858-1f6b-4900-b34e-c8b0b0c4c1ec-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hzbn9\" (UID: \"9040c858-1f6b-4900-b34e-c8b0b0c4c1ec\") " pod="openshift-marketplace/marketplace-operator-79b997595-hzbn9"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.860864 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbk92\" (UniqueName: \"kubernetes.io/projected/8130c7b7-7b74-461e-8348-59345d86aa6b-kube-api-access-sbk92\") pod \"csi-hostpathplugin-575dw\" (UID: \"8130c7b7-7b74-461e-8348-59345d86aa6b\") " pod="hostpath-provisioner/csi-hostpathplugin-575dw"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.860889 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slfps\" (UniqueName: \"kubernetes.io/projected/41f83b9f-ebe1-42c4-ae44-36775e449efe-kube-api-access-slfps\") pod \"packageserver-d55dfcdfc-5vwlz\" (UID: \"41f83b9f-ebe1-42c4-ae44-36775e449efe\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vwlz"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.860907 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2ca5fc0f-ee77-4a25-985e-8b50fbe3ddf0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-w79tf\" (UID: \"2ca5fc0f-ee77-4a25-985e-8b50fbe3ddf0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w79tf"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.860928 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/41f83b9f-ebe1-42c4-ae44-36775e449efe-apiservice-cert\") pod \"packageserver-d55dfcdfc-5vwlz\" (UID: \"41f83b9f-ebe1-42c4-ae44-36775e449efe\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vwlz"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.860947 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4382581-9ebd-4fef-b530-1a6d32c65d61-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-brr9q\" (UID: \"d4382581-9ebd-4fef-b530-1a6d32c65d61\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-brr9q"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.861300 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45129fb6-ace0-4181-b4ce-c5a7e6787606-cert\") pod \"ingress-canary-m526v\" (UID: \"45129fb6-ace0-4181-b4ce-c5a7e6787606\") " pod="openshift-ingress-canary/ingress-canary-m526v"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.861351 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf57q\" (UniqueName: \"kubernetes.io/projected/dec5cf82-ad38-452a-9330-cc685017bb8d-kube-api-access-vf57q\") pod \"machine-config-server-cjqqc\" (UID: \"dec5cf82-ad38-452a-9330-cc685017bb8d\") " pod="openshift-machine-config-operator/machine-config-server-cjqqc"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.861378 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/03c8f335-ee7a-4f93-9a1f-47247090dffd-signing-cabundle\") pod \"service-ca-9c57cc56f-qf4bm\" (UID: \"03c8f335-ee7a-4f93-9a1f-47247090dffd\") " pod="openshift-service-ca/service-ca-9c57cc56f-qf4bm"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.861406 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8wd8\" (UniqueName: \"kubernetes.io/projected/19912eec-ab9f-4e07-8458-6867269f1a42-kube-api-access-f8wd8\") pod \"migrator-59844c95c7-hbn7p\" (UID: \"19912eec-ab9f-4e07-8458-6867269f1a42\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hbn7p"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.861450 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4005d046-0643-40a6-a748-8ecacb0f1541-srv-cert\") pod \"catalog-operator-68c6474976-ttk8w\" (UID: \"4005d046-0643-40a6-a748-8ecacb0f1541\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ttk8w"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.861481 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/76a49d30-4e29-483e-8837-f4cbcb919e06-metrics-tls\") pod \"dns-default-7wthp\" (UID: \"76a49d30-4e29-483e-8837-f4cbcb919e06\") " pod="openshift-dns/dns-default-7wthp"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.861514 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2ca5fc0f-ee77-4a25-985e-8b50fbe3ddf0-trusted-ca\") pod \"ingress-operator-5b745b69d9-w79tf\" (UID: \"2ca5fc0f-ee77-4a25-985e-8b50fbe3ddf0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w79tf"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.861538 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tt8j\" (UniqueName: \"kubernetes.io/projected/5f316b42-23b5-4041-9dc3-3b95676339e5-kube-api-access-6tt8j\") pod \"multus-admission-controller-857f4d67dd-4gncq\" (UID: \"5f316b42-23b5-4041-9dc3-3b95676339e5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4gncq"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.861599 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/dec5cf82-ad38-452a-9330-cc685017bb8d-node-bootstrap-token\") pod \"machine-config-server-cjqqc\" (UID: \"dec5cf82-ad38-452a-9330-cc685017bb8d\") " pod="openshift-machine-config-operator/machine-config-server-cjqqc"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.861633 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bcd21f4-d2ef-4f93-8381-c85574a627e8-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sw6f9\" (UID: \"0bcd21f4-d2ef-4f93-8381-c85574a627e8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sw6f9"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.861654 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2ca5fc0f-ee77-4a25-985e-8b50fbe3ddf0-metrics-tls\") pod \"ingress-operator-5b745b69d9-w79tf\" (UID: \"2ca5fc0f-ee77-4a25-985e-8b50fbe3ddf0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w79tf"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.861675 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h879d\" (UniqueName: \"kubernetes.io/projected/76a49d30-4e29-483e-8837-f4cbcb919e06-kube-api-access-h879d\") pod \"dns-default-7wthp\" (UID: \"76a49d30-4e29-483e-8837-f4cbcb919e06\") " pod="openshift-dns/dns-default-7wthp"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.861695 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/41f83b9f-ebe1-42c4-ae44-36775e449efe-webhook-cert\") pod \"packageserver-d55dfcdfc-5vwlz\" (UID: \"41f83b9f-ebe1-42c4-ae44-36775e449efe\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vwlz"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.861724 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8130c7b7-7b74-461e-8348-59345d86aa6b-registration-dir\") pod \"csi-hostpathplugin-575dw\" (UID: \"8130c7b7-7b74-461e-8348-59345d86aa6b\") " pod="hostpath-provisioner/csi-hostpathplugin-575dw"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.862048 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkfcp\" (UniqueName: \"kubernetes.io/projected/b24ee722-8046-4655-a354-4a25a9b16b6a-kube-api-access-tkfcp\") pod \"oauth-openshift-558db77b4-q5ck7\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.863016 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4005d046-0643-40a6-a748-8ecacb0f1541-profile-collector-cert\") pod \"catalog-operator-68c6474976-ttk8w\" (UID: \"4005d046-0643-40a6-a748-8ecacb0f1541\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ttk8w"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.863246 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.863278 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8130c7b7-7b74-461e-8348-59345d86aa6b-mountpoint-dir\") pod \"csi-hostpathplugin-575dw\" (UID: \"8130c7b7-7b74-461e-8348-59345d86aa6b\") " pod="hostpath-provisioner/csi-hostpathplugin-575dw"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.863347 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76a49d30-4e29-483e-8837-f4cbcb919e06-config-volume\") pod \"dns-default-7wthp\" (UID: \"76a49d30-4e29-483e-8837-f4cbcb919e06\") " pod="openshift-dns/dns-default-7wthp"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.863385 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plnnz\" (UniqueName: \"kubernetes.io/projected/abd885f8-e479-48ec-9341-0acbbc3c3ea7-kube-api-access-plnnz\") pod \"service-ca-operator-777779d784-zrjcb\" (UID: \"abd885f8-e479-48ec-9341-0acbbc3c3ea7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zrjcb"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.863459 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/03c8f335-ee7a-4f93-9a1f-47247090dffd-signing-cabundle\") pod \"service-ca-9c57cc56f-qf4bm\" (UID: \"03c8f335-ee7a-4f93-9a1f-47247090dffd\") " pod="openshift-service-ca/service-ca-9c57cc56f-qf4bm"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.864049 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4382581-9ebd-4fef-b530-1a6d32c65d61-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-brr9q\" (UID: \"d4382581-9ebd-4fef-b530-1a6d32c65d61\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-brr9q"
Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.864936 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/dec5cf82-ad38-452a-9330-cc685017bb8d-certs\") pod \"machine-config-server-cjqqc\" (UID: \"dec5cf82-ad38-452a-9330-cc685017bb8d\") " pod="openshift-machine-config-operator/machine-config-server-cjqqc"
Oct 09 07:48:31 crc
kubenswrapper[4715]: I1009 07:48:31.864938 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8130c7b7-7b74-461e-8348-59345d86aa6b-registration-dir\") pod \"csi-hostpathplugin-575dw\" (UID: \"8130c7b7-7b74-461e-8348-59345d86aa6b\") " pod="hostpath-provisioner/csi-hostpathplugin-575dw" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.865041 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8130c7b7-7b74-461e-8348-59345d86aa6b-mountpoint-dir\") pod \"csi-hostpathplugin-575dw\" (UID: \"8130c7b7-7b74-461e-8348-59345d86aa6b\") " pod="hostpath-provisioner/csi-hostpathplugin-575dw" Oct 09 07:48:31 crc kubenswrapper[4715]: E1009 07:48:31.865344 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:32.365323176 +0000 UTC m=+143.058127184 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.865587 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abd885f8-e479-48ec-9341-0acbbc3c3ea7-serving-cert\") pod \"service-ca-operator-777779d784-zrjcb\" (UID: \"abd885f8-e479-48ec-9341-0acbbc3c3ea7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zrjcb" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.865988 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b5488b64-893e-49f7-9de1-99905faf0d3b-secret-volume\") pod \"collect-profiles-29333265-qxg85\" (UID: \"b5488b64-893e-49f7-9de1-99905faf0d3b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333265-qxg85" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.866064 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8130c7b7-7b74-461e-8348-59345d86aa6b-csi-data-dir\") pod \"csi-hostpathplugin-575dw\" (UID: \"8130c7b7-7b74-461e-8348-59345d86aa6b\") " pod="hostpath-provisioner/csi-hostpathplugin-575dw" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.872526 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b5488b64-893e-49f7-9de1-99905faf0d3b-config-volume\") pod \"collect-profiles-29333265-qxg85\" (UID: \"b5488b64-893e-49f7-9de1-99905faf0d3b\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29333265-qxg85" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.872594 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76a49d30-4e29-483e-8837-f4cbcb919e06-config-volume\") pod \"dns-default-7wthp\" (UID: \"76a49d30-4e29-483e-8837-f4cbcb919e06\") " pod="openshift-dns/dns-default-7wthp" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.872765 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9040c858-1f6b-4900-b34e-c8b0b0c4c1ec-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hzbn9\" (UID: \"9040c858-1f6b-4900-b34e-c8b0b0c4c1ec\") " pod="openshift-marketplace/marketplace-operator-79b997595-hzbn9" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.873346 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/dec5cf82-ad38-452a-9330-cc685017bb8d-node-bootstrap-token\") pod \"machine-config-server-cjqqc\" (UID: \"dec5cf82-ad38-452a-9330-cc685017bb8d\") " pod="openshift-machine-config-operator/machine-config-server-cjqqc" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.873553 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8130c7b7-7b74-461e-8348-59345d86aa6b-plugins-dir\") pod \"csi-hostpathplugin-575dw\" (UID: \"8130c7b7-7b74-461e-8348-59345d86aa6b\") " pod="hostpath-provisioner/csi-hostpathplugin-575dw" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.873647 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2ca5fc0f-ee77-4a25-985e-8b50fbe3ddf0-trusted-ca\") pod \"ingress-operator-5b745b69d9-w79tf\" (UID: \"2ca5fc0f-ee77-4a25-985e-8b50fbe3ddf0\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w79tf" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.873953 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/41f83b9f-ebe1-42c4-ae44-36775e449efe-tmpfs\") pod \"packageserver-d55dfcdfc-5vwlz\" (UID: \"41f83b9f-ebe1-42c4-ae44-36775e449efe\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vwlz" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.874480 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/76a49d30-4e29-483e-8837-f4cbcb919e06-metrics-tls\") pod \"dns-default-7wthp\" (UID: \"76a49d30-4e29-483e-8837-f4cbcb919e06\") " pod="openshift-dns/dns-default-7wthp" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.874504 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bcd21f4-d2ef-4f93-8381-c85574a627e8-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sw6f9\" (UID: \"0bcd21f4-d2ef-4f93-8381-c85574a627e8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sw6f9" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.874921 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/41f83b9f-ebe1-42c4-ae44-36775e449efe-webhook-cert\") pod \"packageserver-d55dfcdfc-5vwlz\" (UID: \"41f83b9f-ebe1-42c4-ae44-36775e449efe\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vwlz" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.874910 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45129fb6-ace0-4181-b4ce-c5a7e6787606-cert\") pod \"ingress-canary-m526v\" (UID: \"45129fb6-ace0-4181-b4ce-c5a7e6787606\") " 
pod="openshift-ingress-canary/ingress-canary-m526v" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.875325 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2ca5fc0f-ee77-4a25-985e-8b50fbe3ddf0-metrics-tls\") pod \"ingress-operator-5b745b69d9-w79tf\" (UID: \"2ca5fc0f-ee77-4a25-985e-8b50fbe3ddf0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w79tf" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.875557 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/03c8f335-ee7a-4f93-9a1f-47247090dffd-signing-key\") pod \"service-ca-9c57cc56f-qf4bm\" (UID: \"03c8f335-ee7a-4f93-9a1f-47247090dffd\") " pod="openshift-service-ca/service-ca-9c57cc56f-qf4bm" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.875576 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5f316b42-23b5-4041-9dc3-3b95676339e5-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-4gncq\" (UID: \"5f316b42-23b5-4041-9dc3-3b95676339e5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4gncq" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.875811 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4005d046-0643-40a6-a748-8ecacb0f1541-profile-collector-cert\") pod \"catalog-operator-68c6474976-ttk8w\" (UID: \"4005d046-0643-40a6-a748-8ecacb0f1541\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ttk8w" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.877018 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abd885f8-e479-48ec-9341-0acbbc3c3ea7-config\") pod \"service-ca-operator-777779d784-zrjcb\" (UID: 
\"abd885f8-e479-48ec-9341-0acbbc3c3ea7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zrjcb" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.877118 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bcd21f4-d2ef-4f93-8381-c85574a627e8-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sw6f9\" (UID: \"0bcd21f4-d2ef-4f93-8381-c85574a627e8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sw6f9" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.879598 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8130c7b7-7b74-461e-8348-59345d86aa6b-socket-dir\") pod \"csi-hostpathplugin-575dw\" (UID: \"8130c7b7-7b74-461e-8348-59345d86aa6b\") " pod="hostpath-provisioner/csi-hostpathplugin-575dw" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.882070 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-974wk\" (UniqueName: \"kubernetes.io/projected/ad01aea7-211a-4ff5-b15b-fb696917dc52-kube-api-access-974wk\") pod \"control-plane-machine-set-operator-78cbb6b69f-prhvm\" (UID: \"ad01aea7-211a-4ff5-b15b-fb696917dc52\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-prhvm" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.882664 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9040c858-1f6b-4900-b34e-c8b0b0c4c1ec-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hzbn9\" (UID: \"9040c858-1f6b-4900-b34e-c8b0b0c4c1ec\") " pod="openshift-marketplace/marketplace-operator-79b997595-hzbn9" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.888056 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/41f83b9f-ebe1-42c4-ae44-36775e449efe-apiservice-cert\") pod \"packageserver-d55dfcdfc-5vwlz\" (UID: \"41f83b9f-ebe1-42c4-ae44-36775e449efe\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vwlz" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.888511 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4382581-9ebd-4fef-b530-1a6d32c65d61-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-brr9q\" (UID: \"d4382581-9ebd-4fef-b530-1a6d32c65d61\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-brr9q" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.893327 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4005d046-0643-40a6-a748-8ecacb0f1541-srv-cert\") pod \"catalog-operator-68c6474976-ttk8w\" (UID: \"4005d046-0643-40a6-a748-8ecacb0f1541\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ttk8w" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.900818 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-lr9f9"] Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.906437 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plgs5\" (UniqueName: \"kubernetes.io/projected/0e32fe67-b32a-4fe6-869a-5fe2d4877352-kube-api-access-plgs5\") pod \"openshift-controller-manager-operator-756b6f6bc6-l6l2m\" (UID: \"0e32fe67-b32a-4fe6-869a-5fe2d4877352\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l6l2m" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.923119 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8slg5\" (UniqueName: 
\"kubernetes.io/projected/51a9931f-92cf-4ccd-a7c4-618ed079cb5b-kube-api-access-8slg5\") pod \"cluster-image-registry-operator-dc59b4c8b-btd5r\" (UID: \"51a9931f-92cf-4ccd-a7c4-618ed079cb5b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-btd5r" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.929035 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-lr9f9" event={"ID":"7d543303-0d6d-4c3d-bb4a-bb216d9def25","Type":"ContainerStarted","Data":"d45ccc131b0fe7ef47aed8b9f5864d35e0ea37978367b3df6d4c863528c25550"} Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.930929 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xf4mc" event={"ID":"d6e519e5-cb0b-40a4-a419-546ac0a3de69","Type":"ContainerStarted","Data":"bdaf1865f422cea759fd20723d9fb3a4b66fe2fb762ac74bf0bce52e0263c0bd"} Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.930960 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xf4mc" event={"ID":"d6e519e5-cb0b-40a4-a419-546ac0a3de69","Type":"ContainerStarted","Data":"0a6adfec2af2a3af63d4ff62fe19b60f2df022734841a166b3c60e5fbb3f2e78"} Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.931717 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-xf4mc" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.932740 4715 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-xf4mc container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.932773 4715 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-controller-manager/controller-manager-879f6c89f-xf4mc" podUID="d6e519e5-cb0b-40a4-a419-546ac0a3de69" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.937792 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cjrbk" event={"ID":"749e25cc-b0d4-42b6-831c-d6af247de9f9","Type":"ContainerStarted","Data":"961c744fa9e5a910ad098044deee5164257e1ef716adbd4b126774632119e5da"} Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.937839 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cjrbk" event={"ID":"749e25cc-b0d4-42b6-831c-d6af247de9f9","Type":"ContainerStarted","Data":"81b95c004ebeddd2aa1feb02d33865f6f768b6cfe91f1acc505e5cdd1dc62137"} Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.939942 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vgfv\" (UniqueName: \"kubernetes.io/projected/2f04071f-b72a-4234-92e3-1cd5e6987a58-kube-api-access-8vgfv\") pod \"console-operator-58897d9998-dfctz\" (UID: \"2f04071f-b72a-4234-92e3-1cd5e6987a58\") " pod="openshift-console-operator/console-operator-58897d9998-dfctz" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.949117 4715 generic.go:334] "Generic (PLEG): container finished" podID="3f7b7054-dbf2-4878-9ded-127719d0afb3" containerID="6d197e8c8695cbf08974cb4a7cfeef95c202a355ce6b587ac8d9e05994f3b5f7" exitCode=0 Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.950746 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z9cgk" 
event={"ID":"3f7b7054-dbf2-4878-9ded-127719d0afb3","Type":"ContainerDied","Data":"6d197e8c8695cbf08974cb4a7cfeef95c202a355ce6b587ac8d9e05994f3b5f7"} Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.950953 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z9cgk" event={"ID":"3f7b7054-dbf2-4878-9ded-127719d0afb3","Type":"ContainerStarted","Data":"a5ae036a60afc9177a0ff7693a5f98cb262a91450a1fe9f2682d3ab583d557fb"} Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.950995 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.957178 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6t7zt" event={"ID":"bae5cd41-0015-4df3-bfe7-c2937a5938b6","Type":"ContainerStarted","Data":"91638f88e274f0759c900a3b1abd1120acb808a07026421828be2e4fe944761d"} Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.957220 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6t7zt" event={"ID":"bae5cd41-0015-4df3-bfe7-c2937a5938b6","Type":"ContainerStarted","Data":"91855c2393664b9b9293c75273b0074b66059194edffe4d8d068d766697906f1"} Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.959393 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f6483da2-54be-4754-9da9-7ad2af3788b3-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-9grfz\" (UID: \"f6483da2-54be-4754-9da9-7ad2af3788b3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9grfz" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.960687 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/machine-api-operator-5694c8668f-5tfh2" event={"ID":"93259cc2-6847-41dc-a61d-83e7b9e67f3a","Type":"ContainerStarted","Data":"0bfc73fb86c80d38236b4a542017107f6b90957a734e0dd5b9c2f99081539b32"} Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.960707 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-5tfh2" event={"ID":"93259cc2-6847-41dc-a61d-83e7b9e67f3a","Type":"ContainerStarted","Data":"f13a03ede9ecbf2bc49c46bc7f5b33cd954be9d93124b6110c70dad2db6af5ad"} Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.960717 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-5tfh2" event={"ID":"93259cc2-6847-41dc-a61d-83e7b9e67f3a","Type":"ContainerStarted","Data":"8384d15f2a59e3306405f41dda2d8cdf954e85364dc60699d1f5bee2e431e194"} Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.966495 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:31 crc kubenswrapper[4715]: E1009 07:48:31.966823 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:32.466785253 +0000 UTC m=+143.159589251 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.967150 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-w58s7"] Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.967156 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.967500 4715 generic.go:334] "Generic (PLEG): container finished" podID="3985b442-52af-4652-a129-de4aa904321f" containerID="a30b810f656b56115a6c203b45e6b6e5f6887bf3cbd83b0faa1b5ccec67aa7a0" exitCode=0 Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.967742 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6gwtn" event={"ID":"3985b442-52af-4652-a129-de4aa904321f","Type":"ContainerDied","Data":"a30b810f656b56115a6c203b45e6b6e5f6887bf3cbd83b0faa1b5ccec67aa7a0"} Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.967761 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6gwtn" event={"ID":"3985b442-52af-4652-a129-de4aa904321f","Type":"ContainerStarted","Data":"5ae925b5c186b9fd382aba3d2ccc4f95d8b7e512f8d3278c178b772efe745c83"} Oct 09 
07:48:31 crc kubenswrapper[4715]: E1009 07:48:31.968728 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:32.4687199 +0000 UTC m=+143.161523908 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.969439 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5v4sd" event={"ID":"67298510-55be-44bd-a0d7-0988939fdf66","Type":"ContainerStarted","Data":"73fdcf5c99bc824d048727b52849cdfd800a7f30913d26f675598f631c4080f4"} Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.969458 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5v4sd" event={"ID":"67298510-55be-44bd-a0d7-0988939fdf66","Type":"ContainerStarted","Data":"62d1159be3c44465376b0f05a106a977008d76e8c2d5f8671a71f4fff79226ca"} Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.969468 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5v4sd" event={"ID":"67298510-55be-44bd-a0d7-0988939fdf66","Type":"ContainerStarted","Data":"5f33e42238f08455c1e44972d287d71f7ddfd14d288db688aa0dd624dda2d312"} Oct 09 07:48:31 crc kubenswrapper[4715]: I1009 07:48:31.971389 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-f9d7485db-5fdhg" event={"ID":"3c1c9983-60a8-4db2-866c-15deb7220cb9","Type":"ContainerStarted","Data":"7e1045475638b72714fbbd43e44560d678ccb4e0085a62d861bae5749886dc94"} Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.006637 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/915f6370-d5b2-4c9e-a1b1-c3146612b3ce-bound-sa-token\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.021863 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrlrc\" (UniqueName: \"kubernetes.io/projected/bdcbb990-46a3-4a26-a68c-ee9758ef1631-kube-api-access-xrlrc\") pod \"olm-operator-6b444d44fb-nz9jb\" (UID: \"bdcbb990-46a3-4a26-a68c-ee9758ef1631\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nz9jb" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.039759 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wkk9\" (UniqueName: \"kubernetes.io/projected/915f6370-d5b2-4c9e-a1b1-c3146612b3ce-kube-api-access-4wkk9\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.046102 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l6l2m" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.055921 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-btd5r" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.068496 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.068671 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-prhvm" Oct 09 07:48:32 crc kubenswrapper[4715]: E1009 07:48:32.070897 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:32.570870717 +0000 UTC m=+143.263674725 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.077151 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-b4lqp" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.078951 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5n5d\" (UniqueName: \"kubernetes.io/projected/b75a6e94-9a8f-4789-be04-be1dabfc37c7-kube-api-access-q5n5d\") pod \"downloads-7954f5f757-sq956\" (UID: \"b75a6e94-9a8f-4789-be04-be1dabfc37c7\") " pod="openshift-console/downloads-7954f5f757-sq956" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.080748 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-qxqln"] Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.089131 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp8m2\" (UniqueName: \"kubernetes.io/projected/0d58f8de-5bf0-4b20-937a-1dbd52ed512e-kube-api-access-rp8m2\") pod \"authentication-operator-69f744f599-hbzph\" (UID: \"0d58f8de-5bf0-4b20-937a-1dbd52ed512e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-hbzph" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.091201 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9grfz" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.107552 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-94mjv" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.107934 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nz9jb" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.116084 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58qzx\" (UniqueName: \"kubernetes.io/projected/3da117f6-b889-480f-b74b-5841bc551658-kube-api-access-58qzx\") pod \"apiserver-76f77b778f-xh68m\" (UID: \"3da117f6-b889-480f-b74b-5841bc551658\") " pod="openshift-apiserver/apiserver-76f77b778f-xh68m" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.147293 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4n7q\" (UniqueName: \"kubernetes.io/projected/f34e53e6-d25e-4619-8b73-8b9486c531eb-kube-api-access-z4n7q\") pod \"route-controller-manager-6576b87f9c-8mn46\" (UID: \"f34e53e6-d25e-4619-8b73-8b9486c531eb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mn46" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.149232 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whfwk\" (UniqueName: \"kubernetes.io/projected/3647d392-87e6-4708-a6f5-060e250a71ad-kube-api-access-whfwk\") pod \"package-server-manager-789f6589d5-sxdhp\" (UID: \"3647d392-87e6-4708-a6f5-060e250a71ad\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sxdhp" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.171674 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b913217-bda5-4526-903c-9cc2df2a4815-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-trqg4\" (UID: \"3b913217-bda5-4526-903c-9cc2df2a4815\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-trqg4" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.172985 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:32 crc kubenswrapper[4715]: E1009 07:48:32.174474 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:32.674455477 +0000 UTC m=+143.367259485 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.174861 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mn46" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.181172 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-hbzph" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.188030 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbr4p\" (UniqueName: \"kubernetes.io/projected/b5488b64-893e-49f7-9de1-99905faf0d3b-kube-api-access-rbr4p\") pod \"collect-profiles-29333265-qxg85\" (UID: \"b5488b64-893e-49f7-9de1-99905faf0d3b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333265-qxg85" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.197239 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-q5ck7"] Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.199041 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333265-qxg85" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.206458 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0bcd21f4-d2ef-4f93-8381-c85574a627e8-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sw6f9\" (UID: \"0bcd21f4-d2ef-4f93-8381-c85574a627e8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sw6f9" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.220618 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4zjq\" (UniqueName: \"kubernetes.io/projected/d4382581-9ebd-4fef-b530-1a6d32c65d61-kube-api-access-t4zjq\") pod \"kube-storage-version-migrator-operator-b67b599dd-brr9q\" (UID: \"d4382581-9ebd-4fef-b530-1a6d32c65d61\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-brr9q" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.231531 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-dfctz" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.246285 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8wd8\" (UniqueName: \"kubernetes.io/projected/19912eec-ab9f-4e07-8458-6867269f1a42-kube-api-access-f8wd8\") pod \"migrator-59844c95c7-hbn7p\" (UID: \"19912eec-ab9f-4e07-8458-6867269f1a42\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hbn7p" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.254065 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-xh68m" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.266166 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plnnz\" (UniqueName: \"kubernetes.io/projected/abd885f8-e479-48ec-9341-0acbbc3c3ea7-kube-api-access-plnnz\") pod \"service-ca-operator-777779d784-zrjcb\" (UID: \"abd885f8-e479-48ec-9341-0acbbc3c3ea7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zrjcb" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.267372 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-sq956" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.274385 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:32 crc kubenswrapper[4715]: E1009 07:48:32.274932 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:32.774912865 +0000 UTC m=+143.467716873 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.278394 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-trqg4" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.291263 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h879d\" (UniqueName: \"kubernetes.io/projected/76a49d30-4e29-483e-8837-f4cbcb919e06-kube-api-access-h879d\") pod \"dns-default-7wthp\" (UID: \"76a49d30-4e29-483e-8837-f4cbcb919e06\") " pod="openshift-dns/dns-default-7wthp" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.314347 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2ca5fc0f-ee77-4a25-985e-8b50fbe3ddf0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-w79tf\" (UID: \"2ca5fc0f-ee77-4a25-985e-8b50fbe3ddf0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w79tf" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.326964 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbk92\" (UniqueName: \"kubernetes.io/projected/8130c7b7-7b74-461e-8348-59345d86aa6b-kube-api-access-sbk92\") pod \"csi-hostpathplugin-575dw\" (UID: \"8130c7b7-7b74-461e-8348-59345d86aa6b\") " pod="hostpath-provisioner/csi-hostpathplugin-575dw" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.349223 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slfps\" (UniqueName: \"kubernetes.io/projected/41f83b9f-ebe1-42c4-ae44-36775e449efe-kube-api-access-slfps\") pod \"packageserver-d55dfcdfc-5vwlz\" (UID: \"41f83b9f-ebe1-42c4-ae44-36775e449efe\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vwlz" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.377580 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:32 crc kubenswrapper[4715]: E1009 07:48:32.377992 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:32.8779788 +0000 UTC m=+143.570782808 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.381186 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dll6k\" (UniqueName: \"kubernetes.io/projected/45129fb6-ace0-4181-b4ce-c5a7e6787606-kube-api-access-dll6k\") pod \"ingress-canary-m526v\" (UID: \"45129fb6-ace0-4181-b4ce-c5a7e6787606\") " pod="openshift-ingress-canary/ingress-canary-m526v" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.384611 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tt8j\" (UniqueName: \"kubernetes.io/projected/5f316b42-23b5-4041-9dc3-3b95676339e5-kube-api-access-6tt8j\") pod \"multus-admission-controller-857f4d67dd-4gncq\" (UID: \"5f316b42-23b5-4041-9dc3-3b95676339e5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4gncq" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.394837 4715 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l6l2m"] Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.406528 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf57q\" (UniqueName: \"kubernetes.io/projected/dec5cf82-ad38-452a-9330-cc685017bb8d-kube-api-access-vf57q\") pod \"machine-config-server-cjqqc\" (UID: \"dec5cf82-ad38-452a-9330-cc685017bb8d\") " pod="openshift-machine-config-operator/machine-config-server-cjqqc" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.425231 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sxdhp" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.429334 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72t4q\" (UniqueName: \"kubernetes.io/projected/9040c858-1f6b-4900-b34e-c8b0b0c4c1ec-kube-api-access-72t4q\") pod \"marketplace-operator-79b997595-hzbn9\" (UID: \"9040c858-1f6b-4900-b34e-c8b0b0c4c1ec\") " pod="openshift-marketplace/marketplace-operator-79b997595-hzbn9" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.437564 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hbn7p" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.452097 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-brr9q" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.452320 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-4gncq" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.456464 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6tj5\" (UniqueName: \"kubernetes.io/projected/03c8f335-ee7a-4f93-9a1f-47247090dffd-kube-api-access-h6tj5\") pod \"service-ca-9c57cc56f-qf4bm\" (UID: \"03c8f335-ee7a-4f93-9a1f-47247090dffd\") " pod="openshift-service-ca/service-ca-9c57cc56f-qf4bm" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.464220 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngfck\" (UniqueName: \"kubernetes.io/projected/2ca5fc0f-ee77-4a25-985e-8b50fbe3ddf0-kube-api-access-ngfck\") pod \"ingress-operator-5b745b69d9-w79tf\" (UID: \"2ca5fc0f-ee77-4a25-985e-8b50fbe3ddf0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w79tf" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.475233 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vwlz" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.478588 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:32 crc kubenswrapper[4715]: E1009 07:48:32.479297 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:32.979266252 +0000 UTC m=+143.672070260 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.479381 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sw6f9" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.484434 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-btd5r"] Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.491675 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-qf4bm" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.492089 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lq7p\" (UniqueName: \"kubernetes.io/projected/4005d046-0643-40a6-a748-8ecacb0f1541-kube-api-access-7lq7p\") pod \"catalog-operator-68c6474976-ttk8w\" (UID: \"4005d046-0643-40a6-a748-8ecacb0f1541\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ttk8w" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.510748 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w79tf" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.523334 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zrjcb" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.528004 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9grfz"] Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.530920 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hzbn9" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.545975 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-cjqqc" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.562393 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-575dw" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.573066 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7wthp" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.583312 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.583768 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-m526v" Oct 09 07:48:32 crc kubenswrapper[4715]: E1009 07:48:32.583797 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-09 07:48:33.08377989 +0000 UTC m=+143.776583898 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.602145 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-prhvm"] Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.684955 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:32 crc kubenswrapper[4715]: E1009 07:48:32.685374 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:33.18535869 +0000 UTC m=+143.878162698 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.765001 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ttk8w" Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.805450 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:32 crc kubenswrapper[4715]: E1009 07:48:32.805955 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:33.305941895 +0000 UTC m=+143.998745903 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.887011 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-94mjv"] Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.899899 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nz9jb"] Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.910180 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:32 crc kubenswrapper[4715]: E1009 07:48:32.910483 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:33.410453702 +0000 UTC m=+144.103257710 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:32 crc kubenswrapper[4715]: I1009 07:48:32.912477 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:32 crc kubenswrapper[4715]: E1009 07:48:32.912984 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:33.412964435 +0000 UTC m=+144.105768443 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:33 crc kubenswrapper[4715]: I1009 07:48:33.015108 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:33 crc kubenswrapper[4715]: E1009 07:48:33.015493 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:33.515476613 +0000 UTC m=+144.208280621 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:33 crc kubenswrapper[4715]: I1009 07:48:33.040614 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-lr9f9" event={"ID":"7d543303-0d6d-4c3d-bb4a-bb216d9def25","Type":"ContainerStarted","Data":"e4a32772fd892a8caf70eb607af3f24febcda36f7011c1a841d74346b7642e35"} Oct 09 07:48:33 crc kubenswrapper[4715]: I1009 07:48:33.047563 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-5fdhg" event={"ID":"3c1c9983-60a8-4db2-866c-15deb7220cb9","Type":"ContainerStarted","Data":"dd4a1f0625d6ec57d533abbb7d850dfe0a384df33a2835188c4ce243c37068d2"} Oct 09 07:48:33 crc kubenswrapper[4715]: I1009 07:48:33.063742 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-sq956"] Oct 09 07:48:33 crc kubenswrapper[4715]: I1009 07:48:33.069983 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z9cgk" event={"ID":"3f7b7054-dbf2-4878-9ded-127719d0afb3","Type":"ContainerStarted","Data":"740c33337089cb4c67c2cc95679264df9d6d546797fd7988b61a843e40fbc95b"} Oct 09 07:48:33 crc kubenswrapper[4715]: I1009 07:48:33.070055 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z9cgk" Oct 09 07:48:33 crc kubenswrapper[4715]: I1009 07:48:33.072069 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l6l2m" event={"ID":"0e32fe67-b32a-4fe6-869a-5fe2d4877352","Type":"ContainerStarted","Data":"8eeb710255fe79a76d5c13fdf4d89c929f115452e4ea5c8c8c666ab457d63398"} Oct 09 07:48:33 crc kubenswrapper[4715]: I1009 07:48:33.073406 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mn46"] Oct 09 07:48:33 crc kubenswrapper[4715]: I1009 07:48:33.079174 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-btd5r" event={"ID":"51a9931f-92cf-4ccd-a7c4-618ed079cb5b","Type":"ContainerStarted","Data":"b38dd0e51c5c5ee328765b50db655890a676e74e257b99c223b331935c91cad3"} Oct 09 07:48:33 crc kubenswrapper[4715]: I1009 07:48:33.079211 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9grfz" event={"ID":"f6483da2-54be-4754-9da9-7ad2af3788b3","Type":"ContainerStarted","Data":"a8e139ec387ec52b14a68c4325ac4256db39a51abf6705aa5917401048f9e830"} Oct 09 07:48:33 crc kubenswrapper[4715]: I1009 07:48:33.082289 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" event={"ID":"b24ee722-8046-4655-a354-4a25a9b16b6a","Type":"ContainerStarted","Data":"d1635420df7b0c74e79b0dd52c405e76917a494a39be592acb58e20267556b73"} Oct 09 07:48:33 crc kubenswrapper[4715]: I1009 07:48:33.087350 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-b4lqp" event={"ID":"18a93eea-f768-41ac-ae21-1d29a90f5f66","Type":"ContainerStarted","Data":"1a8395ec8a7d84acd5aeb61f5c59e6388990a94cb91add27de15ecc70819a45a"} Oct 09 07:48:33 crc kubenswrapper[4715]: E1009 07:48:33.136335 4715 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:33.636312425 +0000 UTC m=+144.329116433 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:33 crc kubenswrapper[4715]: I1009 07:48:33.135756 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:33 crc kubenswrapper[4715]: I1009 07:48:33.139555 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6gwtn" event={"ID":"3985b442-52af-4652-a129-de4aa904321f","Type":"ContainerStarted","Data":"42ea67a582431069fa757f287792e2fa52fc270f6ad47e9ae9b49f3c73ac5dc1"} Oct 09 07:48:33 crc kubenswrapper[4715]: I1009 07:48:33.141336 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qxqln" event={"ID":"731cd25c-cf3d-4428-a8bd-7aa00385de1e","Type":"ContainerStarted","Data":"38e7117e55d40b9c4167769270b4813a400bd813ad897795af4ea6361d7b8e89"} Oct 09 07:48:33 crc kubenswrapper[4715]: I1009 07:48:33.141375 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qxqln" event={"ID":"731cd25c-cf3d-4428-a8bd-7aa00385de1e","Type":"ContainerStarted","Data":"d5eec8e42e980d8657ddb9ae7387a374f016081f9a4371a47ec28815f80c2c4c"} Oct 09 07:48:33 crc kubenswrapper[4715]: I1009 07:48:33.143102 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-cjqqc" event={"ID":"dec5cf82-ad38-452a-9330-cc685017bb8d","Type":"ContainerStarted","Data":"936161a4c7cd1289b2e176db0e50d6332b8f9082d5152d387a82b6f356d0776b"} Oct 09 07:48:33 crc kubenswrapper[4715]: I1009 07:48:33.150667 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-w58s7" event={"ID":"b34aa1fd-b226-4e6d-8854-786cb7f5dc67","Type":"ContainerStarted","Data":"82081eaf00e336957763e5fa8ceec453c008e2a631a2c5710170feed158fc612"} Oct 09 07:48:33 crc kubenswrapper[4715]: I1009 07:48:33.150721 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-w58s7" event={"ID":"b34aa1fd-b226-4e6d-8854-786cb7f5dc67","Type":"ContainerStarted","Data":"71faf4bf24776709d0f21d3508bdca873e12c873df7e70f48f172dd62270b089"} Oct 09 07:48:33 crc kubenswrapper[4715]: I1009 07:48:33.153469 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-prhvm" event={"ID":"ad01aea7-211a-4ff5-b15b-fb696917dc52","Type":"ContainerStarted","Data":"ae8076f7bc629dafc85162ccb88842c27d7d35724aa65397944c08991e8baa9b"} Oct 09 07:48:33 crc kubenswrapper[4715]: I1009 07:48:33.204885 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-lr9f9" podStartSLOduration=122.204865693 podStartE2EDuration="2m2.204865693s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-10-09 07:48:33.203773241 +0000 UTC m=+143.896577249" watchObservedRunningTime="2025-10-09 07:48:33.204865693 +0000 UTC m=+143.897669701" Oct 09 07:48:33 crc kubenswrapper[4715]: I1009 07:48:33.205550 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-xf4mc" Oct 09 07:48:33 crc kubenswrapper[4715]: I1009 07:48:33.246555 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:33 crc kubenswrapper[4715]: E1009 07:48:33.278561 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:33.778412527 +0000 UTC m=+144.471216535 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:33 crc kubenswrapper[4715]: I1009 07:48:33.330285 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-5v4sd" podStartSLOduration=122.330266888 podStartE2EDuration="2m2.330266888s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:48:33.277257773 +0000 UTC m=+143.970061781" watchObservedRunningTime="2025-10-09 07:48:33.330266888 +0000 UTC m=+144.023070896" Oct 09 07:48:33 crc kubenswrapper[4715]: I1009 07:48:33.334210 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-dfctz"] Oct 09 07:48:33 crc kubenswrapper[4715]: I1009 07:48:33.335426 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-hbzph"] Oct 09 07:48:33 crc kubenswrapper[4715]: I1009 07:48:33.337704 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333265-qxg85"] Oct 09 07:48:33 crc kubenswrapper[4715]: I1009 07:48:33.362490 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-trqg4"] Oct 09 07:48:33 crc kubenswrapper[4715]: I1009 07:48:33.372812 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:33 crc kubenswrapper[4715]: E1009 07:48:33.373226 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:33.87320875 +0000 UTC m=+144.566012768 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:33 crc kubenswrapper[4715]: I1009 07:48:33.478352 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:33 crc kubenswrapper[4715]: E1009 07:48:33.479101 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:33.979081026 +0000 UTC m=+144.671885024 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:33 crc kubenswrapper[4715]: I1009 07:48:33.587949 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:33 crc kubenswrapper[4715]: E1009 07:48:33.588363 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:34.088346191 +0000 UTC m=+144.781150199 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:33 crc kubenswrapper[4715]: I1009 07:48:33.643193 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6t7zt" podStartSLOduration=122.643163859 podStartE2EDuration="2m2.643163859s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:48:33.640767169 +0000 UTC m=+144.333571187" watchObservedRunningTime="2025-10-09 07:48:33.643163859 +0000 UTC m=+144.335967867" Oct 09 07:48:33 crc kubenswrapper[4715]: I1009 07:48:33.696255 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:33 crc kubenswrapper[4715]: E1009 07:48:33.696497 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:34.196464473 +0000 UTC m=+144.889268481 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:33 crc kubenswrapper[4715]: I1009 07:48:33.696623 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:33 crc kubenswrapper[4715]: E1009 07:48:33.697075 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:34.19705173 +0000 UTC m=+144.889855738 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:33 crc kubenswrapper[4715]: I1009 07:48:33.798510 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:33 crc kubenswrapper[4715]: E1009 07:48:33.799633 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:34.299614799 +0000 UTC m=+144.992418807 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:33 crc kubenswrapper[4715]: I1009 07:48:33.872188 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sxdhp"] Oct 09 07:48:33 crc kubenswrapper[4715]: I1009 07:48:33.912404 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:33 crc kubenswrapper[4715]: E1009 07:48:33.912942 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:34.412904442 +0000 UTC m=+145.105708450 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:33 crc kubenswrapper[4715]: I1009 07:48:33.932264 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cjrbk" podStartSLOduration=122.932244795 podStartE2EDuration="2m2.932244795s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:48:33.882120164 +0000 UTC m=+144.574924172" watchObservedRunningTime="2025-10-09 07:48:33.932244795 +0000 UTC m=+144.625048803" Oct 09 07:48:33 crc kubenswrapper[4715]: I1009 07:48:33.933319 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-xf4mc" podStartSLOduration=122.933313207 podStartE2EDuration="2m2.933313207s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:48:33.930941297 +0000 UTC m=+144.623745305" watchObservedRunningTime="2025-10-09 07:48:33.933313207 +0000 UTC m=+144.626117215" Oct 09 07:48:33 crc kubenswrapper[4715]: I1009 07:48:33.976714 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xh68m"] Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.015292 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:34 crc kubenswrapper[4715]: E1009 07:48:34.015698 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:34.515680408 +0000 UTC m=+145.208484416 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.125178 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-5tfh2" podStartSLOduration=123.125156519 podStartE2EDuration="2m3.125156519s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:48:34.124043726 +0000 UTC m=+144.816847734" watchObservedRunningTime="2025-10-09 07:48:34.125156519 +0000 UTC m=+144.817960527" Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.131307 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-brr9q"] Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.133050 4715 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:34 crc kubenswrapper[4715]: E1009 07:48:34.140930 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:34.640890007 +0000 UTC m=+145.333694015 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.224705 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-5fdhg" podStartSLOduration=123.22467028 podStartE2EDuration="2m3.22467028s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:48:34.163212198 +0000 UTC m=+144.856016206" watchObservedRunningTime="2025-10-09 07:48:34.22467028 +0000 UTC m=+144.917474288" Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.267211 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:34 crc kubenswrapper[4715]: E1009 07:48:34.267670 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:34.767647132 +0000 UTC m=+145.460451140 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.279803 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vwlz"] Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.279851 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-sq956" event={"ID":"b75a6e94-9a8f-4789-be04-be1dabfc37c7","Type":"ContainerStarted","Data":"d9910e76efa674fce5858655e9a2fc1c66220c21429a1cc125e09a8ef3cb89dc"} Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.280022 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" event={"ID":"b24ee722-8046-4655-a354-4a25a9b16b6a","Type":"ContainerStarted","Data":"05d59aff999dd0ffe788c77e9aa8159ca10b4c47090b11f9b8d124f62c16d491"} Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.280828 4715 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.302918 4715 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-q5ck7 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" start-of-body= Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.302958 4715 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" podUID="b24ee722-8046-4655-a354-4a25a9b16b6a" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.306872 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333265-qxg85" event={"ID":"b5488b64-893e-49f7-9de1-99905faf0d3b","Type":"ContainerStarted","Data":"81695112128c47caf200c552442949a79c2ac56ac1dec50c669cb4bfbd24f596"} Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.310981 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z9cgk" podStartSLOduration=123.310965965 podStartE2EDuration="2m3.310965965s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:48:34.308007449 +0000 UTC m=+145.000811457" watchObservedRunningTime="2025-10-09 07:48:34.310965965 +0000 UTC m=+145.003769983" Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.334991 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sxdhp" 
event={"ID":"3647d392-87e6-4708-a6f5-060e250a71ad","Type":"ContainerStarted","Data":"ce901ec4ae2dc2aec3972a37108ff2528a55c2d831a4f8cfdf7a0972d2b7b797"} Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.359727 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-trqg4" event={"ID":"3b913217-bda5-4526-903c-9cc2df2a4815","Type":"ContainerStarted","Data":"7c6877526e542bbfbe17bc86f72811f0de19d33e17eebcbd6d37f10d55db7c97"} Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.369044 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:34 crc kubenswrapper[4715]: E1009 07:48:34.383357 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:34.883334695 +0000 UTC m=+145.576138703 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.406843 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l6l2m" event={"ID":"0e32fe67-b32a-4fe6-869a-5fe2d4877352","Type":"ContainerStarted","Data":"38c28b2fbc72d2017aa1a3bb254abdddad3bef3cd7cb91998d4ff235c5da3e48"} Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.411812 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" podStartSLOduration=123.411784914 podStartE2EDuration="2m3.411784914s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:48:34.410811846 +0000 UTC m=+145.103615854" watchObservedRunningTime="2025-10-09 07:48:34.411784914 +0000 UTC m=+145.104588922" Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.422162 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-hbzph" event={"ID":"0d58f8de-5bf0-4b20-937a-1dbd52ed512e","Type":"ContainerStarted","Data":"6554ad083a5eae1aa5f542cc75de97ead569ce222224dde0be6de7b6bff2beba"} Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.443748 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-brr9q" 
event={"ID":"d4382581-9ebd-4fef-b530-1a6d32c65d61","Type":"ContainerStarted","Data":"1072162302e5bc1330f10c54f42384d5966447d430d7bb665f8d004923adecf2"} Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.454495 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-4gncq"] Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.458014 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mn46" event={"ID":"f34e53e6-d25e-4619-8b73-8b9486c531eb","Type":"ContainerStarted","Data":"aa6f34a7d3fa206154269554e3bc8d8ace43d9e7f459eab8e803ff51ea7defb6"} Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.458813 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mn46" Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.460678 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-94mjv" event={"ID":"5eb07619-4575-4662-afd0-58a658ebac12","Type":"ContainerStarted","Data":"dac8705bd95036c1bf7a854bf8164d7a8a76aeaa2cae30c5970044f39d6cc185"} Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.461009 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6gwtn" podStartSLOduration=123.460999348 podStartE2EDuration="2m3.460999348s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:48:34.441402487 +0000 UTC m=+145.134206495" watchObservedRunningTime="2025-10-09 07:48:34.460999348 +0000 UTC m=+145.153803356" Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.465165 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-service-ca/service-ca-9c57cc56f-qf4bm"] Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.466515 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-l6l2m" podStartSLOduration=123.466506119 podStartE2EDuration="2m3.466506119s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:48:34.457376093 +0000 UTC m=+145.150180101" watchObservedRunningTime="2025-10-09 07:48:34.466506119 +0000 UTC m=+145.159310127" Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.471230 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:34 crc kubenswrapper[4715]: E1009 07:48:34.471454 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:34.971428313 +0000 UTC m=+145.664232321 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.471704 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:34 crc kubenswrapper[4715]: E1009 07:48:34.472239 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:34.972222356 +0000 UTC m=+145.665026364 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.479284 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-b4lqp" event={"ID":"18a93eea-f768-41ac-ae21-1d29a90f5f66","Type":"ContainerStarted","Data":"1d1568ea95ee684c08e252028f84359395d403e78eec66ee1759569ed1e343c0"} Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.480922 4715 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-8mn46 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.481030 4715 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mn46" podUID="f34e53e6-d25e-4619-8b73-8b9486c531eb" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.499781 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nz9jb" event={"ID":"bdcbb990-46a3-4a26-a68c-ee9758ef1631","Type":"ContainerStarted","Data":"52240d094236e7dd03524859faae5a6f03589bb8aaa99d9d5c5236183a772123"} Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.532772 4715 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-btd5r" event={"ID":"51a9931f-92cf-4ccd-a7c4-618ed079cb5b","Type":"ContainerStarted","Data":"4736b9890bb3d384c2775765ff2ee41ad1bfdf08f38483146ceda70d8e5b14c8"} Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.540281 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xh68m" event={"ID":"3da117f6-b889-480f-b74b-5841bc551658","Type":"ContainerStarted","Data":"dacf41a0019ef4b99f21a37270e18d994f26caeb8fc2d25248a74ffe3baae753"} Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.555308 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mn46" podStartSLOduration=123.555286777 podStartE2EDuration="2m3.555286777s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:48:34.505479795 +0000 UTC m=+145.198283803" watchObservedRunningTime="2025-10-09 07:48:34.555286777 +0000 UTC m=+145.248090785" Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.558930 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-dfctz" event={"ID":"2f04071f-b72a-4234-92e3-1cd5e6987a58","Type":"ContainerStarted","Data":"80980bb812dc8763a2d8f1350a8d6b4e8d9c202ace6e7f74e924936e1c1d2dd1"} Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.573040 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:34 crc kubenswrapper[4715]: E1009 
07:48:34.574024 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:35.073986502 +0000 UTC m=+145.766790520 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.598411 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-b4lqp" podStartSLOduration=123.598384903 podStartE2EDuration="2m3.598384903s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:48:34.563257969 +0000 UTC m=+145.256061987" watchObservedRunningTime="2025-10-09 07:48:34.598384903 +0000 UTC m=+145.291188911" Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.603586 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-zrjcb"] Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.620061 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-btd5r" podStartSLOduration=123.620026734 podStartE2EDuration="2m3.620026734s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:48:34.606916812 +0000 UTC m=+145.299720820" watchObservedRunningTime="2025-10-09 07:48:34.620026734 +0000 UTC m=+145.312830742" Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.635241 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-m526v"] Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.678557 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:34 crc kubenswrapper[4715]: E1009 07:48:34.683152 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:35.183127424 +0000 UTC m=+145.875931432 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.723119 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-w79tf"] Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.732960 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-dfctz" podStartSLOduration=123.732932666 podStartE2EDuration="2m3.732932666s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:48:34.658607229 +0000 UTC m=+145.351411237" watchObservedRunningTime="2025-10-09 07:48:34.732932666 +0000 UTC m=+145.425736674" Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.802188 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-575dw"] Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.809185 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:34 crc kubenswrapper[4715]: E1009 07:48:34.809526 4715 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:35.309509248 +0000 UTC m=+146.002313256 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.820532 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7wthp"] Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.824235 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-hbn7p"] Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.831471 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hzbn9"] Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.831616 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sw6f9"] Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.836854 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ttk8w"] Oct 09 07:48:34 crc kubenswrapper[4715]: W1009 07:48:34.898492 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8130c7b7_7b74_461e_8348_59345d86aa6b.slice/crio-17c1c0c6b0433cf7cdf1222abf1a28180ab61acc6dd962af5da542ec5b047e6a WatchSource:0}: 
Error finding container 17c1c0c6b0433cf7cdf1222abf1a28180ab61acc6dd962af5da542ec5b047e6a: Status 404 returned error can't find the container with id 17c1c0c6b0433cf7cdf1222abf1a28180ab61acc6dd962af5da542ec5b047e6a Oct 09 07:48:34 crc kubenswrapper[4715]: I1009 07:48:34.910092 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:34 crc kubenswrapper[4715]: E1009 07:48:34.910434 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:35.410404979 +0000 UTC m=+146.103208987 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.011611 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:35 crc kubenswrapper[4715]: E1009 07:48:35.012447 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:35.512403402 +0000 UTC m=+146.205207420 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.082000 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-b4lqp" Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.094684 4715 patch_prober.go:28] interesting pod/router-default-5444994796-b4lqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 07:48:35 crc kubenswrapper[4715]: [-]has-synced failed: reason withheld Oct 09 07:48:35 crc kubenswrapper[4715]: [+]process-running ok Oct 09 07:48:35 crc kubenswrapper[4715]: healthz check failed Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.094739 4715 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-b4lqp" podUID="18a93eea-f768-41ac-ae21-1d29a90f5f66" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.121171 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:35 crc kubenswrapper[4715]: E1009 07:48:35.121666 4715 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:35.621650007 +0000 UTC m=+146.314454015 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.223292 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:35 crc kubenswrapper[4715]: E1009 07:48:35.223548 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:35.723514796 +0000 UTC m=+146.416318804 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.223642 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:35 crc kubenswrapper[4715]: E1009 07:48:35.223994 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:35.72398092 +0000 UTC m=+146.416784918 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.326372 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:35 crc kubenswrapper[4715]: E1009 07:48:35.326953 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:35.82691621 +0000 UTC m=+146.519720218 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.327232 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:35 crc kubenswrapper[4715]: E1009 07:48:35.327701 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:35.827693423 +0000 UTC m=+146.520497431 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.431683 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:35 crc kubenswrapper[4715]: E1009 07:48:35.431870 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:35.931843609 +0000 UTC m=+146.624647617 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.435150 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:35 crc kubenswrapper[4715]: E1009 07:48:35.435638 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:35.935624229 +0000 UTC m=+146.628428237 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.543008 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:35 crc kubenswrapper[4715]: E1009 07:48:35.543431 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:36.043400361 +0000 UTC m=+146.736204369 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.603562 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mn46" event={"ID":"f34e53e6-d25e-4619-8b73-8b9486c531eb","Type":"ContainerStarted","Data":"5f0fb99106596a96946e38fafce84718e93b176979c4afc8587c1f0d95a34a55"} Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.604751 4715 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-8mn46 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.604812 4715 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mn46" podUID="f34e53e6-d25e-4619-8b73-8b9486c531eb" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.607373 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ttk8w" event={"ID":"4005d046-0643-40a6-a748-8ecacb0f1541","Type":"ContainerStarted","Data":"6443a82fc3eec5a762b348da4259f5674e8de1eda03d141238b298c631c19848"} Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 
07:48:35.633222 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-94mjv" event={"ID":"5eb07619-4575-4662-afd0-58a658ebac12","Type":"ContainerStarted","Data":"3ddc347cbc2afa0bf3ef1a7191311a2f1ef6a7e173c5b7e334ff0ec2d551820f"} Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.633275 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-94mjv" event={"ID":"5eb07619-4575-4662-afd0-58a658ebac12","Type":"ContainerStarted","Data":"9e9029a2d7d421249bbb9640a66982a2888ffb3e436ac1cc9c1244fbfcdc92e3"} Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.649329 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:35 crc kubenswrapper[4715]: E1009 07:48:35.649677 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:36.149664768 +0000 UTC m=+146.842468766 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.656322 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sxdhp" event={"ID":"3647d392-87e6-4708-a6f5-060e250a71ad","Type":"ContainerStarted","Data":"efccb183ae5791cf60d986a1b82659d7d8fdb7269a17168e9b5da25b6b913bb1"} Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.656366 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sxdhp" event={"ID":"3647d392-87e6-4708-a6f5-060e250a71ad","Type":"ContainerStarted","Data":"046972206b398f6fdf45de705233089ea47093bd8376dbe98cf98e90478e460d"} Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.657092 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sxdhp" Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.680598 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-cjqqc" event={"ID":"dec5cf82-ad38-452a-9330-cc685017bb8d","Type":"ContainerStarted","Data":"ab66e48d4a8b28d95072498f7d4869588a26b1d26e32d0d3a16fe7f6dd0dd406"} Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.682895 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-94mjv" podStartSLOduration=124.682876076 
podStartE2EDuration="2m4.682876076s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:48:35.682154135 +0000 UTC m=+146.374958153" watchObservedRunningTime="2025-10-09 07:48:35.682876076 +0000 UTC m=+146.375680084" Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.709505 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-575dw" event={"ID":"8130c7b7-7b74-461e-8348-59345d86aa6b","Type":"ContainerStarted","Data":"17c1c0c6b0433cf7cdf1222abf1a28180ab61acc6dd962af5da542ec5b047e6a"} Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.712110 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sxdhp" podStartSLOduration=124.712093628 podStartE2EDuration="2m4.712093628s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:48:35.711706207 +0000 UTC m=+146.404510225" watchObservedRunningTime="2025-10-09 07:48:35.712093628 +0000 UTC m=+146.404897636" Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.745342 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-cjqqc" podStartSLOduration=6.745322807 podStartE2EDuration="6.745322807s" podCreationTimestamp="2025-10-09 07:48:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:48:35.744520613 +0000 UTC m=+146.437324621" watchObservedRunningTime="2025-10-09 07:48:35.745322807 +0000 UTC m=+146.438126815" Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.746613 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sw6f9" event={"ID":"0bcd21f4-d2ef-4f93-8381-c85574a627e8","Type":"ContainerStarted","Data":"90d77633b5d873480e6f22ef7de446f9bcd8051cb3987e6971087f6bf72b4166"} Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.750830 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:35 crc kubenswrapper[4715]: E1009 07:48:35.757894 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:36.257864052 +0000 UTC m=+146.950668120 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.765122 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-m526v" event={"ID":"45129fb6-ace0-4181-b4ce-c5a7e6787606","Type":"ContainerStarted","Data":"ff269a5e51f22045c78d949f1776d75705adb4060b83790bda6effe5faad025a"} Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.765189 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-m526v" event={"ID":"45129fb6-ace0-4181-b4ce-c5a7e6787606","Type":"ContainerStarted","Data":"75a8d8e60f8e129124627a66952b8558e272e191ef37f5ad22ba07f9e0aa16ae"} Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.780234 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-brr9q" event={"ID":"d4382581-9ebd-4fef-b530-1a6d32c65d61","Type":"ContainerStarted","Data":"3c1c69da132b6864d0551cea0238928ba16bb6c8d46f7b078933e267f681716e"} Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.813225 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-m526v" podStartSLOduration=6.813185545 podStartE2EDuration="6.813185545s" podCreationTimestamp="2025-10-09 07:48:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:48:35.810140186 +0000 UTC m=+146.502944194" 
watchObservedRunningTime="2025-10-09 07:48:35.813185545 +0000 UTC m=+146.505989553" Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.834163 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-prhvm" event={"ID":"ad01aea7-211a-4ff5-b15b-fb696917dc52","Type":"ContainerStarted","Data":"f997a27889bd7ae442d3b776f865afba0c284677b51b8425271634f6e8ffb124"} Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.836581 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-brr9q" podStartSLOduration=124.836557486 podStartE2EDuration="2m4.836557486s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:48:35.834995911 +0000 UTC m=+146.527799919" watchObservedRunningTime="2025-10-09 07:48:35.836557486 +0000 UTC m=+146.529361494" Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.842698 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-4gncq" event={"ID":"5f316b42-23b5-4041-9dc3-3b95676339e5","Type":"ContainerStarted","Data":"f9481b411e0980b84e9677857aa2e9771ad6ac4b2a7c129a8c443c3392fbda88"} Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.857740 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:35 crc kubenswrapper[4715]: E1009 07:48:35.858274 4715 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:36.358261579 +0000 UTC m=+147.051065587 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.865410 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qxqln" event={"ID":"731cd25c-cf3d-4428-a8bd-7aa00385de1e","Type":"ContainerStarted","Data":"bfd20720bf7a7df374a1c36d4e946b417f36199bdb26336fb76a7f93442117a6"} Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.909949 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6gwtn" Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.910010 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6gwtn" Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.933586 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hbn7p" event={"ID":"19912eec-ab9f-4e07-8458-6867269f1a42","Type":"ContainerStarted","Data":"2ed2d7d82b78b7bcb375ef5075bb14ba5bf9833c431d025544c3e1752f069739"} Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.939561 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-hbzph" 
event={"ID":"0d58f8de-5bf0-4b20-937a-1dbd52ed512e","Type":"ContainerStarted","Data":"2630dbe4f289744cb4c5d9e3b703a1837a7849cd3b9ba837f525a1926d4efe99"} Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.940073 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-prhvm" podStartSLOduration=124.940043543 podStartE2EDuration="2m4.940043543s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:48:35.879039925 +0000 UTC m=+146.571843933" watchObservedRunningTime="2025-10-09 07:48:35.940043543 +0000 UTC m=+146.632847561" Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.942464 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qxqln" podStartSLOduration=124.942409032 podStartE2EDuration="2m4.942409032s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:48:35.941047702 +0000 UTC m=+146.633851710" watchObservedRunningTime="2025-10-09 07:48:35.942409032 +0000 UTC m=+146.635213040" Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.945567 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7wthp" event={"ID":"76a49d30-4e29-483e-8837-f4cbcb919e06","Type":"ContainerStarted","Data":"ad437cc456dda81336eaadca053a6b5032ac326567a748f24aff316aab822dab"} Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.946740 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-trqg4" 
event={"ID":"3b913217-bda5-4526-903c-9cc2df2a4815","Type":"ContainerStarted","Data":"07a24ac22379e828d48e4770aadd7a9ff01b519c7805f29559b9ba0a55a9aa8e"} Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.949976 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-dfctz" event={"ID":"2f04071f-b72a-4234-92e3-1cd5e6987a58","Type":"ContainerStarted","Data":"1e518bfb4306fb94e840b16dba72a0310adb115a9db4c7936179e96486cbc614"} Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.950620 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-dfctz" Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.952587 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-w58s7" event={"ID":"b34aa1fd-b226-4e6d-8854-786cb7f5dc67","Type":"ContainerStarted","Data":"4ddecc79a8653b9e121376dfbcdd029a3e2c6df9cd334055a11fdbb3a7eb4423"} Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.969789 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6gwtn" Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.970458 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:35 crc kubenswrapper[4715]: E1009 07:48:35.972376 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-09 07:48:36.472358715 +0000 UTC m=+147.165162723 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.985459 4715 patch_prober.go:28] interesting pod/console-operator-58897d9998-dfctz container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/readyz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Oct 09 07:48:35 crc kubenswrapper[4715]: I1009 07:48:35.985542 4715 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-dfctz" podUID="2f04071f-b72a-4234-92e3-1cd5e6987a58" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/readyz\": dial tcp 10.217.0.24:8443: connect: connection refused" Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.003039 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zrjcb" event={"ID":"abd885f8-e479-48ec-9341-0acbbc3c3ea7","Type":"ContainerStarted","Data":"7df7d8556b0dacecddcb77185092ed9138a22428014d262442c654130eccff9f"} Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.003594 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zrjcb" event={"ID":"abd885f8-e479-48ec-9341-0acbbc3c3ea7","Type":"ContainerStarted","Data":"be5f20929c583548891148d985cd62ad610a4c3166929c87ef4116bbc15cce08"} Oct 
09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.014345 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-hbzph" podStartSLOduration=125.014318228 podStartE2EDuration="2m5.014318228s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:48:35.982449589 +0000 UTC m=+146.675253597" watchObservedRunningTime="2025-10-09 07:48:36.014318228 +0000 UTC m=+146.707122236" Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.015243 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nz9jb" event={"ID":"bdcbb990-46a3-4a26-a68c-ee9758ef1631","Type":"ContainerStarted","Data":"c441185aceab79c6b32ef1aabb753b6c4fe645d5f20142e064a9d74ec936100e"} Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.016128 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nz9jb" Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.050916 4715 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-nz9jb container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.050986 4715 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nz9jb" podUID="bdcbb990-46a3-4a26-a68c-ee9758ef1631" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.068715 4715 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-w58s7" podStartSLOduration=125.068687163 podStartE2EDuration="2m5.068687163s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:48:36.044807547 +0000 UTC m=+146.737611545" watchObservedRunningTime="2025-10-09 07:48:36.068687163 +0000 UTC m=+146.761491171" Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.082646 4715 patch_prober.go:28] interesting pod/router-default-5444994796-b4lqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 07:48:36 crc kubenswrapper[4715]: [-]has-synced failed: reason withheld Oct 09 07:48:36 crc kubenswrapper[4715]: [+]process-running ok Oct 09 07:48:36 crc kubenswrapper[4715]: healthz check failed Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.082714 4715 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-b4lqp" podUID="18a93eea-f768-41ac-ae21-1d29a90f5f66" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.083936 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vwlz" event={"ID":"41f83b9f-ebe1-42c4-ae44-36775e449efe","Type":"ContainerStarted","Data":"ec42d8cccd5912c38313cd39a5565af89a17e2c7d07f1843f1e6ce4fbe66bd39"} Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.083998 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vwlz" 
event={"ID":"41f83b9f-ebe1-42c4-ae44-36775e449efe","Type":"ContainerStarted","Data":"09d92dd9965f5338de2932a46b3df09ace21d328d62f52e6dcee20cb44428896"} Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.085202 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vwlz" Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.105474 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:36 crc kubenswrapper[4715]: E1009 07:48:36.108178 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:36.608163073 +0000 UTC m=+147.300967071 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.126414 4715 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-5vwlz container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:5443/healthz\": dial tcp 10.217.0.37:5443: connect: connection refused" start-of-body= Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.126491 4715 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vwlz" podUID="41f83b9f-ebe1-42c4-ae44-36775e449efe" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.37:5443/healthz\": dial tcp 10.217.0.37:5443: connect: connection refused" Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.128758 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333265-qxg85" event={"ID":"b5488b64-893e-49f7-9de1-99905faf0d3b","Type":"ContainerStarted","Data":"dbdebe7fb6b1868bd19733b1a993714d1e80e69364e9cbbf110eadb09b967c0a"} Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.148071 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-trqg4" podStartSLOduration=125.148045446 podStartE2EDuration="2m5.148045446s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-10-09 07:48:36.09329218 +0000 UTC m=+146.786096198" watchObservedRunningTime="2025-10-09 07:48:36.148045446 +0000 UTC m=+146.840849454" Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.167534 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-sq956" Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.167578 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-sq956" event={"ID":"b75a6e94-9a8f-4789-be04-be1dabfc37c7","Type":"ContainerStarted","Data":"2620f1b863d265c04424115612e66361b678b36e663d537407ae4f217dc6a961"} Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.172683 4715 generic.go:334] "Generic (PLEG): container finished" podID="3da117f6-b889-480f-b74b-5841bc551658" containerID="4f12a527f8821a4b0533872931938b3042b0ceb3fb2de547869aa8eaa6492124" exitCode=0 Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.173633 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xh68m" event={"ID":"3da117f6-b889-480f-b74b-5841bc551658","Type":"ContainerDied","Data":"4f12a527f8821a4b0533872931938b3042b0ceb3fb2de547869aa8eaa6492124"} Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.187493 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nz9jb" podStartSLOduration=125.187476045 podStartE2EDuration="2m5.187476045s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:48:36.186723253 +0000 UTC m=+146.879527261" watchObservedRunningTime="2025-10-09 07:48:36.187476045 +0000 UTC m=+146.880280053" Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.201272 4715 patch_prober.go:28] interesting pod/downloads-7954f5f757-sq956 
container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.202041 4715 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-sq956" podUID="b75a6e94-9a8f-4789-be04-be1dabfc37c7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.207409 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29333265-qxg85" podStartSLOduration=125.207386126 podStartE2EDuration="2m5.207386126s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:48:36.203349548 +0000 UTC m=+146.896153556" watchObservedRunningTime="2025-10-09 07:48:36.207386126 +0000 UTC m=+146.900190134" Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.209649 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:36 crc kubenswrapper[4715]: E1009 07:48:36.210595 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:36.710562428 +0000 UTC m=+147.403366436 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.228657 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w79tf" event={"ID":"2ca5fc0f-ee77-4a25-985e-8b50fbe3ddf0","Type":"ContainerStarted","Data":"39cebf34dd6db42a6c70b7a8068900fdf7dbeb6749bee3862208d2cfddb84ffa"} Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.246116 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vwlz" podStartSLOduration=125.246085314 podStartE2EDuration="2m5.246085314s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:48:36.232522819 +0000 UTC m=+146.925326827" watchObservedRunningTime="2025-10-09 07:48:36.246085314 +0000 UTC m=+146.938889322" Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.258995 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zrjcb" podStartSLOduration=125.25896893 podStartE2EDuration="2m5.25896893s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:48:36.256742965 +0000 UTC m=+146.949546973" watchObservedRunningTime="2025-10-09 07:48:36.25896893 +0000 UTC m=+146.951772938" Oct 09 07:48:36 
crc kubenswrapper[4715]: I1009 07:48:36.272167 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-qf4bm" event={"ID":"03c8f335-ee7a-4f93-9a1f-47247090dffd","Type":"ContainerStarted","Data":"bc57295dde20e3f1b3cb07bf31b8617dcf2f071c38db8bab610c8898da55caf8"} Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.272647 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-qf4bm" event={"ID":"03c8f335-ee7a-4f93-9a1f-47247090dffd","Type":"ContainerStarted","Data":"55abd3930e1bb859f9b16d156c28cbef674149769f77b4f20ecebc156c93d1c6"} Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.286642 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hzbn9" event={"ID":"9040c858-1f6b-4900-b34e-c8b0b0c4c1ec","Type":"ContainerStarted","Data":"c8339ebe5a987752bad6b497baef860041697998111487689f00920b5a239b8f"} Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.290003 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-hzbn9" Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.296640 4715 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-hzbn9 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.296710 4715 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-hzbn9" podUID="9040c858-1f6b-4900-b34e-c8b0b0c4c1ec" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.297944 4715 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-sq956" podStartSLOduration=125.297931825 podStartE2EDuration="2m5.297931825s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:48:36.297268066 +0000 UTC m=+146.990072084" watchObservedRunningTime="2025-10-09 07:48:36.297931825 +0000 UTC m=+146.990735833" Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.303217 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9grfz" event={"ID":"f6483da2-54be-4754-9da9-7ad2af3788b3","Type":"ContainerStarted","Data":"d6a76535754a525770188a95b0d7e5f7ec223744fb571541a874f69f466e6982"} Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.311990 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:36 crc kubenswrapper[4715]: E1009 07:48:36.314191 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:36.814175079 +0000 UTC m=+147.506979087 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.321137 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6gwtn" Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.354977 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-qf4bm" podStartSLOduration=125.354951477 podStartE2EDuration="2m5.354951477s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:48:36.351174267 +0000 UTC m=+147.043978275" watchObservedRunningTime="2025-10-09 07:48:36.354951477 +0000 UTC m=+147.047755475" Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.381437 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-hzbn9" podStartSLOduration=125.381372718 podStartE2EDuration="2m5.381372718s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:48:36.379228275 +0000 UTC m=+147.072032283" watchObservedRunningTime="2025-10-09 07:48:36.381372718 +0000 UTC m=+147.074176746" Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.414746 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:36 crc kubenswrapper[4715]: E1009 07:48:36.416219 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:36.916177692 +0000 UTC m=+147.608981700 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.451346 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9grfz" podStartSLOduration=125.451321937 podStartE2EDuration="2m5.451321937s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:48:36.405848021 +0000 UTC m=+147.098652029" watchObservedRunningTime="2025-10-09 07:48:36.451321937 +0000 UTC m=+147.144125945" Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.518300 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:36 crc kubenswrapper[4715]: E1009 07:48:36.518967 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:37.018954938 +0000 UTC m=+147.711758946 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.620076 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:36 crc kubenswrapper[4715]: E1009 07:48:36.620340 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:37.120289252 +0000 UTC m=+147.813093250 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.620813 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:36 crc kubenswrapper[4715]: E1009 07:48:36.621299 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:37.121290931 +0000 UTC m=+147.814094939 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.674749 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-z9cgk" Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.722532 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:36 crc kubenswrapper[4715]: E1009 07:48:36.722766 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:37.222724478 +0000 UTC m=+147.915528486 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.723308 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:36 crc kubenswrapper[4715]: E1009 07:48:36.723862 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:37.22383355 +0000 UTC m=+147.916637558 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.826065 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:36 crc kubenswrapper[4715]: E1009 07:48:36.826443 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:37.326407949 +0000 UTC m=+148.019211957 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:36 crc kubenswrapper[4715]: I1009 07:48:36.928101 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:36 crc kubenswrapper[4715]: E1009 07:48:36.928657 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:37.428633619 +0000 UTC m=+148.121437627 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.029750 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:37 crc kubenswrapper[4715]: E1009 07:48:37.029959 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:37.529920012 +0000 UTC m=+148.222724020 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.030103 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:37 crc kubenswrapper[4715]: E1009 07:48:37.030522 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:37.530513619 +0000 UTC m=+148.223317627 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.088477 4715 patch_prober.go:28] interesting pod/router-default-5444994796-b4lqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 07:48:37 crc kubenswrapper[4715]: [-]has-synced failed: reason withheld Oct 09 07:48:37 crc kubenswrapper[4715]: [+]process-running ok Oct 09 07:48:37 crc kubenswrapper[4715]: healthz check failed Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.088812 4715 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-b4lqp" podUID="18a93eea-f768-41ac-ae21-1d29a90f5f66" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.131152 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:37 crc kubenswrapper[4715]: E1009 07:48:37.131703 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-09 07:48:37.631685028 +0000 UTC m=+148.324489036 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.205608 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.232846 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:37 crc kubenswrapper[4715]: E1009 07:48:37.233292 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:37.733270849 +0000 UTC m=+148.426075048 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.322537 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hbn7p" event={"ID":"19912eec-ab9f-4e07-8458-6867269f1a42","Type":"ContainerStarted","Data":"89155e914eaab83637ea868c00ed983175c71806efb0502513218de1d1e605d4"} Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.322595 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hbn7p" event={"ID":"19912eec-ab9f-4e07-8458-6867269f1a42","Type":"ContainerStarted","Data":"b5604cf49a5929931cd26d0d350ee43fa21a92859c4ed9d730370c109befac37"} Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.326410 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xh68m" event={"ID":"3da117f6-b889-480f-b74b-5841bc551658","Type":"ContainerStarted","Data":"1b2e59719c736734ca8215b1efa82657f82c1b490e151f8960b15d68be7d91c1"} Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.326480 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xh68m" event={"ID":"3da117f6-b889-480f-b74b-5841bc551658","Type":"ContainerStarted","Data":"a07028f48582ef4c7474c4524a04a800f4b30f320ca8327f87fa20045271f5fa"} Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.328405 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hzbn9" 
event={"ID":"9040c858-1f6b-4900-b34e-c8b0b0c4c1ec","Type":"ContainerStarted","Data":"8bc7e98303a37386f38b308d46945b4d4e2702ff4ae3a09783fb2ea83ef8a388"} Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.329450 4715 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-hzbn9 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.329515 4715 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-hzbn9" podUID="9040c858-1f6b-4900-b34e-c8b0b0c4c1ec" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.330398 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-575dw" event={"ID":"8130c7b7-7b74-461e-8348-59345d86aa6b","Type":"ContainerStarted","Data":"778aba8d550639ffe7a9841715a1943decc16cebfb241ab2b9ac4d947241081a"} Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.333863 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:37 crc kubenswrapper[4715]: E1009 07:48:37.334074 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-09 07:48:37.834033757 +0000 UTC m=+148.526837765 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.334173 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:37 crc kubenswrapper[4715]: E1009 07:48:37.334586 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:37.834568742 +0000 UTC m=+148.527372760 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.337145 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sw6f9" event={"ID":"0bcd21f4-d2ef-4f93-8381-c85574a627e8","Type":"ContainerStarted","Data":"e6b463611fe3dd5b9389ea2f3f6f6be823e063b89230805f087c5ad3dc8d28dc"} Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.340195 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ttk8w" event={"ID":"4005d046-0643-40a6-a748-8ecacb0f1541","Type":"ContainerStarted","Data":"89e2f36b021a7de39f8e72f94f3ceebe0d26d6cd3790c3c2c9038236bffa4752"} Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.340481 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ttk8w" Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.344149 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7wthp" event={"ID":"76a49d30-4e29-483e-8837-f4cbcb919e06","Type":"ContainerStarted","Data":"cbb035bf2de7413319011c156b2ae2fa4b8c83052a74dc636626fd5419a68ecc"} Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.344191 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7wthp" event={"ID":"76a49d30-4e29-483e-8837-f4cbcb919e06","Type":"ContainerStarted","Data":"5a7e2af0f3597fecf9b76a25efbbb617d06faf4337b2e2343643697390bd68e3"} Oct 09 
07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.344522 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-7wthp" Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.347741 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hbn7p" podStartSLOduration=126.347720516 podStartE2EDuration="2m6.347720516s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:48:37.345045508 +0000 UTC m=+148.037849516" watchObservedRunningTime="2025-10-09 07:48:37.347720516 +0000 UTC m=+148.040524524" Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.348627 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-4gncq" event={"ID":"5f316b42-23b5-4041-9dc3-3b95676339e5","Type":"ContainerStarted","Data":"61463345f7f0817bd395e80ba2b9ac461e551f213e683c67b9c0ef2173e06718"} Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.348705 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-4gncq" event={"ID":"5f316b42-23b5-4041-9dc3-3b95676339e5","Type":"ContainerStarted","Data":"b468773d3a5850719fdb407c032b89fce9b0df51d308068d0c6b18fc5e9f678b"} Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.356148 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w79tf" event={"ID":"2ca5fc0f-ee77-4a25-985e-8b50fbe3ddf0","Type":"ContainerStarted","Data":"aef194710c23d7b2a1d375bbc11cb97537a560ccb05e5eb78fdcede83876fed5"} Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.356221 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w79tf" 
event={"ID":"2ca5fc0f-ee77-4a25-985e-8b50fbe3ddf0","Type":"ContainerStarted","Data":"498b2ad11109c46885956b4acb1a66c7e2486d54f7aa19c2a34f35e4b30bb258"} Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.357886 4715 patch_prober.go:28] interesting pod/downloads-7954f5f757-sq956 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.357960 4715 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-sq956" podUID="b75a6e94-9a8f-4789-be04-be1dabfc37c7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.365336 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mn46" Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.366274 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nz9jb" Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.379054 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sw6f9" podStartSLOduration=126.379029128 podStartE2EDuration="2m6.379029128s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:48:37.377905776 +0000 UTC m=+148.070709794" watchObservedRunningTime="2025-10-09 07:48:37.379029128 +0000 UTC m=+148.071833136" Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.386061 4715 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ttk8w" Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.425640 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-7wthp" podStartSLOduration=8.425614916 podStartE2EDuration="8.425614916s" podCreationTimestamp="2025-10-09 07:48:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:48:37.42503765 +0000 UTC m=+148.117841658" watchObservedRunningTime="2025-10-09 07:48:37.425614916 +0000 UTC m=+148.118418924" Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.435185 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:37 crc kubenswrapper[4715]: E1009 07:48:37.435592 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:37.935562566 +0000 UTC m=+148.628366644 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.435690 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:37 crc kubenswrapper[4715]: E1009 07:48:37.437995 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:37.937978137 +0000 UTC m=+148.630782145 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.458535 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ttk8w" podStartSLOduration=126.458510585 podStartE2EDuration="2m6.458510585s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:48:37.456249639 +0000 UTC m=+148.149053647" watchObservedRunningTime="2025-10-09 07:48:37.458510585 +0000 UTC m=+148.151314593" Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.521017 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-xh68m" podStartSLOduration=126.520991897 podStartE2EDuration="2m6.520991897s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:48:37.484806982 +0000 UTC m=+148.177610990" watchObservedRunningTime="2025-10-09 07:48:37.520991897 +0000 UTC m=+148.213795895" Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.538523 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:37 crc kubenswrapper[4715]: E1009 07:48:37.539121 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:38.039099734 +0000 UTC m=+148.731903742 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.555620 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w79tf" podStartSLOduration=126.555593515 podStartE2EDuration="2m6.555593515s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:48:37.555471332 +0000 UTC m=+148.248275350" watchObservedRunningTime="2025-10-09 07:48:37.555593515 +0000 UTC m=+148.248397523" Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.582015 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-4gncq" podStartSLOduration=126.581987765 podStartE2EDuration="2m6.581987765s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:48:37.581332785 +0000 UTC m=+148.274136793" 
watchObservedRunningTime="2025-10-09 07:48:37.581987765 +0000 UTC m=+148.274791773" Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.637254 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5vwlz" Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.642006 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:37 crc kubenswrapper[4715]: E1009 07:48:37.642626 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:38.142606352 +0000 UTC m=+148.835410360 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.743178 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:37 crc kubenswrapper[4715]: E1009 07:48:37.743859 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:38.243834742 +0000 UTC m=+148.936638740 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.847048 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:37 crc kubenswrapper[4715]: E1009 07:48:37.847501 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:38.347487254 +0000 UTC m=+149.040291262 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.948006 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:37 crc kubenswrapper[4715]: E1009 07:48:37.948216 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:38.448176979 +0000 UTC m=+149.140980987 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.948327 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:37 crc kubenswrapper[4715]: E1009 07:48:37.948684 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:38.448671033 +0000 UTC m=+149.141475041 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:37 crc kubenswrapper[4715]: I1009 07:48:37.985132 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-dfctz" Oct 09 07:48:38 crc kubenswrapper[4715]: I1009 07:48:38.048972 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:38 crc kubenswrapper[4715]: E1009 07:48:38.049212 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:38.549174243 +0000 UTC m=+149.241978251 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:38 crc kubenswrapper[4715]: I1009 07:48:38.049294 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:38 crc kubenswrapper[4715]: E1009 07:48:38.049646 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:38.549632206 +0000 UTC m=+149.242436214 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:38 crc kubenswrapper[4715]: I1009 07:48:38.081733 4715 patch_prober.go:28] interesting pod/router-default-5444994796-b4lqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 07:48:38 crc kubenswrapper[4715]: [-]has-synced failed: reason withheld Oct 09 07:48:38 crc kubenswrapper[4715]: [+]process-running ok Oct 09 07:48:38 crc kubenswrapper[4715]: healthz check failed Oct 09 07:48:38 crc kubenswrapper[4715]: I1009 07:48:38.081882 4715 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-b4lqp" podUID="18a93eea-f768-41ac-ae21-1d29a90f5f66" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 07:48:38 crc kubenswrapper[4715]: I1009 07:48:38.150793 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:38 crc kubenswrapper[4715]: E1009 07:48:38.151078 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-09 07:48:38.651031122 +0000 UTC m=+149.343835130 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:38 crc kubenswrapper[4715]: I1009 07:48:38.151209 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:38 crc kubenswrapper[4715]: E1009 07:48:38.151728 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:38.651716142 +0000 UTC m=+149.344520330 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:38 crc kubenswrapper[4715]: I1009 07:48:38.252627 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:38 crc kubenswrapper[4715]: E1009 07:48:38.252826 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:38.752794539 +0000 UTC m=+149.445598537 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:38 crc kubenswrapper[4715]: I1009 07:48:38.252986 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:38 crc kubenswrapper[4715]: E1009 07:48:38.253316 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:38.753306674 +0000 UTC m=+149.446110682 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:38 crc kubenswrapper[4715]: I1009 07:48:38.354164 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:38 crc kubenswrapper[4715]: E1009 07:48:38.354335 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:38.854297328 +0000 UTC m=+149.547101336 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:38 crc kubenswrapper[4715]: I1009 07:48:38.354512 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:38 crc kubenswrapper[4715]: E1009 07:48:38.354884 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:38.854869804 +0000 UTC m=+149.547673812 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:38 crc kubenswrapper[4715]: I1009 07:48:38.360993 4715 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-hzbn9 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Oct 09 07:48:38 crc kubenswrapper[4715]: I1009 07:48:38.361080 4715 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-hzbn9" podUID="9040c858-1f6b-4900-b34e-c8b0b0c4c1ec" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" Oct 09 07:48:38 crc kubenswrapper[4715]: I1009 07:48:38.456184 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:38 crc kubenswrapper[4715]: E1009 07:48:38.456541 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:38.956487316 +0000 UTC m=+149.649291324 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:38 crc kubenswrapper[4715]: I1009 07:48:38.457558 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:38 crc kubenswrapper[4715]: E1009 07:48:38.459693 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:38.959675539 +0000 UTC m=+149.652479757 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:38 crc kubenswrapper[4715]: I1009 07:48:38.558694 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:38 crc kubenswrapper[4715]: E1009 07:48:38.558964 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:39.058925112 +0000 UTC m=+149.751729160 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:38 crc kubenswrapper[4715]: I1009 07:48:38.559243 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:38 crc kubenswrapper[4715]: E1009 07:48:38.559681 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:39.059666184 +0000 UTC m=+149.752470192 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:38 crc kubenswrapper[4715]: I1009 07:48:38.660533 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:38 crc kubenswrapper[4715]: E1009 07:48:38.660747 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:39.16071307 +0000 UTC m=+149.853517078 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:38 crc kubenswrapper[4715]: I1009 07:48:38.660814 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:38 crc kubenswrapper[4715]: E1009 07:48:38.661183 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:39.161166433 +0000 UTC m=+149.853970441 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:38 crc kubenswrapper[4715]: I1009 07:48:38.762544 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:38 crc kubenswrapper[4715]: E1009 07:48:38.762786 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:39.262742654 +0000 UTC m=+149.955546662 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:38 crc kubenswrapper[4715]: I1009 07:48:38.763128 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:38 crc kubenswrapper[4715]: E1009 07:48:38.763525 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:39.263505316 +0000 UTC m=+149.956309324 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:38 crc kubenswrapper[4715]: I1009 07:48:38.864471 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:38 crc kubenswrapper[4715]: E1009 07:48:38.864724 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:39.364686616 +0000 UTC m=+150.057490634 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:38 crc kubenswrapper[4715]: I1009 07:48:38.966454 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:38 crc kubenswrapper[4715]: I1009 07:48:38.966543 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:48:38 crc kubenswrapper[4715]: I1009 07:48:38.966599 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:48:38 crc kubenswrapper[4715]: E1009 07:48:38.966984 4715 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:39.466954947 +0000 UTC m=+150.159759135 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:38 crc kubenswrapper[4715]: I1009 07:48:38.967679 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:48:38 crc kubenswrapper[4715]: I1009 07:48:38.975209 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.068484 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:39 crc 
kubenswrapper[4715]: E1009 07:48:39.068704 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:39.568668842 +0000 UTC m=+150.261472850 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.069796 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.069954 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.070096 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:48:39 crc kubenswrapper[4715]: E1009 07:48:39.070175 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:39.570166905 +0000 UTC m=+150.262970903 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.076116 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.086647 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.091677 4715 patch_prober.go:28] interesting pod/router-default-5444994796-b4lqp container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 07:48:39 crc kubenswrapper[4715]: [-]has-synced failed: reason withheld Oct 09 07:48:39 crc kubenswrapper[4715]: [+]process-running ok Oct 09 07:48:39 crc kubenswrapper[4715]: healthz check failed Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.091763 4715 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-b4lqp" podUID="18a93eea-f768-41ac-ae21-1d29a90f5f66" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.104230 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xqkhk"] Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.105377 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xqkhk" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.110621 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.118825 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xqkhk"] Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.161834 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.171729 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.171962 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e13129f-063f-400f-b483-537273d66d74-utilities\") pod \"certified-operators-xqkhk\" (UID: \"4e13129f-063f-400f-b483-537273d66d74\") " pod="openshift-marketplace/certified-operators-xqkhk" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.172029 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vs9sm\" (UniqueName: \"kubernetes.io/projected/4e13129f-063f-400f-b483-537273d66d74-kube-api-access-vs9sm\") pod \"certified-operators-xqkhk\" (UID: \"4e13129f-063f-400f-b483-537273d66d74\") " pod="openshift-marketplace/certified-operators-xqkhk" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.172080 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e13129f-063f-400f-b483-537273d66d74-catalog-content\") pod \"certified-operators-xqkhk\" (UID: \"4e13129f-063f-400f-b483-537273d66d74\") " pod="openshift-marketplace/certified-operators-xqkhk" Oct 09 07:48:39 crc kubenswrapper[4715]: E1009 07:48:39.172185 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:39.672168669 +0000 UTC m=+150.364972677 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.179369 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.191694 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.274716 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e13129f-063f-400f-b483-537273d66d74-catalog-content\") pod \"certified-operators-xqkhk\" (UID: \"4e13129f-063f-400f-b483-537273d66d74\") " pod="openshift-marketplace/certified-operators-xqkhk" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.275190 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e13129f-063f-400f-b483-537273d66d74-utilities\") pod \"certified-operators-xqkhk\" (UID: \"4e13129f-063f-400f-b483-537273d66d74\") " pod="openshift-marketplace/certified-operators-xqkhk" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.275283 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vs9sm\" 
(UniqueName: \"kubernetes.io/projected/4e13129f-063f-400f-b483-537273d66d74-kube-api-access-vs9sm\") pod \"certified-operators-xqkhk\" (UID: \"4e13129f-063f-400f-b483-537273d66d74\") " pod="openshift-marketplace/certified-operators-xqkhk" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.275308 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.275323 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e13129f-063f-400f-b483-537273d66d74-catalog-content\") pod \"certified-operators-xqkhk\" (UID: \"4e13129f-063f-400f-b483-537273d66d74\") " pod="openshift-marketplace/certified-operators-xqkhk" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.275671 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e13129f-063f-400f-b483-537273d66d74-utilities\") pod \"certified-operators-xqkhk\" (UID: \"4e13129f-063f-400f-b483-537273d66d74\") " pod="openshift-marketplace/certified-operators-xqkhk" Oct 09 07:48:39 crc kubenswrapper[4715]: E1009 07:48:39.275739 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:39.775722557 +0000 UTC m=+150.468526565 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.297404 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6kpjv"] Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.302718 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6kpjv" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.305257 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.326638 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6kpjv"] Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.327719 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs9sm\" (UniqueName: \"kubernetes.io/projected/4e13129f-063f-400f-b483-537273d66d74-kube-api-access-vs9sm\") pod \"certified-operators-xqkhk\" (UID: \"4e13129f-063f-400f-b483-537273d66d74\") " pod="openshift-marketplace/certified-operators-xqkhk" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.390939 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:39 
crc kubenswrapper[4715]: I1009 07:48:39.391167 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ba28cf2-dac1-47a3-9efe-727e793c7afd-utilities\") pod \"community-operators-6kpjv\" (UID: \"8ba28cf2-dac1-47a3-9efe-727e793c7afd\") " pod="openshift-marketplace/community-operators-6kpjv" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.391198 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ba28cf2-dac1-47a3-9efe-727e793c7afd-catalog-content\") pod \"community-operators-6kpjv\" (UID: \"8ba28cf2-dac1-47a3-9efe-727e793c7afd\") " pod="openshift-marketplace/community-operators-6kpjv" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.391234 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5dfz\" (UniqueName: \"kubernetes.io/projected/8ba28cf2-dac1-47a3-9efe-727e793c7afd-kube-api-access-w5dfz\") pod \"community-operators-6kpjv\" (UID: \"8ba28cf2-dac1-47a3-9efe-727e793c7afd\") " pod="openshift-marketplace/community-operators-6kpjv" Oct 09 07:48:39 crc kubenswrapper[4715]: E1009 07:48:39.391302 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:39.891266745 +0000 UTC m=+150.584070753 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.393146 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-575dw" event={"ID":"8130c7b7-7b74-461e-8348-59345d86aa6b","Type":"ContainerStarted","Data":"66e6718c6700c5ff359955b312608176a7951e9d426468fad58739607e75f2c0"} Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.393223 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-575dw" event={"ID":"8130c7b7-7b74-461e-8348-59345d86aa6b","Type":"ContainerStarted","Data":"5a96a95f5bcac85a0169765b669face387f1c003256dc89650ad5197e1c5f843"} Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.395859 4715 generic.go:334] "Generic (PLEG): container finished" podID="b5488b64-893e-49f7-9de1-99905faf0d3b" containerID="dbdebe7fb6b1868bd19733b1a993714d1e80e69364e9cbbf110eadb09b967c0a" exitCode=0 Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.396245 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333265-qxg85" event={"ID":"b5488b64-893e-49f7-9de1-99905faf0d3b","Type":"ContainerDied","Data":"dbdebe7fb6b1868bd19733b1a993714d1e80e69364e9cbbf110eadb09b967c0a"} Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.465043 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xqkhk" Oct 09 07:48:39 crc kubenswrapper[4715]: W1009 07:48:39.468853 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-e098defba91afd1ed29c7f7a0ea4284e77f8e19a8cef199798eafec733eaefd1 WatchSource:0}: Error finding container e098defba91afd1ed29c7f7a0ea4284e77f8e19a8cef199798eafec733eaefd1: Status 404 returned error can't find the container with id e098defba91afd1ed29c7f7a0ea4284e77f8e19a8cef199798eafec733eaefd1 Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.487831 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7v8fg"] Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.488909 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7v8fg" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.495065 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5dfz\" (UniqueName: \"kubernetes.io/projected/8ba28cf2-dac1-47a3-9efe-727e793c7afd-kube-api-access-w5dfz\") pod \"community-operators-6kpjv\" (UID: \"8ba28cf2-dac1-47a3-9efe-727e793c7afd\") " pod="openshift-marketplace/community-operators-6kpjv" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.495176 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:39 crc kubenswrapper[4715]: E1009 07:48:39.496134 4715 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:39.996112362 +0000 UTC m=+150.688916370 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.496803 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ba28cf2-dac1-47a3-9efe-727e793c7afd-utilities\") pod \"community-operators-6kpjv\" (UID: \"8ba28cf2-dac1-47a3-9efe-727e793c7afd\") " pod="openshift-marketplace/community-operators-6kpjv" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.496919 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ba28cf2-dac1-47a3-9efe-727e793c7afd-catalog-content\") pod \"community-operators-6kpjv\" (UID: \"8ba28cf2-dac1-47a3-9efe-727e793c7afd\") " pod="openshift-marketplace/community-operators-6kpjv" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.497980 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ba28cf2-dac1-47a3-9efe-727e793c7afd-catalog-content\") pod \"community-operators-6kpjv\" (UID: \"8ba28cf2-dac1-47a3-9efe-727e793c7afd\") " pod="openshift-marketplace/community-operators-6kpjv" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.498045 4715 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ba28cf2-dac1-47a3-9efe-727e793c7afd-utilities\") pod \"community-operators-6kpjv\" (UID: \"8ba28cf2-dac1-47a3-9efe-727e793c7afd\") " pod="openshift-marketplace/community-operators-6kpjv" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.502307 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7v8fg"] Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.516810 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5dfz\" (UniqueName: \"kubernetes.io/projected/8ba28cf2-dac1-47a3-9efe-727e793c7afd-kube-api-access-w5dfz\") pod \"community-operators-6kpjv\" (UID: \"8ba28cf2-dac1-47a3-9efe-727e793c7afd\") " pod="openshift-marketplace/community-operators-6kpjv" Oct 09 07:48:39 crc kubenswrapper[4715]: W1009 07:48:39.536386 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-4dffa8337b3ea06f8caf7b300ca2f2f2a68d7e43fa3a3fd38685044d4c53d9e3 WatchSource:0}: Error finding container 4dffa8337b3ea06f8caf7b300ca2f2f2a68d7e43fa3a3fd38685044d4c53d9e3: Status 404 returned error can't find the container with id 4dffa8337b3ea06f8caf7b300ca2f2f2a68d7e43fa3a3fd38685044d4c53d9e3 Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.597742 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.598037 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/856c1393-8838-4544-9769-a1055c252169-catalog-content\") pod \"certified-operators-7v8fg\" (UID: \"856c1393-8838-4544-9769-a1055c252169\") " pod="openshift-marketplace/certified-operators-7v8fg" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.598097 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvl7x\" (UniqueName: \"kubernetes.io/projected/856c1393-8838-4544-9769-a1055c252169-kube-api-access-qvl7x\") pod \"certified-operators-7v8fg\" (UID: \"856c1393-8838-4544-9769-a1055c252169\") " pod="openshift-marketplace/certified-operators-7v8fg" Oct 09 07:48:39 crc kubenswrapper[4715]: E1009 07:48:39.598323 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:40.0982988 +0000 UTC m=+150.791102808 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.598493 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/856c1393-8838-4544-9769-a1055c252169-utilities\") pod \"certified-operators-7v8fg\" (UID: \"856c1393-8838-4544-9769-a1055c252169\") " pod="openshift-marketplace/certified-operators-7v8fg" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.598959 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:39 crc kubenswrapper[4715]: E1009 07:48:39.599745 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:40.099730662 +0000 UTC m=+150.792534670 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.603078 4715 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.661249 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6kpjv" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.703544 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.703926 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/856c1393-8838-4544-9769-a1055c252169-catalog-content\") pod \"certified-operators-7v8fg\" (UID: \"856c1393-8838-4544-9769-a1055c252169\") " pod="openshift-marketplace/certified-operators-7v8fg" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.703966 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvl7x\" (UniqueName: \"kubernetes.io/projected/856c1393-8838-4544-9769-a1055c252169-kube-api-access-qvl7x\") pod \"certified-operators-7v8fg\" (UID: 
\"856c1393-8838-4544-9769-a1055c252169\") " pod="openshift-marketplace/certified-operators-7v8fg" Oct 09 07:48:39 crc kubenswrapper[4715]: E1009 07:48:39.704005 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:40.203963981 +0000 UTC m=+150.896768139 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.704066 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/856c1393-8838-4544-9769-a1055c252169-utilities\") pod \"certified-operators-7v8fg\" (UID: \"856c1393-8838-4544-9769-a1055c252169\") " pod="openshift-marketplace/certified-operators-7v8fg" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.704974 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/856c1393-8838-4544-9769-a1055c252169-catalog-content\") pod \"certified-operators-7v8fg\" (UID: \"856c1393-8838-4544-9769-a1055c252169\") " pod="openshift-marketplace/certified-operators-7v8fg" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.705063 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/856c1393-8838-4544-9769-a1055c252169-utilities\") pod \"certified-operators-7v8fg\" (UID: 
\"856c1393-8838-4544-9769-a1055c252169\") " pod="openshift-marketplace/certified-operators-7v8fg" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.705665 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7f9jj"] Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.722987 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7f9jj" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.742756 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvl7x\" (UniqueName: \"kubernetes.io/projected/856c1393-8838-4544-9769-a1055c252169-kube-api-access-qvl7x\") pod \"certified-operators-7v8fg\" (UID: \"856c1393-8838-4544-9769-a1055c252169\") " pod="openshift-marketplace/certified-operators-7v8fg" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.759718 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7f9jj"] Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.806358 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa4e27ff-70ea-4095-b26a-e787d60bf751-utilities\") pod \"community-operators-7f9jj\" (UID: \"fa4e27ff-70ea-4095-b26a-e787d60bf751\") " pod="openshift-marketplace/community-operators-7f9jj" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.806477 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.806528 4715 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-968hh\" (UniqueName: \"kubernetes.io/projected/fa4e27ff-70ea-4095-b26a-e787d60bf751-kube-api-access-968hh\") pod \"community-operators-7f9jj\" (UID: \"fa4e27ff-70ea-4095-b26a-e787d60bf751\") " pod="openshift-marketplace/community-operators-7f9jj" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.806566 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa4e27ff-70ea-4095-b26a-e787d60bf751-catalog-content\") pod \"community-operators-7f9jj\" (UID: \"fa4e27ff-70ea-4095-b26a-e787d60bf751\") " pod="openshift-marketplace/community-operators-7f9jj" Oct 09 07:48:39 crc kubenswrapper[4715]: E1009 07:48:39.807057 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:40.307037985 +0000 UTC m=+150.999841993 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.813447 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xqkhk"] Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.830249 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7v8fg" Oct 09 07:48:39 crc kubenswrapper[4715]: W1009 07:48:39.835307 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e13129f_063f_400f_b483_537273d66d74.slice/crio-4414765f584e593c671cb30ecfb9bc8ccc3419ffd34d557357de41b5bd296a57 WatchSource:0}: Error finding container 4414765f584e593c671cb30ecfb9bc8ccc3419ffd34d557357de41b5bd296a57: Status 404 returned error can't find the container with id 4414765f584e593c671cb30ecfb9bc8ccc3419ffd34d557357de41b5bd296a57 Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.908344 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:39 crc kubenswrapper[4715]: E1009 07:48:39.908608 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:40.408568365 +0000 UTC m=+151.101372373 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.908677 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa4e27ff-70ea-4095-b26a-e787d60bf751-utilities\") pod \"community-operators-7f9jj\" (UID: \"fa4e27ff-70ea-4095-b26a-e787d60bf751\") " pod="openshift-marketplace/community-operators-7f9jj" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.908867 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.908954 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-968hh\" (UniqueName: \"kubernetes.io/projected/fa4e27ff-70ea-4095-b26a-e787d60bf751-kube-api-access-968hh\") pod \"community-operators-7f9jj\" (UID: \"fa4e27ff-70ea-4095-b26a-e787d60bf751\") " pod="openshift-marketplace/community-operators-7f9jj" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.909050 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa4e27ff-70ea-4095-b26a-e787d60bf751-catalog-content\") pod \"community-operators-7f9jj\" (UID: 
\"fa4e27ff-70ea-4095-b26a-e787d60bf751\") " pod="openshift-marketplace/community-operators-7f9jj" Oct 09 07:48:39 crc kubenswrapper[4715]: E1009 07:48:39.909258 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:40.409239144 +0000 UTC m=+151.102043152 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.909258 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa4e27ff-70ea-4095-b26a-e787d60bf751-utilities\") pod \"community-operators-7f9jj\" (UID: \"fa4e27ff-70ea-4095-b26a-e787d60bf751\") " pod="openshift-marketplace/community-operators-7f9jj" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.909553 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa4e27ff-70ea-4095-b26a-e787d60bf751-catalog-content\") pod \"community-operators-7f9jj\" (UID: \"fa4e27ff-70ea-4095-b26a-e787d60bf751\") " pod="openshift-marketplace/community-operators-7f9jj" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.936226 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-968hh\" (UniqueName: \"kubernetes.io/projected/fa4e27ff-70ea-4095-b26a-e787d60bf751-kube-api-access-968hh\") pod \"community-operators-7f9jj\" (UID: 
\"fa4e27ff-70ea-4095-b26a-e787d60bf751\") " pod="openshift-marketplace/community-operators-7f9jj" Oct 09 07:48:39 crc kubenswrapper[4715]: I1009 07:48:39.953294 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6kpjv"] Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.010918 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:40 crc kubenswrapper[4715]: E1009 07:48:40.011115 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:40.511081873 +0000 UTC m=+151.203885881 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.011770 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:40 crc kubenswrapper[4715]: E1009 07:48:40.012188 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:40.512173075 +0000 UTC m=+151.204977083 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.062683 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7f9jj" Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.081630 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.082377 4715 patch_prober.go:28] interesting pod/router-default-5444994796-b4lqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 07:48:40 crc kubenswrapper[4715]: [-]has-synced failed: reason withheld Oct 09 07:48:40 crc kubenswrapper[4715]: [+]process-running ok Oct 09 07:48:40 crc kubenswrapper[4715]: healthz check failed Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.082447 4715 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-b4lqp" podUID="18a93eea-f768-41ac-ae21-1d29a90f5f66" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.082567 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.090277 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.090977 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.097378 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.104898 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7v8fg"] Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.112682 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:40 crc kubenswrapper[4715]: E1009 07:48:40.112817 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-09 07:48:40.612773817 +0000 UTC m=+151.305577825 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.113052 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:40 crc kubenswrapper[4715]: E1009 07:48:40.113447 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:40.613438317 +0000 UTC m=+151.306242325 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.215014 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.215443 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cdfa926f-39ff-4fbf-8265-2ebf3bc796bb-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"cdfa926f-39ff-4fbf-8265-2ebf3bc796bb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.215486 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cdfa926f-39ff-4fbf-8265-2ebf3bc796bb-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"cdfa926f-39ff-4fbf-8265-2ebf3bc796bb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 07:48:40 crc kubenswrapper[4715]: E1009 07:48:40.215673 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-09 07:48:40.715628226 +0000 UTC m=+151.408432274 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.215841 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:40 crc kubenswrapper[4715]: E1009 07:48:40.220694 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-09 07:48:40.720652222 +0000 UTC m=+151.413456230 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-txj6v" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.295212 4715 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-09T07:48:39.603109311Z","Handler":null,"Name":""} Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.301951 4715 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.302009 4715 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.321183 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.326782 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cdfa926f-39ff-4fbf-8265-2ebf3bc796bb-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"cdfa926f-39ff-4fbf-8265-2ebf3bc796bb\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.326851 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cdfa926f-39ff-4fbf-8265-2ebf3bc796bb-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"cdfa926f-39ff-4fbf-8265-2ebf3bc796bb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.326917 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cdfa926f-39ff-4fbf-8265-2ebf3bc796bb-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"cdfa926f-39ff-4fbf-8265-2ebf3bc796bb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.329338 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.351181 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7f9jj"] Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.359273 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cdfa926f-39ff-4fbf-8265-2ebf3bc796bb-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"cdfa926f-39ff-4fbf-8265-2ebf3bc796bb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.404947 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"1e2bed06372b2b7be1aa8cd7c1551694e9cd3eb8221005ba2b53dfcabafb7572"} Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.405268 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"e098defba91afd1ed29c7f7a0ea4284e77f8e19a8cef199798eafec733eaefd1"} Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.405923 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7f9jj" event={"ID":"fa4e27ff-70ea-4095-b26a-e787d60bf751","Type":"ContainerStarted","Data":"90eb69fe03b364fa70cf270f631500412f60824b10660aa32e847ba1a29a5bce"} Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.407119 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.407241 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7v8fg" event={"ID":"856c1393-8838-4544-9769-a1055c252169","Type":"ContainerStarted","Data":"37b2512def9399ffa9b26fe6d7d92f71f5a7c33ae649ebf59ce4e28389995932"} Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.411545 4715 generic.go:334] "Generic (PLEG): container finished" podID="8ba28cf2-dac1-47a3-9efe-727e793c7afd" containerID="ca226250f6e8d32c2d578041e357084df7d6f07d373c89e1e85f8c8886ed1061" exitCode=0 Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.411605 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6kpjv" event={"ID":"8ba28cf2-dac1-47a3-9efe-727e793c7afd","Type":"ContainerDied","Data":"ca226250f6e8d32c2d578041e357084df7d6f07d373c89e1e85f8c8886ed1061"} Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.411655 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6kpjv" event={"ID":"8ba28cf2-dac1-47a3-9efe-727e793c7afd","Type":"ContainerStarted","Data":"100b2666857442dca3fa066d82574c5817fbfd95da32bafc9b64d23152a16224"} Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.413409 4715 generic.go:334] "Generic (PLEG): container finished" podID="4e13129f-063f-400f-b483-537273d66d74" containerID="2035ba782adda00452929c08cd4d85be1179f0fc7ffbe7d0882cc125801e87a0" exitCode=0 Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.413486 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xqkhk" event={"ID":"4e13129f-063f-400f-b483-537273d66d74","Type":"ContainerDied","Data":"2035ba782adda00452929c08cd4d85be1179f0fc7ffbe7d0882cc125801e87a0"} Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.413503 4715 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-xqkhk" event={"ID":"4e13129f-063f-400f-b483-537273d66d74","Type":"ContainerStarted","Data":"4414765f584e593c671cb30ecfb9bc8ccc3419ffd34d557357de41b5bd296a57"} Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.416727 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-575dw" event={"ID":"8130c7b7-7b74-461e-8348-59345d86aa6b","Type":"ContainerStarted","Data":"a3306ca1b5efd92076ddf0f350af053abe2ff23c160fcf93fcd3e9d3eba64e37"} Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.419576 4715 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.423039 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"0d3b78550c568f5e5c26cebeef5e5c3094c26331a353d62384d1ffe876f90d25"} Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.423098 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"4dffa8337b3ea06f8caf7b300ca2f2f2a68d7e43fa3a3fd38685044d4c53d9e3"} Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.434779 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.435964 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"5134aee2f4284aa5f253395bf890b236bbf3388ce80fef2361efc1253a42d0b8"} Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.436007 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"d64e49a67a652ef31b5fa7e5c616097b8187926917aac5e3b3c38a147c4d35f4"} Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.436674 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.506549 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-575dw" podStartSLOduration=11.506523274 podStartE2EDuration="11.506523274s" podCreationTimestamp="2025-10-09 07:48:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:48:40.500462008 +0000 UTC m=+151.193266026" watchObservedRunningTime="2025-10-09 07:48:40.506523274 +0000 UTC m=+151.199327282" Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.671109 4715 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.671188 4715 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.733022 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333265-qxg85" Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.784265 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-txj6v\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.840592 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b5488b64-893e-49f7-9de1-99905faf0d3b-secret-volume\") pod \"b5488b64-893e-49f7-9de1-99905faf0d3b\" (UID: \"b5488b64-893e-49f7-9de1-99905faf0d3b\") " Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.840654 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b5488b64-893e-49f7-9de1-99905faf0d3b-config-volume\") pod \"b5488b64-893e-49f7-9de1-99905faf0d3b\" (UID: \"b5488b64-893e-49f7-9de1-99905faf0d3b\") " Oct 09 
07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.840746 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbr4p\" (UniqueName: \"kubernetes.io/projected/b5488b64-893e-49f7-9de1-99905faf0d3b-kube-api-access-rbr4p\") pod \"b5488b64-893e-49f7-9de1-99905faf0d3b\" (UID: \"b5488b64-893e-49f7-9de1-99905faf0d3b\") " Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.841615 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5488b64-893e-49f7-9de1-99905faf0d3b-config-volume" (OuterVolumeSpecName: "config-volume") pod "b5488b64-893e-49f7-9de1-99905faf0d3b" (UID: "b5488b64-893e-49f7-9de1-99905faf0d3b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.846483 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5488b64-893e-49f7-9de1-99905faf0d3b-kube-api-access-rbr4p" (OuterVolumeSpecName: "kube-api-access-rbr4p") pod "b5488b64-893e-49f7-9de1-99905faf0d3b" (UID: "b5488b64-893e-49f7-9de1-99905faf0d3b"). InnerVolumeSpecName "kube-api-access-rbr4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.846802 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5488b64-893e-49f7-9de1-99905faf0d3b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b5488b64-893e-49f7-9de1-99905faf0d3b" (UID: "b5488b64-893e-49f7-9de1-99905faf0d3b"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.889524 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.944811 4715 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b5488b64-893e-49f7-9de1-99905faf0d3b-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.944849 4715 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b5488b64-893e-49f7-9de1-99905faf0d3b-config-volume\") on node \"crc\" DevicePath \"\"" Oct 09 07:48:40 crc kubenswrapper[4715]: I1009 07:48:40.944859 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbr4p\" (UniqueName: \"kubernetes.io/projected/b5488b64-893e-49f7-9de1-99905faf0d3b-kube-api-access-rbr4p\") on node \"crc\" DevicePath \"\"" Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.081980 4715 patch_prober.go:28] interesting pod/router-default-5444994796-b4lqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 07:48:41 crc kubenswrapper[4715]: [-]has-synced failed: reason withheld Oct 09 07:48:41 crc kubenswrapper[4715]: [+]process-running ok Oct 09 07:48:41 crc kubenswrapper[4715]: healthz check failed Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.082073 4715 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-b4lqp" podUID="18a93eea-f768-41ac-ae21-1d29a90f5f66" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.084407 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.295390 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vgjp2"] Oct 09 07:48:41 crc kubenswrapper[4715]: E1009 07:48:41.296013 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5488b64-893e-49f7-9de1-99905faf0d3b" containerName="collect-profiles" Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.296031 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5488b64-893e-49f7-9de1-99905faf0d3b" containerName="collect-profiles" Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.296157 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5488b64-893e-49f7-9de1-99905faf0d3b" containerName="collect-profiles" Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.297079 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vgjp2" Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.301974 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.308718 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vgjp2"] Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.339049 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-txj6v"] Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.444340 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" event={"ID":"915f6370-d5b2-4c9e-a1b1-c3146612b3ce","Type":"ContainerStarted","Data":"c91ec32804e7dd9724bc0abc250bfbd6877343c051977727a5ecc3e6ec7a7d9f"} Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.450541 4715 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcafd988-9f86-4c66-8fa7-ead624b101a0-utilities\") pod \"redhat-marketplace-vgjp2\" (UID: \"fcafd988-9f86-4c66-8fa7-ead624b101a0\") " pod="openshift-marketplace/redhat-marketplace-vgjp2" Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.450621 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcafd988-9f86-4c66-8fa7-ead624b101a0-catalog-content\") pod \"redhat-marketplace-vgjp2\" (UID: \"fcafd988-9f86-4c66-8fa7-ead624b101a0\") " pod="openshift-marketplace/redhat-marketplace-vgjp2" Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.451135 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wqvs\" (UniqueName: \"kubernetes.io/projected/fcafd988-9f86-4c66-8fa7-ead624b101a0-kube-api-access-7wqvs\") pod \"redhat-marketplace-vgjp2\" (UID: \"fcafd988-9f86-4c66-8fa7-ead624b101a0\") " pod="openshift-marketplace/redhat-marketplace-vgjp2" Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.451930 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333265-qxg85" Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.452793 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333265-qxg85" event={"ID":"b5488b64-893e-49f7-9de1-99905faf0d3b","Type":"ContainerDied","Data":"81695112128c47caf200c552442949a79c2ac56ac1dec50c669cb4bfbd24f596"} Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.452881 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81695112128c47caf200c552442949a79c2ac56ac1dec50c669cb4bfbd24f596" Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.456712 4715 generic.go:334] "Generic (PLEG): container finished" podID="fa4e27ff-70ea-4095-b26a-e787d60bf751" containerID="30b37245d901143b13f79d14c73f6d653a1450d86730a4369a8e78d497021cd0" exitCode=0 Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.456788 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7f9jj" event={"ID":"fa4e27ff-70ea-4095-b26a-e787d60bf751","Type":"ContainerDied","Data":"30b37245d901143b13f79d14c73f6d653a1450d86730a4369a8e78d497021cd0"} Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.462862 4715 generic.go:334] "Generic (PLEG): container finished" podID="856c1393-8838-4544-9769-a1055c252169" containerID="bc18cd608efd241915200069b1f390a8a628de5e97bbb807e06d3bfeb76f934b" exitCode=0 Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.462956 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7v8fg" event={"ID":"856c1393-8838-4544-9769-a1055c252169","Type":"ContainerDied","Data":"bc18cd608efd241915200069b1f390a8a628de5e97bbb807e06d3bfeb76f934b"} Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.470410 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"cdfa926f-39ff-4fbf-8265-2ebf3bc796bb","Type":"ContainerStarted","Data":"c42313337baa86d793c42f3dd16e50c34d10d1a0d713378cc6c1d869914daf3d"} Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.470532 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"cdfa926f-39ff-4fbf-8265-2ebf3bc796bb","Type":"ContainerStarted","Data":"fdae42b9dd57a27434d01ffba31d58dd9126d016710cd502f47b375ae8bd2d4b"} Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.554362 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcafd988-9f86-4c66-8fa7-ead624b101a0-utilities\") pod \"redhat-marketplace-vgjp2\" (UID: \"fcafd988-9f86-4c66-8fa7-ead624b101a0\") " pod="openshift-marketplace/redhat-marketplace-vgjp2" Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.554502 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcafd988-9f86-4c66-8fa7-ead624b101a0-catalog-content\") pod \"redhat-marketplace-vgjp2\" (UID: \"fcafd988-9f86-4c66-8fa7-ead624b101a0\") " pod="openshift-marketplace/redhat-marketplace-vgjp2" Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.554611 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wqvs\" (UniqueName: \"kubernetes.io/projected/fcafd988-9f86-4c66-8fa7-ead624b101a0-kube-api-access-7wqvs\") pod \"redhat-marketplace-vgjp2\" (UID: \"fcafd988-9f86-4c66-8fa7-ead624b101a0\") " pod="openshift-marketplace/redhat-marketplace-vgjp2" Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.555566 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcafd988-9f86-4c66-8fa7-ead624b101a0-catalog-content\") pod \"redhat-marketplace-vgjp2\" (UID: 
\"fcafd988-9f86-4c66-8fa7-ead624b101a0\") " pod="openshift-marketplace/redhat-marketplace-vgjp2" Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.556709 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcafd988-9f86-4c66-8fa7-ead624b101a0-utilities\") pod \"redhat-marketplace-vgjp2\" (UID: \"fcafd988-9f86-4c66-8fa7-ead624b101a0\") " pod="openshift-marketplace/redhat-marketplace-vgjp2" Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.581567 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wqvs\" (UniqueName: \"kubernetes.io/projected/fcafd988-9f86-4c66-8fa7-ead624b101a0-kube-api-access-7wqvs\") pod \"redhat-marketplace-vgjp2\" (UID: \"fcafd988-9f86-4c66-8fa7-ead624b101a0\") " pod="openshift-marketplace/redhat-marketplace-vgjp2" Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.621139 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vgjp2" Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.621979 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-5fdhg" Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.623131 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-5fdhg" Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.628698 4715 patch_prober.go:28] interesting pod/console-f9d7485db-5fdhg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.628756 4715 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-5fdhg" podUID="3c1c9983-60a8-4db2-866c-15deb7220cb9" containerName="console" 
probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.695087 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-76vrx"] Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.696164 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-76vrx" Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.721288 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-76vrx"] Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.861339 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zdzk\" (UniqueName: \"kubernetes.io/projected/aa605d89-538d-40f1-8aea-e397b0667ff9-kube-api-access-4zdzk\") pod \"redhat-marketplace-76vrx\" (UID: \"aa605d89-538d-40f1-8aea-e397b0667ff9\") " pod="openshift-marketplace/redhat-marketplace-76vrx" Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.861720 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa605d89-538d-40f1-8aea-e397b0667ff9-utilities\") pod \"redhat-marketplace-76vrx\" (UID: \"aa605d89-538d-40f1-8aea-e397b0667ff9\") " pod="openshift-marketplace/redhat-marketplace-76vrx" Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.861755 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa605d89-538d-40f1-8aea-e397b0667ff9-catalog-content\") pod \"redhat-marketplace-76vrx\" (UID: \"aa605d89-538d-40f1-8aea-e397b0667ff9\") " pod="openshift-marketplace/redhat-marketplace-76vrx" Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.875957 4715 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vgjp2"] Oct 09 07:48:41 crc kubenswrapper[4715]: W1009 07:48:41.888519 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcafd988_9f86_4c66_8fa7_ead624b101a0.slice/crio-3c92cb8fabe6c9319f79637e4a239ac745b9de0618e35e41618c6601dfe94f7f WatchSource:0}: Error finding container 3c92cb8fabe6c9319f79637e4a239ac745b9de0618e35e41618c6601dfe94f7f: Status 404 returned error can't find the container with id 3c92cb8fabe6c9319f79637e4a239ac745b9de0618e35e41618c6601dfe94f7f Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.963479 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa605d89-538d-40f1-8aea-e397b0667ff9-catalog-content\") pod \"redhat-marketplace-76vrx\" (UID: \"aa605d89-538d-40f1-8aea-e397b0667ff9\") " pod="openshift-marketplace/redhat-marketplace-76vrx" Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.963592 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zdzk\" (UniqueName: \"kubernetes.io/projected/aa605d89-538d-40f1-8aea-e397b0667ff9-kube-api-access-4zdzk\") pod \"redhat-marketplace-76vrx\" (UID: \"aa605d89-538d-40f1-8aea-e397b0667ff9\") " pod="openshift-marketplace/redhat-marketplace-76vrx" Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.963616 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa605d89-538d-40f1-8aea-e397b0667ff9-utilities\") pod \"redhat-marketplace-76vrx\" (UID: \"aa605d89-538d-40f1-8aea-e397b0667ff9\") " pod="openshift-marketplace/redhat-marketplace-76vrx" Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.964362 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/aa605d89-538d-40f1-8aea-e397b0667ff9-utilities\") pod \"redhat-marketplace-76vrx\" (UID: \"aa605d89-538d-40f1-8aea-e397b0667ff9\") " pod="openshift-marketplace/redhat-marketplace-76vrx" Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.964482 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa605d89-538d-40f1-8aea-e397b0667ff9-catalog-content\") pod \"redhat-marketplace-76vrx\" (UID: \"aa605d89-538d-40f1-8aea-e397b0667ff9\") " pod="openshift-marketplace/redhat-marketplace-76vrx" Oct 09 07:48:41 crc kubenswrapper[4715]: I1009 07:48:41.988255 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zdzk\" (UniqueName: \"kubernetes.io/projected/aa605d89-538d-40f1-8aea-e397b0667ff9-kube-api-access-4zdzk\") pod \"redhat-marketplace-76vrx\" (UID: \"aa605d89-538d-40f1-8aea-e397b0667ff9\") " pod="openshift-marketplace/redhat-marketplace-76vrx" Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.024031 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-76vrx" Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.077479 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-b4lqp" Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.081240 4715 patch_prober.go:28] interesting pod/router-default-5444994796-b4lqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 07:48:42 crc kubenswrapper[4715]: [-]has-synced failed: reason withheld Oct 09 07:48:42 crc kubenswrapper[4715]: [+]process-running ok Oct 09 07:48:42 crc kubenswrapper[4715]: healthz check failed Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.081330 4715 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-b4lqp" podUID="18a93eea-f768-41ac-ae21-1d29a90f5f66" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.147125 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.255557 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-xh68m" Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.255861 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-xh68m" Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.264545 4715 patch_prober.go:28] interesting pod/apiserver-76f77b778f-xh68m container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[+]ping ok Oct 09 07:48:42 crc kubenswrapper[4715]: [+]log ok Oct 09 07:48:42 crc kubenswrapper[4715]: [+]etcd ok Oct 09 07:48:42 crc kubenswrapper[4715]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 09 07:48:42 crc kubenswrapper[4715]: [+]poststarthook/generic-apiserver-start-informers ok Oct 09 07:48:42 crc kubenswrapper[4715]: [+]poststarthook/max-in-flight-filter ok Oct 09 07:48:42 crc kubenswrapper[4715]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 09 07:48:42 crc kubenswrapper[4715]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 09 07:48:42 crc kubenswrapper[4715]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Oct 09 07:48:42 crc kubenswrapper[4715]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 09 07:48:42 crc kubenswrapper[4715]: [+]poststarthook/project.openshift.io-projectcache ok Oct 09 07:48:42 crc kubenswrapper[4715]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 09 07:48:42 crc kubenswrapper[4715]: [+]poststarthook/openshift.io-startinformers ok Oct 09 07:48:42 crc kubenswrapper[4715]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 09 07:48:42 crc kubenswrapper[4715]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 09 07:48:42 crc kubenswrapper[4715]: livez check failed Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.264591 4715 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-xh68m" podUID="3da117f6-b889-480f-b74b-5841bc551658" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.268232 4715 patch_prober.go:28] interesting pod/downloads-7954f5f757-sq956 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: 
connection refused" start-of-body= Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.268283 4715 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-sq956" podUID="b75a6e94-9a8f-4789-be04-be1dabfc37c7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.270289 4715 patch_prober.go:28] interesting pod/downloads-7954f5f757-sq956 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.270330 4715 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-sq956" podUID="b75a6e94-9a8f-4789-be04-be1dabfc37c7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.286840 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-76vrx"] Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.302719 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5njnt"] Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.307622 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5njnt" Oct 09 07:48:42 crc kubenswrapper[4715]: W1009 07:48:42.314963 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa605d89_538d_40f1_8aea_e397b0667ff9.slice/crio-24bfd38fd7078eb56f11a55aac23381f2bbc7a5397c01da2f14c6b28b4db894b WatchSource:0}: Error finding container 24bfd38fd7078eb56f11a55aac23381f2bbc7a5397c01da2f14c6b28b4db894b: Status 404 returned error can't find the container with id 24bfd38fd7078eb56f11a55aac23381f2bbc7a5397c01da2f14c6b28b4db894b Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.317160 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.342705 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5njnt"] Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.471389 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77b33330-3f7f-4bae-96c1-6558143109f2-catalog-content\") pod \"redhat-operators-5njnt\" (UID: \"77b33330-3f7f-4bae-96c1-6558143109f2\") " pod="openshift-marketplace/redhat-operators-5njnt" Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.471469 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77b33330-3f7f-4bae-96c1-6558143109f2-utilities\") pod \"redhat-operators-5njnt\" (UID: \"77b33330-3f7f-4bae-96c1-6558143109f2\") " pod="openshift-marketplace/redhat-operators-5njnt" Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.471603 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhv8c\" (UniqueName: 
\"kubernetes.io/projected/77b33330-3f7f-4bae-96c1-6558143109f2-kube-api-access-qhv8c\") pod \"redhat-operators-5njnt\" (UID: \"77b33330-3f7f-4bae-96c1-6558143109f2\") " pod="openshift-marketplace/redhat-operators-5njnt" Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.484676 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-76vrx" event={"ID":"aa605d89-538d-40f1-8aea-e397b0667ff9","Type":"ContainerStarted","Data":"24bfd38fd7078eb56f11a55aac23381f2bbc7a5397c01da2f14c6b28b4db894b"} Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.494022 4715 generic.go:334] "Generic (PLEG): container finished" podID="fcafd988-9f86-4c66-8fa7-ead624b101a0" containerID="8d0bcc910edbf39872066b8e4cbd48854839d61c031d0d6563cdca4128e80b0b" exitCode=0 Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.494127 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vgjp2" event={"ID":"fcafd988-9f86-4c66-8fa7-ead624b101a0","Type":"ContainerDied","Data":"8d0bcc910edbf39872066b8e4cbd48854839d61c031d0d6563cdca4128e80b0b"} Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.494196 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vgjp2" event={"ID":"fcafd988-9f86-4c66-8fa7-ead624b101a0","Type":"ContainerStarted","Data":"3c92cb8fabe6c9319f79637e4a239ac745b9de0618e35e41618c6601dfe94f7f"} Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.497960 4715 generic.go:334] "Generic (PLEG): container finished" podID="cdfa926f-39ff-4fbf-8265-2ebf3bc796bb" containerID="c42313337baa86d793c42f3dd16e50c34d10d1a0d713378cc6c1d869914daf3d" exitCode=0 Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.498002 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"cdfa926f-39ff-4fbf-8265-2ebf3bc796bb","Type":"ContainerDied","Data":"c42313337baa86d793c42f3dd16e50c34d10d1a0d713378cc6c1d869914daf3d"} Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.504226 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" event={"ID":"915f6370-d5b2-4c9e-a1b1-c3146612b3ce","Type":"ContainerStarted","Data":"2f7edcb33eafe2e6058db98cb4fa8af63c0f54dfdef938cb82ee47e42789d67c"} Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.533605 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" podStartSLOduration=131.533577843 podStartE2EDuration="2m11.533577843s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:48:42.530500654 +0000 UTC m=+153.223304682" watchObservedRunningTime="2025-10-09 07:48:42.533577843 +0000 UTC m=+153.226381851" Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.540609 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-hzbn9" Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.573136 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhv8c\" (UniqueName: \"kubernetes.io/projected/77b33330-3f7f-4bae-96c1-6558143109f2-kube-api-access-qhv8c\") pod \"redhat-operators-5njnt\" (UID: \"77b33330-3f7f-4bae-96c1-6558143109f2\") " pod="openshift-marketplace/redhat-operators-5njnt" Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.573233 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77b33330-3f7f-4bae-96c1-6558143109f2-catalog-content\") pod \"redhat-operators-5njnt\" (UID: 
\"77b33330-3f7f-4bae-96c1-6558143109f2\") " pod="openshift-marketplace/redhat-operators-5njnt" Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.573255 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77b33330-3f7f-4bae-96c1-6558143109f2-utilities\") pod \"redhat-operators-5njnt\" (UID: \"77b33330-3f7f-4bae-96c1-6558143109f2\") " pod="openshift-marketplace/redhat-operators-5njnt" Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.573727 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77b33330-3f7f-4bae-96c1-6558143109f2-utilities\") pod \"redhat-operators-5njnt\" (UID: \"77b33330-3f7f-4bae-96c1-6558143109f2\") " pod="openshift-marketplace/redhat-operators-5njnt" Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.574947 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77b33330-3f7f-4bae-96c1-6558143109f2-catalog-content\") pod \"redhat-operators-5njnt\" (UID: \"77b33330-3f7f-4bae-96c1-6558143109f2\") " pod="openshift-marketplace/redhat-operators-5njnt" Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.611910 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhv8c\" (UniqueName: \"kubernetes.io/projected/77b33330-3f7f-4bae-96c1-6558143109f2-kube-api-access-qhv8c\") pod \"redhat-operators-5njnt\" (UID: \"77b33330-3f7f-4bae-96c1-6558143109f2\") " pod="openshift-marketplace/redhat-operators-5njnt" Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.647514 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5njnt" Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.696904 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-glnfd"] Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.700521 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-glnfd" Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.703392 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-glnfd"] Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.776232 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdhr8\" (UniqueName: \"kubernetes.io/projected/86ea1b53-8025-4b37-9b58-a1a03ddbbfaa-kube-api-access-gdhr8\") pod \"redhat-operators-glnfd\" (UID: \"86ea1b53-8025-4b37-9b58-a1a03ddbbfaa\") " pod="openshift-marketplace/redhat-operators-glnfd" Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.776683 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86ea1b53-8025-4b37-9b58-a1a03ddbbfaa-catalog-content\") pod \"redhat-operators-glnfd\" (UID: \"86ea1b53-8025-4b37-9b58-a1a03ddbbfaa\") " pod="openshift-marketplace/redhat-operators-glnfd" Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.776715 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86ea1b53-8025-4b37-9b58-a1a03ddbbfaa-utilities\") pod \"redhat-operators-glnfd\" (UID: \"86ea1b53-8025-4b37-9b58-a1a03ddbbfaa\") " pod="openshift-marketplace/redhat-operators-glnfd" Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.877688 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/86ea1b53-8025-4b37-9b58-a1a03ddbbfaa-catalog-content\") pod \"redhat-operators-glnfd\" (UID: \"86ea1b53-8025-4b37-9b58-a1a03ddbbfaa\") " pod="openshift-marketplace/redhat-operators-glnfd" Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.877735 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86ea1b53-8025-4b37-9b58-a1a03ddbbfaa-utilities\") pod \"redhat-operators-glnfd\" (UID: \"86ea1b53-8025-4b37-9b58-a1a03ddbbfaa\") " pod="openshift-marketplace/redhat-operators-glnfd" Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.877784 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdhr8\" (UniqueName: \"kubernetes.io/projected/86ea1b53-8025-4b37-9b58-a1a03ddbbfaa-kube-api-access-gdhr8\") pod \"redhat-operators-glnfd\" (UID: \"86ea1b53-8025-4b37-9b58-a1a03ddbbfaa\") " pod="openshift-marketplace/redhat-operators-glnfd" Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.878347 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86ea1b53-8025-4b37-9b58-a1a03ddbbfaa-catalog-content\") pod \"redhat-operators-glnfd\" (UID: \"86ea1b53-8025-4b37-9b58-a1a03ddbbfaa\") " pod="openshift-marketplace/redhat-operators-glnfd" Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.881262 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86ea1b53-8025-4b37-9b58-a1a03ddbbfaa-utilities\") pod \"redhat-operators-glnfd\" (UID: \"86ea1b53-8025-4b37-9b58-a1a03ddbbfaa\") " pod="openshift-marketplace/redhat-operators-glnfd" Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.896672 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdhr8\" (UniqueName: 
\"kubernetes.io/projected/86ea1b53-8025-4b37-9b58-a1a03ddbbfaa-kube-api-access-gdhr8\") pod \"redhat-operators-glnfd\" (UID: \"86ea1b53-8025-4b37-9b58-a1a03ddbbfaa\") " pod="openshift-marketplace/redhat-operators-glnfd" Oct 09 07:48:42 crc kubenswrapper[4715]: I1009 07:48:42.937751 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5njnt"] Oct 09 07:48:43 crc kubenswrapper[4715]: I1009 07:48:43.022597 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-glnfd" Oct 09 07:48:43 crc kubenswrapper[4715]: I1009 07:48:43.082396 4715 patch_prober.go:28] interesting pod/router-default-5444994796-b4lqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 07:48:43 crc kubenswrapper[4715]: [-]has-synced failed: reason withheld Oct 09 07:48:43 crc kubenswrapper[4715]: [+]process-running ok Oct 09 07:48:43 crc kubenswrapper[4715]: healthz check failed Oct 09 07:48:43 crc kubenswrapper[4715]: I1009 07:48:43.082918 4715 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-b4lqp" podUID="18a93eea-f768-41ac-ae21-1d29a90f5f66" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 07:48:43 crc kubenswrapper[4715]: I1009 07:48:43.432158 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 09 07:48:43 crc kubenswrapper[4715]: I1009 07:48:43.433144 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 07:48:43 crc kubenswrapper[4715]: I1009 07:48:43.434382 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 09 07:48:43 crc kubenswrapper[4715]: I1009 07:48:43.435781 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 09 07:48:43 crc kubenswrapper[4715]: I1009 07:48:43.441206 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 09 07:48:43 crc kubenswrapper[4715]: I1009 07:48:43.530827 4715 generic.go:334] "Generic (PLEG): container finished" podID="aa605d89-538d-40f1-8aea-e397b0667ff9" containerID="c9fda2d11e521a81d4fe6d308e5fc0cca944c8bfbf27acb5878e72c36f91a812" exitCode=0 Oct 09 07:48:43 crc kubenswrapper[4715]: I1009 07:48:43.530961 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-76vrx" event={"ID":"aa605d89-538d-40f1-8aea-e397b0667ff9","Type":"ContainerDied","Data":"c9fda2d11e521a81d4fe6d308e5fc0cca944c8bfbf27acb5878e72c36f91a812"} Oct 09 07:48:43 crc kubenswrapper[4715]: I1009 07:48:43.542196 4715 generic.go:334] "Generic (PLEG): container finished" podID="77b33330-3f7f-4bae-96c1-6558143109f2" containerID="e24f7b935f43b984724c31319f2feed9313d37aee23fafe6b7c2e7d44c118b66" exitCode=0 Oct 09 07:48:43 crc kubenswrapper[4715]: I1009 07:48:43.542577 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5njnt" event={"ID":"77b33330-3f7f-4bae-96c1-6558143109f2","Type":"ContainerDied","Data":"e24f7b935f43b984724c31319f2feed9313d37aee23fafe6b7c2e7d44c118b66"} Oct 09 07:48:43 crc kubenswrapper[4715]: I1009 07:48:43.542667 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5njnt" 
event={"ID":"77b33330-3f7f-4bae-96c1-6558143109f2","Type":"ContainerStarted","Data":"a9025a13547ce8d208f04323eacdb28ee8c4459fe15a2bcf3c079a6030908717"} Oct 09 07:48:43 crc kubenswrapper[4715]: I1009 07:48:43.543126 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:48:43 crc kubenswrapper[4715]: I1009 07:48:43.597903 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f9da3b25-553e-4eb7-83b3-f5a74f6a1320-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"f9da3b25-553e-4eb7-83b3-f5a74f6a1320\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 07:48:43 crc kubenswrapper[4715]: I1009 07:48:43.598029 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f9da3b25-553e-4eb7-83b3-f5a74f6a1320-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"f9da3b25-553e-4eb7-83b3-f5a74f6a1320\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 07:48:43 crc kubenswrapper[4715]: I1009 07:48:43.603913 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-glnfd"] Oct 09 07:48:43 crc kubenswrapper[4715]: I1009 07:48:43.699767 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f9da3b25-553e-4eb7-83b3-f5a74f6a1320-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"f9da3b25-553e-4eb7-83b3-f5a74f6a1320\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 07:48:43 crc kubenswrapper[4715]: I1009 07:48:43.699902 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f9da3b25-553e-4eb7-83b3-f5a74f6a1320-kube-api-access\") pod 
\"revision-pruner-8-crc\" (UID: \"f9da3b25-553e-4eb7-83b3-f5a74f6a1320\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 07:48:43 crc kubenswrapper[4715]: I1009 07:48:43.704280 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f9da3b25-553e-4eb7-83b3-f5a74f6a1320-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"f9da3b25-553e-4eb7-83b3-f5a74f6a1320\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 07:48:43 crc kubenswrapper[4715]: I1009 07:48:43.750454 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f9da3b25-553e-4eb7-83b3-f5a74f6a1320-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"f9da3b25-553e-4eb7-83b3-f5a74f6a1320\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 07:48:43 crc kubenswrapper[4715]: I1009 07:48:43.760448 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 07:48:43 crc kubenswrapper[4715]: I1009 07:48:43.814303 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 07:48:43 crc kubenswrapper[4715]: I1009 07:48:43.905888 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cdfa926f-39ff-4fbf-8265-2ebf3bc796bb-kube-api-access\") pod \"cdfa926f-39ff-4fbf-8265-2ebf3bc796bb\" (UID: \"cdfa926f-39ff-4fbf-8265-2ebf3bc796bb\") " Oct 09 07:48:43 crc kubenswrapper[4715]: I1009 07:48:43.905933 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cdfa926f-39ff-4fbf-8265-2ebf3bc796bb-kubelet-dir\") pod \"cdfa926f-39ff-4fbf-8265-2ebf3bc796bb\" (UID: \"cdfa926f-39ff-4fbf-8265-2ebf3bc796bb\") " Oct 09 07:48:43 crc kubenswrapper[4715]: I1009 07:48:43.906409 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cdfa926f-39ff-4fbf-8265-2ebf3bc796bb-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "cdfa926f-39ff-4fbf-8265-2ebf3bc796bb" (UID: "cdfa926f-39ff-4fbf-8265-2ebf3bc796bb"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 07:48:43 crc kubenswrapper[4715]: I1009 07:48:43.909732 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdfa926f-39ff-4fbf-8265-2ebf3bc796bb-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "cdfa926f-39ff-4fbf-8265-2ebf3bc796bb" (UID: "cdfa926f-39ff-4fbf-8265-2ebf3bc796bb"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:48:44 crc kubenswrapper[4715]: I1009 07:48:44.007564 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cdfa926f-39ff-4fbf-8265-2ebf3bc796bb-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 09 07:48:44 crc kubenswrapper[4715]: I1009 07:48:44.007994 4715 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cdfa926f-39ff-4fbf-8265-2ebf3bc796bb-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 09 07:48:44 crc kubenswrapper[4715]: I1009 07:48:44.101622 4715 patch_prober.go:28] interesting pod/router-default-5444994796-b4lqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 07:48:44 crc kubenswrapper[4715]: [-]has-synced failed: reason withheld Oct 09 07:48:44 crc kubenswrapper[4715]: [+]process-running ok Oct 09 07:48:44 crc kubenswrapper[4715]: healthz check failed Oct 09 07:48:44 crc kubenswrapper[4715]: I1009 07:48:44.101713 4715 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-b4lqp" podUID="18a93eea-f768-41ac-ae21-1d29a90f5f66" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 07:48:44 crc kubenswrapper[4715]: I1009 07:48:44.281179 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 09 07:48:44 crc kubenswrapper[4715]: W1009 07:48:44.303697 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf9da3b25_553e_4eb7_83b3_f5a74f6a1320.slice/crio-0755228f6eaea82c07bb8a7823c5f4111571139fafaf84a5876bb05f023ba5df WatchSource:0}: Error finding container 0755228f6eaea82c07bb8a7823c5f4111571139fafaf84a5876bb05f023ba5df: Status 404 returned error 
can't find the container with id 0755228f6eaea82c07bb8a7823c5f4111571139fafaf84a5876bb05f023ba5df Oct 09 07:48:44 crc kubenswrapper[4715]: I1009 07:48:44.570433 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"cdfa926f-39ff-4fbf-8265-2ebf3bc796bb","Type":"ContainerDied","Data":"fdae42b9dd57a27434d01ffba31d58dd9126d016710cd502f47b375ae8bd2d4b"} Oct 09 07:48:44 crc kubenswrapper[4715]: I1009 07:48:44.570475 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdae42b9dd57a27434d01ffba31d58dd9126d016710cd502f47b375ae8bd2d4b" Oct 09 07:48:44 crc kubenswrapper[4715]: I1009 07:48:44.570518 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 09 07:48:44 crc kubenswrapper[4715]: I1009 07:48:44.599393 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"f9da3b25-553e-4eb7-83b3-f5a74f6a1320","Type":"ContainerStarted","Data":"0755228f6eaea82c07bb8a7823c5f4111571139fafaf84a5876bb05f023ba5df"} Oct 09 07:48:44 crc kubenswrapper[4715]: I1009 07:48:44.608138 4715 generic.go:334] "Generic (PLEG): container finished" podID="86ea1b53-8025-4b37-9b58-a1a03ddbbfaa" containerID="425999451b01930dbf828a7a7070710b154c47fa846e019540201f744489ef64" exitCode=0 Oct 09 07:48:44 crc kubenswrapper[4715]: I1009 07:48:44.608260 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-glnfd" event={"ID":"86ea1b53-8025-4b37-9b58-a1a03ddbbfaa","Type":"ContainerDied","Data":"425999451b01930dbf828a7a7070710b154c47fa846e019540201f744489ef64"} Oct 09 07:48:44 crc kubenswrapper[4715]: I1009 07:48:44.608304 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-glnfd" 
event={"ID":"86ea1b53-8025-4b37-9b58-a1a03ddbbfaa","Type":"ContainerStarted","Data":"34afeb0176671510fd6128d76d2866aac25541405a95a15469f24f63469c4008"} Oct 09 07:48:45 crc kubenswrapper[4715]: I1009 07:48:45.081825 4715 patch_prober.go:28] interesting pod/router-default-5444994796-b4lqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 07:48:45 crc kubenswrapper[4715]: [-]has-synced failed: reason withheld Oct 09 07:48:45 crc kubenswrapper[4715]: [+]process-running ok Oct 09 07:48:45 crc kubenswrapper[4715]: healthz check failed Oct 09 07:48:45 crc kubenswrapper[4715]: I1009 07:48:45.081916 4715 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-b4lqp" podUID="18a93eea-f768-41ac-ae21-1d29a90f5f66" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 07:48:45 crc kubenswrapper[4715]: I1009 07:48:45.630014 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"f9da3b25-553e-4eb7-83b3-f5a74f6a1320","Type":"ContainerStarted","Data":"b78af300805386336dba7e35d0c03739fafeada1a97528204afc8e52948cf85b"} Oct 09 07:48:45 crc kubenswrapper[4715]: I1009 07:48:45.651684 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.651659585 podStartE2EDuration="2.651659585s" podCreationTimestamp="2025-10-09 07:48:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:48:45.651213972 +0000 UTC m=+156.344017980" watchObservedRunningTime="2025-10-09 07:48:45.651659585 +0000 UTC m=+156.344463593" Oct 09 07:48:46 crc kubenswrapper[4715]: I1009 07:48:46.081008 4715 patch_prober.go:28] interesting 
pod/router-default-5444994796-b4lqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 07:48:46 crc kubenswrapper[4715]: [-]has-synced failed: reason withheld Oct 09 07:48:46 crc kubenswrapper[4715]: [+]process-running ok Oct 09 07:48:46 crc kubenswrapper[4715]: healthz check failed Oct 09 07:48:46 crc kubenswrapper[4715]: I1009 07:48:46.081379 4715 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-b4lqp" podUID="18a93eea-f768-41ac-ae21-1d29a90f5f66" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 07:48:46 crc kubenswrapper[4715]: I1009 07:48:46.645159 4715 generic.go:334] "Generic (PLEG): container finished" podID="f9da3b25-553e-4eb7-83b3-f5a74f6a1320" containerID="b78af300805386336dba7e35d0c03739fafeada1a97528204afc8e52948cf85b" exitCode=0 Oct 09 07:48:46 crc kubenswrapper[4715]: I1009 07:48:46.645242 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"f9da3b25-553e-4eb7-83b3-f5a74f6a1320","Type":"ContainerDied","Data":"b78af300805386336dba7e35d0c03739fafeada1a97528204afc8e52948cf85b"} Oct 09 07:48:46 crc kubenswrapper[4715]: I1009 07:48:46.753953 4715 patch_prober.go:28] interesting pod/machine-config-daemon-k7vwx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 07:48:46 crc kubenswrapper[4715]: I1009 07:48:46.754071 4715 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 07:48:47 crc kubenswrapper[4715]: I1009 07:48:47.080741 4715 patch_prober.go:28] interesting pod/router-default-5444994796-b4lqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 07:48:47 crc kubenswrapper[4715]: [-]has-synced failed: reason withheld Oct 09 07:48:47 crc kubenswrapper[4715]: [+]process-running ok Oct 09 07:48:47 crc kubenswrapper[4715]: healthz check failed Oct 09 07:48:47 crc kubenswrapper[4715]: I1009 07:48:47.080814 4715 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-b4lqp" podUID="18a93eea-f768-41ac-ae21-1d29a90f5f66" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 07:48:47 crc kubenswrapper[4715]: I1009 07:48:47.260016 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-xh68m" Oct 09 07:48:47 crc kubenswrapper[4715]: I1009 07:48:47.265324 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-xh68m" Oct 09 07:48:47 crc kubenswrapper[4715]: I1009 07:48:47.577319 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-7wthp" Oct 09 07:48:48 crc kubenswrapper[4715]: I1009 07:48:48.081621 4715 patch_prober.go:28] interesting pod/router-default-5444994796-b4lqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 07:48:48 crc kubenswrapper[4715]: [-]has-synced failed: reason withheld Oct 09 07:48:48 crc kubenswrapper[4715]: [+]process-running ok Oct 09 07:48:48 crc kubenswrapper[4715]: healthz check failed Oct 09 07:48:48 crc kubenswrapper[4715]: 
I1009 07:48:48.082205 4715 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-b4lqp" podUID="18a93eea-f768-41ac-ae21-1d29a90f5f66" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 07:48:49 crc kubenswrapper[4715]: I1009 07:48:49.081357 4715 patch_prober.go:28] interesting pod/router-default-5444994796-b4lqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 07:48:49 crc kubenswrapper[4715]: [-]has-synced failed: reason withheld Oct 09 07:48:49 crc kubenswrapper[4715]: [+]process-running ok Oct 09 07:48:49 crc kubenswrapper[4715]: healthz check failed Oct 09 07:48:49 crc kubenswrapper[4715]: I1009 07:48:49.081457 4715 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-b4lqp" podUID="18a93eea-f768-41ac-ae21-1d29a90f5f66" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 07:48:50 crc kubenswrapper[4715]: I1009 07:48:50.079929 4715 patch_prober.go:28] interesting pod/router-default-5444994796-b4lqp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 09 07:48:50 crc kubenswrapper[4715]: [+]has-synced ok Oct 09 07:48:50 crc kubenswrapper[4715]: [+]process-running ok Oct 09 07:48:50 crc kubenswrapper[4715]: healthz check failed Oct 09 07:48:50 crc kubenswrapper[4715]: I1009 07:48:50.080002 4715 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-b4lqp" podUID="18a93eea-f768-41ac-ae21-1d29a90f5f66" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 09 07:48:51 crc kubenswrapper[4715]: I1009 07:48:51.080283 4715 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-b4lqp" Oct 09 07:48:51 crc kubenswrapper[4715]: I1009 07:48:51.083618 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-b4lqp" Oct 09 07:48:51 crc kubenswrapper[4715]: I1009 07:48:51.622749 4715 patch_prober.go:28] interesting pod/console-f9d7485db-5fdhg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Oct 09 07:48:51 crc kubenswrapper[4715]: I1009 07:48:51.622833 4715 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-5fdhg" podUID="3c1c9983-60a8-4db2-866c-15deb7220cb9" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Oct 09 07:48:51 crc kubenswrapper[4715]: I1009 07:48:51.877975 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:48:52 crc kubenswrapper[4715]: I1009 07:48:52.268531 4715 patch_prober.go:28] interesting pod/downloads-7954f5f757-sq956 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Oct 09 07:48:52 crc kubenswrapper[4715]: I1009 07:48:52.268594 4715 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-sq956" podUID="b75a6e94-9a8f-4789-be04-be1dabfc37c7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Oct 09 07:48:52 crc kubenswrapper[4715]: I1009 07:48:52.268622 4715 patch_prober.go:28] interesting 
pod/downloads-7954f5f757-sq956 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Oct 09 07:48:52 crc kubenswrapper[4715]: I1009 07:48:52.268678 4715 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-sq956" podUID="b75a6e94-9a8f-4789-be04-be1dabfc37c7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Oct 09 07:48:53 crc kubenswrapper[4715]: I1009 07:48:53.502147 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a8fb3b8-b254-4bc3-b105-990eac79c77b-metrics-certs\") pod \"network-metrics-daemon-fm6s2\" (UID: \"9a8fb3b8-b254-4bc3-b105-990eac79c77b\") " pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:48:53 crc kubenswrapper[4715]: I1009 07:48:53.510959 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a8fb3b8-b254-4bc3-b105-990eac79c77b-metrics-certs\") pod \"network-metrics-daemon-fm6s2\" (UID: \"9a8fb3b8-b254-4bc3-b105-990eac79c77b\") " pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:48:53 crc kubenswrapper[4715]: I1009 07:48:53.758192 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fm6s2" Oct 09 07:48:54 crc kubenswrapper[4715]: I1009 07:48:54.075518 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 07:48:54 crc kubenswrapper[4715]: I1009 07:48:54.215153 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f9da3b25-553e-4eb7-83b3-f5a74f6a1320-kube-api-access\") pod \"f9da3b25-553e-4eb7-83b3-f5a74f6a1320\" (UID: \"f9da3b25-553e-4eb7-83b3-f5a74f6a1320\") " Oct 09 07:48:54 crc kubenswrapper[4715]: I1009 07:48:54.215338 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f9da3b25-553e-4eb7-83b3-f5a74f6a1320-kubelet-dir\") pod \"f9da3b25-553e-4eb7-83b3-f5a74f6a1320\" (UID: \"f9da3b25-553e-4eb7-83b3-f5a74f6a1320\") " Oct 09 07:48:54 crc kubenswrapper[4715]: I1009 07:48:54.215596 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f9da3b25-553e-4eb7-83b3-f5a74f6a1320-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f9da3b25-553e-4eb7-83b3-f5a74f6a1320" (UID: "f9da3b25-553e-4eb7-83b3-f5a74f6a1320"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 07:48:54 crc kubenswrapper[4715]: I1009 07:48:54.216847 4715 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f9da3b25-553e-4eb7-83b3-f5a74f6a1320-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 09 07:48:54 crc kubenswrapper[4715]: I1009 07:48:54.219132 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9da3b25-553e-4eb7-83b3-f5a74f6a1320-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f9da3b25-553e-4eb7-83b3-f5a74f6a1320" (UID: "f9da3b25-553e-4eb7-83b3-f5a74f6a1320"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:48:54 crc kubenswrapper[4715]: I1009 07:48:54.318089 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f9da3b25-553e-4eb7-83b3-f5a74f6a1320-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 09 07:48:54 crc kubenswrapper[4715]: I1009 07:48:54.722676 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"f9da3b25-553e-4eb7-83b3-f5a74f6a1320","Type":"ContainerDied","Data":"0755228f6eaea82c07bb8a7823c5f4111571139fafaf84a5876bb05f023ba5df"} Oct 09 07:48:54 crc kubenswrapper[4715]: I1009 07:48:54.722988 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0755228f6eaea82c07bb8a7823c5f4111571139fafaf84a5876bb05f023ba5df" Oct 09 07:48:54 crc kubenswrapper[4715]: I1009 07:48:54.723061 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 09 07:49:01 crc kubenswrapper[4715]: I1009 07:49:01.094166 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:49:01 crc kubenswrapper[4715]: I1009 07:49:01.627537 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-5fdhg" Oct 09 07:49:01 crc kubenswrapper[4715]: I1009 07:49:01.631447 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-5fdhg" Oct 09 07:49:02 crc kubenswrapper[4715]: I1009 07:49:02.272470 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-sq956" Oct 09 07:49:12 crc kubenswrapper[4715]: I1009 07:49:12.435967 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-sxdhp" Oct 09 07:49:16 crc kubenswrapper[4715]: I1009 07:49:16.753819 4715 patch_prober.go:28] interesting pod/machine-config-daemon-k7vwx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 07:49:16 crc kubenswrapper[4715]: I1009 07:49:16.754258 4715 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 07:49:19 crc kubenswrapper[4715]: I1009 07:49:19.377307 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 09 07:49:19 crc kubenswrapper[4715]: E1009 07:49:19.554696 4715 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 09 07:49:19 crc kubenswrapper[4715]: E1009 07:49:19.554927 4715 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vs9sm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-xqkhk_openshift-marketplace(4e13129f-063f-400f-b483-537273d66d74): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 09 07:49:19 crc kubenswrapper[4715]: E1009 07:49:19.556130 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-xqkhk" podUID="4e13129f-063f-400f-b483-537273d66d74" Oct 09 07:49:22 crc 
kubenswrapper[4715]: E1009 07:49:22.218742 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-xqkhk" podUID="4e13129f-063f-400f-b483-537273d66d74" Oct 09 07:49:26 crc kubenswrapper[4715]: E1009 07:49:26.251116 4715 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 09 07:49:26 crc kubenswrapper[4715]: E1009 07:49:26.251368 4715 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gdhr8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-glnfd_openshift-marketplace(86ea1b53-8025-4b37-9b58-a1a03ddbbfaa): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 09 07:49:26 crc kubenswrapper[4715]: E1009 07:49:26.252854 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-glnfd" podUID="86ea1b53-8025-4b37-9b58-a1a03ddbbfaa" Oct 09 07:49:43 crc 
kubenswrapper[4715]: E1009 07:49:43.833961 4715 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 09 07:49:43 crc kubenswrapper[4715]: E1009 07:49:43.834517 4715 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7wqvs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-vgjp2_openshift-marketplace(fcafd988-9f86-4c66-8fa7-ead624b101a0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 09 07:49:43 crc kubenswrapper[4715]: E1009 07:49:43.900733 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-vgjp2" podUID="fcafd988-9f86-4c66-8fa7-ead624b101a0" Oct 09 07:49:44 crc kubenswrapper[4715]: E1009 07:49:44.052099 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-vgjp2" podUID="fcafd988-9f86-4c66-8fa7-ead624b101a0" Oct 09 07:49:44 crc kubenswrapper[4715]: I1009 07:49:44.212640 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fm6s2"] Oct 09 07:49:44 crc kubenswrapper[4715]: W1009 07:49:44.218500 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a8fb3b8_b254_4bc3_b105_990eac79c77b.slice/crio-a501f9ab72425dfaabc1d43174292600681a2211436ddbbd80b31ddc68853d14 WatchSource:0}: Error finding container a501f9ab72425dfaabc1d43174292600681a2211436ddbbd80b31ddc68853d14: Status 404 returned error can't find the container with id a501f9ab72425dfaabc1d43174292600681a2211436ddbbd80b31ddc68853d14 Oct 09 07:49:45 crc kubenswrapper[4715]: I1009 07:49:45.056552 4715 generic.go:334] "Generic (PLEG): container finished" podID="8ba28cf2-dac1-47a3-9efe-727e793c7afd" containerID="85ced30e17aadf4d7bfbd0d2948c385586622483cbc166902b93f3374a478114" exitCode=0 
Oct 09 07:49:45 crc kubenswrapper[4715]: I1009 07:49:45.056691 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6kpjv" event={"ID":"8ba28cf2-dac1-47a3-9efe-727e793c7afd","Type":"ContainerDied","Data":"85ced30e17aadf4d7bfbd0d2948c385586622483cbc166902b93f3374a478114"} Oct 09 07:49:45 crc kubenswrapper[4715]: I1009 07:49:45.058351 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fm6s2" event={"ID":"9a8fb3b8-b254-4bc3-b105-990eac79c77b","Type":"ContainerStarted","Data":"a501f9ab72425dfaabc1d43174292600681a2211436ddbbd80b31ddc68853d14"} Oct 09 07:49:46 crc kubenswrapper[4715]: I1009 07:49:46.754028 4715 patch_prober.go:28] interesting pod/machine-config-daemon-k7vwx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 07:49:46 crc kubenswrapper[4715]: I1009 07:49:46.754179 4715 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 07:49:46 crc kubenswrapper[4715]: I1009 07:49:46.754263 4715 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" Oct 09 07:49:46 crc kubenswrapper[4715]: I1009 07:49:46.755269 4715 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4eab9be18db2c21136a797167f3282bba0639147e04085d9c930fe113cd5bc94"} pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Oct 09 07:49:46 crc kubenswrapper[4715]: I1009 07:49:46.755484 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" containerID="cri-o://4eab9be18db2c21136a797167f3282bba0639147e04085d9c930fe113cd5bc94" gracePeriod=600 Oct 09 07:49:46 crc kubenswrapper[4715]: E1009 07:49:46.992717 4715 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 09 07:49:46 crc kubenswrapper[4715]: E1009 07:49:46.993597 4715 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-968hh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-7f9jj_openshift-marketplace(fa4e27ff-70ea-4095-b26a-e787d60bf751): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 09 07:49:46 crc kubenswrapper[4715]: E1009 07:49:46.995105 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-7f9jj" podUID="fa4e27ff-70ea-4095-b26a-e787d60bf751" Oct 09 07:49:47 crc 
kubenswrapper[4715]: I1009 07:49:47.081208 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fm6s2" event={"ID":"9a8fb3b8-b254-4bc3-b105-990eac79c77b","Type":"ContainerStarted","Data":"ec84b3dbe83e0adbb1d04224cb95f06db417ac3d4a8d06d66b14661d59642ab6"} Oct 09 07:49:48 crc kubenswrapper[4715]: E1009 07:49:48.898415 4715 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 09 07:49:48 crc kubenswrapper[4715]: E1009 07:49:48.898719 4715 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qhv8c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil
,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-5njnt_openshift-marketplace(77b33330-3f7f-4bae-96c1-6558143109f2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 09 07:49:48 crc kubenswrapper[4715]: E1009 07:49:48.899965 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-5njnt" podUID="77b33330-3f7f-4bae-96c1-6558143109f2" Oct 09 07:49:48 crc kubenswrapper[4715]: E1009 07:49:48.916997 4715 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 09 07:49:48 crc kubenswrapper[4715]: E1009 07:49:48.917194 4715 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qvl7x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-7v8fg_openshift-marketplace(856c1393-8838-4544-9769-a1055c252169): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 09 07:49:48 crc kubenswrapper[4715]: E1009 07:49:48.919312 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-7v8fg" podUID="856c1393-8838-4544-9769-a1055c252169" Oct 09 07:49:49 crc 
kubenswrapper[4715]: I1009 07:49:49.095258 4715 generic.go:334] "Generic (PLEG): container finished" podID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerID="4eab9be18db2c21136a797167f3282bba0639147e04085d9c930fe113cd5bc94" exitCode=0 Oct 09 07:49:49 crc kubenswrapper[4715]: I1009 07:49:49.095345 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" event={"ID":"acafd807-8875-4b4f-aba9-4f807ca336e7","Type":"ContainerDied","Data":"4eab9be18db2c21136a797167f3282bba0639147e04085d9c930fe113cd5bc94"} Oct 09 07:49:53 crc kubenswrapper[4715]: E1009 07:49:53.122219 4715 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 09 07:49:53 crc kubenswrapper[4715]: E1009 07:49:53.123021 4715 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4zdzk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-76vrx_openshift-marketplace(aa605d89-538d-40f1-8aea-e397b0667ff9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 09 07:49:53 crc kubenswrapper[4715]: E1009 07:49:53.124290 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-76vrx" podUID="aa605d89-538d-40f1-8aea-e397b0667ff9" Oct 09 07:49:53 crc 
kubenswrapper[4715]: E1009 07:49:53.640976 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-5njnt" podUID="77b33330-3f7f-4bae-96c1-6558143109f2" Oct 09 07:49:53 crc kubenswrapper[4715]: E1009 07:49:53.641587 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-7v8fg" podUID="856c1393-8838-4544-9769-a1055c252169" Oct 09 07:49:53 crc kubenswrapper[4715]: E1009 07:49:53.642028 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-7f9jj" podUID="fa4e27ff-70ea-4095-b26a-e787d60bf751" Oct 09 07:49:55 crc kubenswrapper[4715]: E1009 07:49:55.833691 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-76vrx" podUID="aa605d89-538d-40f1-8aea-e397b0667ff9" Oct 09 07:49:57 crc kubenswrapper[4715]: I1009 07:49:57.151106 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6kpjv" event={"ID":"8ba28cf2-dac1-47a3-9efe-727e793c7afd","Type":"ContainerStarted","Data":"88ba48ad03f3927426b92e850bf8c38b6acfa5076feb46eb783cf7a6184b8b2f"} Oct 09 07:49:57 crc kubenswrapper[4715]: I1009 07:49:57.152827 4715 generic.go:334] "Generic (PLEG): container finished" 
podID="4e13129f-063f-400f-b483-537273d66d74" containerID="82de1add9382f5967971a4d58b3b56863a31d7aa54143cc80fb657399f530fa1" exitCode=0 Oct 09 07:49:57 crc kubenswrapper[4715]: I1009 07:49:57.152888 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xqkhk" event={"ID":"4e13129f-063f-400f-b483-537273d66d74","Type":"ContainerDied","Data":"82de1add9382f5967971a4d58b3b56863a31d7aa54143cc80fb657399f530fa1"} Oct 09 07:49:57 crc kubenswrapper[4715]: I1009 07:49:57.155889 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fm6s2" event={"ID":"9a8fb3b8-b254-4bc3-b105-990eac79c77b","Type":"ContainerStarted","Data":"56c68351e1f34daf9b5a81d4d50a4ca688511055b9a8dbf7968f25bf11fec633"} Oct 09 07:49:57 crc kubenswrapper[4715]: I1009 07:49:57.159734 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" event={"ID":"acafd807-8875-4b4f-aba9-4f807ca336e7","Type":"ContainerStarted","Data":"b010400ee7dba57a3343bec5cd3be68030f4519bc1714b489d54ec14a33cc803"} Oct 09 07:49:57 crc kubenswrapper[4715]: I1009 07:49:57.162446 4715 generic.go:334] "Generic (PLEG): container finished" podID="86ea1b53-8025-4b37-9b58-a1a03ddbbfaa" containerID="747718daf3d86635690916c18edb907fb9b3302601c63eaaf76b80636a6125ed" exitCode=0 Oct 09 07:49:57 crc kubenswrapper[4715]: I1009 07:49:57.162477 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-glnfd" event={"ID":"86ea1b53-8025-4b37-9b58-a1a03ddbbfaa","Type":"ContainerDied","Data":"747718daf3d86635690916c18edb907fb9b3302601c63eaaf76b80636a6125ed"} Oct 09 07:49:57 crc kubenswrapper[4715]: I1009 07:49:57.175477 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6kpjv" podStartSLOduration=2.4793182590000002 podStartE2EDuration="1m18.175459837s" podCreationTimestamp="2025-10-09 07:48:39 
+0000 UTC" firstStartedPulling="2025-10-09 07:48:40.419295082 +0000 UTC m=+151.112099090" lastFinishedPulling="2025-10-09 07:49:56.11543666 +0000 UTC m=+226.808240668" observedRunningTime="2025-10-09 07:49:57.173766366 +0000 UTC m=+227.866570394" watchObservedRunningTime="2025-10-09 07:49:57.175459837 +0000 UTC m=+227.868263845" Oct 09 07:49:57 crc kubenswrapper[4715]: I1009 07:49:57.201033 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-fm6s2" podStartSLOduration=206.201007982 podStartE2EDuration="3m26.201007982s" podCreationTimestamp="2025-10-09 07:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:49:57.195455954 +0000 UTC m=+227.888259972" watchObservedRunningTime="2025-10-09 07:49:57.201007982 +0000 UTC m=+227.893812000" Oct 09 07:49:59 crc kubenswrapper[4715]: I1009 07:49:59.662124 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6kpjv" Oct 09 07:49:59 crc kubenswrapper[4715]: I1009 07:49:59.663811 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6kpjv" Oct 09 07:50:01 crc kubenswrapper[4715]: I1009 07:50:01.061970 4715 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-6kpjv" podUID="8ba28cf2-dac1-47a3-9efe-727e793c7afd" containerName="registry-server" probeResult="failure" output=< Oct 09 07:50:01 crc kubenswrapper[4715]: timeout: failed to connect service ":50051" within 1s Oct 09 07:50:01 crc kubenswrapper[4715]: > Oct 09 07:50:05 crc kubenswrapper[4715]: I1009 07:50:05.216488 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-glnfd" 
event={"ID":"86ea1b53-8025-4b37-9b58-a1a03ddbbfaa","Type":"ContainerStarted","Data":"64beed5c5fec87b7da10a24a9eaad2e59ae678ac14abe98e1779d13aa0e8de09"} Oct 09 07:50:06 crc kubenswrapper[4715]: I1009 07:50:06.223980 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xqkhk" event={"ID":"4e13129f-063f-400f-b483-537273d66d74","Type":"ContainerStarted","Data":"8d15dd648f2957c53eb0f27a4aee79622f359088743a80716816ee672033c9a3"} Oct 09 07:50:06 crc kubenswrapper[4715]: I1009 07:50:06.225896 4715 generic.go:334] "Generic (PLEG): container finished" podID="fcafd988-9f86-4c66-8fa7-ead624b101a0" containerID="3c2a0dd3159768b16ad47f45543bf3ff2111c3881c561ca93d6f414ec3d85096" exitCode=0 Oct 09 07:50:06 crc kubenswrapper[4715]: I1009 07:50:06.226011 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vgjp2" event={"ID":"fcafd988-9f86-4c66-8fa7-ead624b101a0","Type":"ContainerDied","Data":"3c2a0dd3159768b16ad47f45543bf3ff2111c3881c561ca93d6f414ec3d85096"} Oct 09 07:50:06 crc kubenswrapper[4715]: I1009 07:50:06.251105 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xqkhk" podStartSLOduration=6.033337861 podStartE2EDuration="1m27.251080381s" podCreationTimestamp="2025-10-09 07:48:39 +0000 UTC" firstStartedPulling="2025-10-09 07:48:40.419535779 +0000 UTC m=+151.112339787" lastFinishedPulling="2025-10-09 07:50:01.637278299 +0000 UTC m=+232.330082307" observedRunningTime="2025-10-09 07:50:06.246797291 +0000 UTC m=+236.939601299" watchObservedRunningTime="2025-10-09 07:50:06.251080381 +0000 UTC m=+236.943884389" Oct 09 07:50:06 crc kubenswrapper[4715]: I1009 07:50:06.322586 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-glnfd" podStartSLOduration=7.295406912 podStartE2EDuration="1m24.322567449s" podCreationTimestamp="2025-10-09 07:48:42 +0000 UTC" 
firstStartedPulling="2025-10-09 07:48:44.612126032 +0000 UTC m=+155.304930040" lastFinishedPulling="2025-10-09 07:50:01.639286569 +0000 UTC m=+232.332090577" observedRunningTime="2025-10-09 07:50:06.317753053 +0000 UTC m=+237.010557061" watchObservedRunningTime="2025-10-09 07:50:06.322567449 +0000 UTC m=+237.015371457" Oct 09 07:50:09 crc kubenswrapper[4715]: I1009 07:50:09.465985 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xqkhk" Oct 09 07:50:09 crc kubenswrapper[4715]: I1009 07:50:09.466379 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xqkhk" Oct 09 07:50:09 crc kubenswrapper[4715]: I1009 07:50:09.915062 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6kpjv" Oct 09 07:50:09 crc kubenswrapper[4715]: I1009 07:50:09.916706 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xqkhk" Oct 09 07:50:09 crc kubenswrapper[4715]: I1009 07:50:09.967938 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6kpjv" Oct 09 07:50:10 crc kubenswrapper[4715]: I1009 07:50:10.292008 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xqkhk" Oct 09 07:50:13 crc kubenswrapper[4715]: I1009 07:50:13.023736 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-glnfd" Oct 09 07:50:13 crc kubenswrapper[4715]: I1009 07:50:13.024254 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-glnfd" Oct 09 07:50:13 crc kubenswrapper[4715]: I1009 07:50:13.069667 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-glnfd" Oct 09 07:50:13 crc kubenswrapper[4715]: I1009 07:50:13.316815 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-glnfd" Oct 09 07:50:15 crc kubenswrapper[4715]: I1009 07:50:15.280237 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vgjp2" event={"ID":"fcafd988-9f86-4c66-8fa7-ead624b101a0","Type":"ContainerStarted","Data":"083d3650e43b7aa974b486d20c238ed42790835ba3d96cab2f028693f8def3e8"} Oct 09 07:50:15 crc kubenswrapper[4715]: I1009 07:50:15.283513 4715 generic.go:334] "Generic (PLEG): container finished" podID="fa4e27ff-70ea-4095-b26a-e787d60bf751" containerID="70aad7d05991a3f0d0755418c97419a4bb46e4ecffe8c6a40a66816c1806b7e9" exitCode=0 Oct 09 07:50:15 crc kubenswrapper[4715]: I1009 07:50:15.283625 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7f9jj" event={"ID":"fa4e27ff-70ea-4095-b26a-e787d60bf751","Type":"ContainerDied","Data":"70aad7d05991a3f0d0755418c97419a4bb46e4ecffe8c6a40a66816c1806b7e9"} Oct 09 07:50:15 crc kubenswrapper[4715]: I1009 07:50:15.287900 4715 generic.go:334] "Generic (PLEG): container finished" podID="856c1393-8838-4544-9769-a1055c252169" containerID="2834732226a40c99e19f1df8abdf840c6bbfced2d34d1ee37d8891c1afd35207" exitCode=0 Oct 09 07:50:15 crc kubenswrapper[4715]: I1009 07:50:15.287995 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7v8fg" event={"ID":"856c1393-8838-4544-9769-a1055c252169","Type":"ContainerDied","Data":"2834732226a40c99e19f1df8abdf840c6bbfced2d34d1ee37d8891c1afd35207"} Oct 09 07:50:15 crc kubenswrapper[4715]: I1009 07:50:15.294090 4715 generic.go:334] "Generic (PLEG): container finished" podID="77b33330-3f7f-4bae-96c1-6558143109f2" containerID="dce4bd73a6b71b68dfe32b20d46ef06afc024e40d582aa447f26c7d21c37af2c" exitCode=0 Oct 09 07:50:15 crc 
kubenswrapper[4715]: I1009 07:50:15.294164 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5njnt" event={"ID":"77b33330-3f7f-4bae-96c1-6558143109f2","Type":"ContainerDied","Data":"dce4bd73a6b71b68dfe32b20d46ef06afc024e40d582aa447f26c7d21c37af2c"} Oct 09 07:50:15 crc kubenswrapper[4715]: I1009 07:50:15.312745 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vgjp2" podStartSLOduration=2.422562048 podStartE2EDuration="1m34.3127226s" podCreationTimestamp="2025-10-09 07:48:41 +0000 UTC" firstStartedPulling="2025-10-09 07:48:42.49570631 +0000 UTC m=+153.188510328" lastFinishedPulling="2025-10-09 07:50:14.385866842 +0000 UTC m=+245.078670880" observedRunningTime="2025-10-09 07:50:15.307384258 +0000 UTC m=+246.000188286" watchObservedRunningTime="2025-10-09 07:50:15.3127226 +0000 UTC m=+246.005526608" Oct 09 07:50:15 crc kubenswrapper[4715]: I1009 07:50:15.377367 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-glnfd"] Oct 09 07:50:15 crc kubenswrapper[4715]: I1009 07:50:15.377878 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-glnfd" podUID="86ea1b53-8025-4b37-9b58-a1a03ddbbfaa" containerName="registry-server" containerID="cri-o://64beed5c5fec87b7da10a24a9eaad2e59ae678ac14abe98e1779d13aa0e8de09" gracePeriod=2 Oct 09 07:50:15 crc kubenswrapper[4715]: I1009 07:50:15.771197 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-glnfd" Oct 09 07:50:15 crc kubenswrapper[4715]: I1009 07:50:15.889746 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86ea1b53-8025-4b37-9b58-a1a03ddbbfaa-catalog-content\") pod \"86ea1b53-8025-4b37-9b58-a1a03ddbbfaa\" (UID: \"86ea1b53-8025-4b37-9b58-a1a03ddbbfaa\") " Oct 09 07:50:15 crc kubenswrapper[4715]: I1009 07:50:15.889804 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdhr8\" (UniqueName: \"kubernetes.io/projected/86ea1b53-8025-4b37-9b58-a1a03ddbbfaa-kube-api-access-gdhr8\") pod \"86ea1b53-8025-4b37-9b58-a1a03ddbbfaa\" (UID: \"86ea1b53-8025-4b37-9b58-a1a03ddbbfaa\") " Oct 09 07:50:15 crc kubenswrapper[4715]: I1009 07:50:15.889869 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86ea1b53-8025-4b37-9b58-a1a03ddbbfaa-utilities\") pod \"86ea1b53-8025-4b37-9b58-a1a03ddbbfaa\" (UID: \"86ea1b53-8025-4b37-9b58-a1a03ddbbfaa\") " Oct 09 07:50:15 crc kubenswrapper[4715]: I1009 07:50:15.891915 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86ea1b53-8025-4b37-9b58-a1a03ddbbfaa-utilities" (OuterVolumeSpecName: "utilities") pod "86ea1b53-8025-4b37-9b58-a1a03ddbbfaa" (UID: "86ea1b53-8025-4b37-9b58-a1a03ddbbfaa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 07:50:15 crc kubenswrapper[4715]: I1009 07:50:15.898125 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86ea1b53-8025-4b37-9b58-a1a03ddbbfaa-kube-api-access-gdhr8" (OuterVolumeSpecName: "kube-api-access-gdhr8") pod "86ea1b53-8025-4b37-9b58-a1a03ddbbfaa" (UID: "86ea1b53-8025-4b37-9b58-a1a03ddbbfaa"). InnerVolumeSpecName "kube-api-access-gdhr8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:50:15 crc kubenswrapper[4715]: I1009 07:50:15.989176 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86ea1b53-8025-4b37-9b58-a1a03ddbbfaa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "86ea1b53-8025-4b37-9b58-a1a03ddbbfaa" (UID: "86ea1b53-8025-4b37-9b58-a1a03ddbbfaa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 07:50:15 crc kubenswrapper[4715]: I1009 07:50:15.991719 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdhr8\" (UniqueName: \"kubernetes.io/projected/86ea1b53-8025-4b37-9b58-a1a03ddbbfaa-kube-api-access-gdhr8\") on node \"crc\" DevicePath \"\"" Oct 09 07:50:15 crc kubenswrapper[4715]: I1009 07:50:15.991750 4715 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86ea1b53-8025-4b37-9b58-a1a03ddbbfaa-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 07:50:15 crc kubenswrapper[4715]: I1009 07:50:15.991763 4715 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86ea1b53-8025-4b37-9b58-a1a03ddbbfaa-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 07:50:16 crc kubenswrapper[4715]: I1009 07:50:16.300331 4715 generic.go:334] "Generic (PLEG): container finished" podID="aa605d89-538d-40f1-8aea-e397b0667ff9" containerID="07e0edca48e6107bb47b8fb440e9999f0be140ef9c9ccbff3916e3bb9852f3dd" exitCode=0 Oct 09 07:50:16 crc kubenswrapper[4715]: I1009 07:50:16.300413 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-76vrx" event={"ID":"aa605d89-538d-40f1-8aea-e397b0667ff9","Type":"ContainerDied","Data":"07e0edca48e6107bb47b8fb440e9999f0be140ef9c9ccbff3916e3bb9852f3dd"} Oct 09 07:50:16 crc kubenswrapper[4715]: I1009 07:50:16.302469 4715 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-5njnt" event={"ID":"77b33330-3f7f-4bae-96c1-6558143109f2","Type":"ContainerStarted","Data":"35b065ce82f1810b009f356ebe77a8c2532fc494e9912891bfae08163e3a0d5c"} Oct 09 07:50:16 crc kubenswrapper[4715]: I1009 07:50:16.316252 4715 generic.go:334] "Generic (PLEG): container finished" podID="86ea1b53-8025-4b37-9b58-a1a03ddbbfaa" containerID="64beed5c5fec87b7da10a24a9eaad2e59ae678ac14abe98e1779d13aa0e8de09" exitCode=0 Oct 09 07:50:16 crc kubenswrapper[4715]: I1009 07:50:16.316574 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-glnfd" Oct 09 07:50:16 crc kubenswrapper[4715]: I1009 07:50:16.318461 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-glnfd" event={"ID":"86ea1b53-8025-4b37-9b58-a1a03ddbbfaa","Type":"ContainerDied","Data":"64beed5c5fec87b7da10a24a9eaad2e59ae678ac14abe98e1779d13aa0e8de09"} Oct 09 07:50:16 crc kubenswrapper[4715]: I1009 07:50:16.318518 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-glnfd" event={"ID":"86ea1b53-8025-4b37-9b58-a1a03ddbbfaa","Type":"ContainerDied","Data":"34afeb0176671510fd6128d76d2866aac25541405a95a15469f24f63469c4008"} Oct 09 07:50:16 crc kubenswrapper[4715]: I1009 07:50:16.318560 4715 scope.go:117] "RemoveContainer" containerID="64beed5c5fec87b7da10a24a9eaad2e59ae678ac14abe98e1779d13aa0e8de09" Oct 09 07:50:16 crc kubenswrapper[4715]: I1009 07:50:16.319094 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7f9jj" event={"ID":"fa4e27ff-70ea-4095-b26a-e787d60bf751","Type":"ContainerStarted","Data":"38c1e184369902f83222a66403c7201080ae16da200c4d035149f5811c1a7408"} Oct 09 07:50:16 crc kubenswrapper[4715]: I1009 07:50:16.321827 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7v8fg" 
event={"ID":"856c1393-8838-4544-9769-a1055c252169","Type":"ContainerStarted","Data":"af63b9d3208d9abc29204f468248de977ebd1e714906fa0f73e11c5356c3573d"} Oct 09 07:50:16 crc kubenswrapper[4715]: I1009 07:50:16.338272 4715 scope.go:117] "RemoveContainer" containerID="747718daf3d86635690916c18edb907fb9b3302601c63eaaf76b80636a6125ed" Oct 09 07:50:16 crc kubenswrapper[4715]: I1009 07:50:16.353620 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7v8fg" podStartSLOduration=2.993609859 podStartE2EDuration="1m37.353573855s" podCreationTimestamp="2025-10-09 07:48:39 +0000 UTC" firstStartedPulling="2025-10-09 07:48:41.466363524 +0000 UTC m=+152.159167532" lastFinishedPulling="2025-10-09 07:50:15.82632752 +0000 UTC m=+246.519131528" observedRunningTime="2025-10-09 07:50:16.348200992 +0000 UTC m=+247.041005000" watchObservedRunningTime="2025-10-09 07:50:16.353573855 +0000 UTC m=+247.046377863" Oct 09 07:50:16 crc kubenswrapper[4715]: I1009 07:50:16.360399 4715 scope.go:117] "RemoveContainer" containerID="425999451b01930dbf828a7a7070710b154c47fa846e019540201f744489ef64" Oct 09 07:50:16 crc kubenswrapper[4715]: I1009 07:50:16.370308 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7f9jj" podStartSLOduration=3.108519753 podStartE2EDuration="1m37.370285402s" podCreationTimestamp="2025-10-09 07:48:39 +0000 UTC" firstStartedPulling="2025-10-09 07:48:41.459866785 +0000 UTC m=+152.152670793" lastFinishedPulling="2025-10-09 07:50:15.721632434 +0000 UTC m=+246.414436442" observedRunningTime="2025-10-09 07:50:16.367538009 +0000 UTC m=+247.060342017" watchObservedRunningTime="2025-10-09 07:50:16.370285402 +0000 UTC m=+247.063089410" Oct 09 07:50:16 crc kubenswrapper[4715]: I1009 07:50:16.383292 4715 scope.go:117] "RemoveContainer" containerID="64beed5c5fec87b7da10a24a9eaad2e59ae678ac14abe98e1779d13aa0e8de09" Oct 09 07:50:16 crc kubenswrapper[4715]: 
E1009 07:50:16.386371 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64beed5c5fec87b7da10a24a9eaad2e59ae678ac14abe98e1779d13aa0e8de09\": container with ID starting with 64beed5c5fec87b7da10a24a9eaad2e59ae678ac14abe98e1779d13aa0e8de09 not found: ID does not exist" containerID="64beed5c5fec87b7da10a24a9eaad2e59ae678ac14abe98e1779d13aa0e8de09" Oct 09 07:50:16 crc kubenswrapper[4715]: I1009 07:50:16.386450 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64beed5c5fec87b7da10a24a9eaad2e59ae678ac14abe98e1779d13aa0e8de09"} err="failed to get container status \"64beed5c5fec87b7da10a24a9eaad2e59ae678ac14abe98e1779d13aa0e8de09\": rpc error: code = NotFound desc = could not find container \"64beed5c5fec87b7da10a24a9eaad2e59ae678ac14abe98e1779d13aa0e8de09\": container with ID starting with 64beed5c5fec87b7da10a24a9eaad2e59ae678ac14abe98e1779d13aa0e8de09 not found: ID does not exist" Oct 09 07:50:16 crc kubenswrapper[4715]: I1009 07:50:16.386491 4715 scope.go:117] "RemoveContainer" containerID="747718daf3d86635690916c18edb907fb9b3302601c63eaaf76b80636a6125ed" Oct 09 07:50:16 crc kubenswrapper[4715]: E1009 07:50:16.388986 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"747718daf3d86635690916c18edb907fb9b3302601c63eaaf76b80636a6125ed\": container with ID starting with 747718daf3d86635690916c18edb907fb9b3302601c63eaaf76b80636a6125ed not found: ID does not exist" containerID="747718daf3d86635690916c18edb907fb9b3302601c63eaaf76b80636a6125ed" Oct 09 07:50:16 crc kubenswrapper[4715]: I1009 07:50:16.389119 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"747718daf3d86635690916c18edb907fb9b3302601c63eaaf76b80636a6125ed"} err="failed to get container status \"747718daf3d86635690916c18edb907fb9b3302601c63eaaf76b80636a6125ed\": 
rpc error: code = NotFound desc = could not find container \"747718daf3d86635690916c18edb907fb9b3302601c63eaaf76b80636a6125ed\": container with ID starting with 747718daf3d86635690916c18edb907fb9b3302601c63eaaf76b80636a6125ed not found: ID does not exist" Oct 09 07:50:16 crc kubenswrapper[4715]: I1009 07:50:16.389215 4715 scope.go:117] "RemoveContainer" containerID="425999451b01930dbf828a7a7070710b154c47fa846e019540201f744489ef64" Oct 09 07:50:16 crc kubenswrapper[4715]: E1009 07:50:16.389723 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"425999451b01930dbf828a7a7070710b154c47fa846e019540201f744489ef64\": container with ID starting with 425999451b01930dbf828a7a7070710b154c47fa846e019540201f744489ef64 not found: ID does not exist" containerID="425999451b01930dbf828a7a7070710b154c47fa846e019540201f744489ef64" Oct 09 07:50:16 crc kubenswrapper[4715]: I1009 07:50:16.389775 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"425999451b01930dbf828a7a7070710b154c47fa846e019540201f744489ef64"} err="failed to get container status \"425999451b01930dbf828a7a7070710b154c47fa846e019540201f744489ef64\": rpc error: code = NotFound desc = could not find container \"425999451b01930dbf828a7a7070710b154c47fa846e019540201f744489ef64\": container with ID starting with 425999451b01930dbf828a7a7070710b154c47fa846e019540201f744489ef64 not found: ID does not exist" Oct 09 07:50:16 crc kubenswrapper[4715]: I1009 07:50:16.404211 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5njnt" podStartSLOduration=2.218773555 podStartE2EDuration="1m34.404139589s" podCreationTimestamp="2025-10-09 07:48:42 +0000 UTC" firstStartedPulling="2025-10-09 07:48:43.54613819 +0000 UTC m=+154.238942198" lastFinishedPulling="2025-10-09 07:50:15.731504224 +0000 UTC m=+246.424308232" observedRunningTime="2025-10-09 
07:50:16.389310789 +0000 UTC m=+247.082114837" watchObservedRunningTime="2025-10-09 07:50:16.404139589 +0000 UTC m=+247.096943597" Oct 09 07:50:16 crc kubenswrapper[4715]: I1009 07:50:16.407777 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-glnfd"] Oct 09 07:50:16 crc kubenswrapper[4715]: I1009 07:50:16.412010 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-glnfd"] Oct 09 07:50:17 crc kubenswrapper[4715]: I1009 07:50:17.330315 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-76vrx" event={"ID":"aa605d89-538d-40f1-8aea-e397b0667ff9","Type":"ContainerStarted","Data":"8ae12f04584d818ecc24243b04998e63035a3c11a9a6bb8de62319a2db906ae7"} Oct 09 07:50:17 crc kubenswrapper[4715]: I1009 07:50:17.349769 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-76vrx" podStartSLOduration=3.183626007 podStartE2EDuration="1m36.349748874s" podCreationTimestamp="2025-10-09 07:48:41 +0000 UTC" firstStartedPulling="2025-10-09 07:48:43.53826504 +0000 UTC m=+154.231069048" lastFinishedPulling="2025-10-09 07:50:16.704387907 +0000 UTC m=+247.397191915" observedRunningTime="2025-10-09 07:50:17.3486404 +0000 UTC m=+248.041444408" watchObservedRunningTime="2025-10-09 07:50:17.349748874 +0000 UTC m=+248.042552882" Oct 09 07:50:18 crc kubenswrapper[4715]: I1009 07:50:18.156930 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86ea1b53-8025-4b37-9b58-a1a03ddbbfaa" path="/var/lib/kubelet/pods/86ea1b53-8025-4b37-9b58-a1a03ddbbfaa/volumes" Oct 09 07:50:19 crc kubenswrapper[4715]: I1009 07:50:19.831390 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7v8fg" Oct 09 07:50:19 crc kubenswrapper[4715]: I1009 07:50:19.831778 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-7v8fg" Oct 09 07:50:19 crc kubenswrapper[4715]: I1009 07:50:19.899958 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7v8fg" Oct 09 07:50:20 crc kubenswrapper[4715]: I1009 07:50:20.063525 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7f9jj" Oct 09 07:50:20 crc kubenswrapper[4715]: I1009 07:50:20.063682 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7f9jj" Oct 09 07:50:20 crc kubenswrapper[4715]: I1009 07:50:20.117099 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7f9jj" Oct 09 07:50:20 crc kubenswrapper[4715]: I1009 07:50:20.429808 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7v8fg" Oct 09 07:50:20 crc kubenswrapper[4715]: I1009 07:50:20.442687 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7f9jj" Oct 09 07:50:21 crc kubenswrapper[4715]: I1009 07:50:21.623483 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vgjp2" Oct 09 07:50:21 crc kubenswrapper[4715]: I1009 07:50:21.623846 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vgjp2" Oct 09 07:50:21 crc kubenswrapper[4715]: I1009 07:50:21.687432 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vgjp2" Oct 09 07:50:21 crc kubenswrapper[4715]: I1009 07:50:21.774517 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7v8fg"] Oct 09 07:50:22 crc kubenswrapper[4715]: I1009 07:50:22.024662 
4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-76vrx" Oct 09 07:50:22 crc kubenswrapper[4715]: I1009 07:50:22.025073 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-76vrx" Oct 09 07:50:22 crc kubenswrapper[4715]: I1009 07:50:22.075047 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-76vrx" Oct 09 07:50:22 crc kubenswrapper[4715]: I1009 07:50:22.440009 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-76vrx" Oct 09 07:50:22 crc kubenswrapper[4715]: I1009 07:50:22.442925 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vgjp2" Oct 09 07:50:22 crc kubenswrapper[4715]: I1009 07:50:22.647774 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5njnt" Oct 09 07:50:22 crc kubenswrapper[4715]: I1009 07:50:22.647818 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5njnt" Oct 09 07:50:22 crc kubenswrapper[4715]: I1009 07:50:22.713192 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5njnt" Oct 09 07:50:23 crc kubenswrapper[4715]: I1009 07:50:23.399332 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7v8fg" podUID="856c1393-8838-4544-9769-a1055c252169" containerName="registry-server" containerID="cri-o://af63b9d3208d9abc29204f468248de977ebd1e714906fa0f73e11c5356c3573d" gracePeriod=2 Oct 09 07:50:23 crc kubenswrapper[4715]: I1009 07:50:23.458126 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-5njnt" Oct 09 07:50:23 crc kubenswrapper[4715]: I1009 07:50:23.582317 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7f9jj"] Oct 09 07:50:23 crc kubenswrapper[4715]: I1009 07:50:23.582977 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7f9jj" podUID="fa4e27ff-70ea-4095-b26a-e787d60bf751" containerName="registry-server" containerID="cri-o://38c1e184369902f83222a66403c7201080ae16da200c4d035149f5811c1a7408" gracePeriod=2 Oct 09 07:50:23 crc kubenswrapper[4715]: I1009 07:50:23.970140 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7f9jj" Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.134382 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-968hh\" (UniqueName: \"kubernetes.io/projected/fa4e27ff-70ea-4095-b26a-e787d60bf751-kube-api-access-968hh\") pod \"fa4e27ff-70ea-4095-b26a-e787d60bf751\" (UID: \"fa4e27ff-70ea-4095-b26a-e787d60bf751\") " Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.134529 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa4e27ff-70ea-4095-b26a-e787d60bf751-catalog-content\") pod \"fa4e27ff-70ea-4095-b26a-e787d60bf751\" (UID: \"fa4e27ff-70ea-4095-b26a-e787d60bf751\") " Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.134572 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa4e27ff-70ea-4095-b26a-e787d60bf751-utilities\") pod \"fa4e27ff-70ea-4095-b26a-e787d60bf751\" (UID: \"fa4e27ff-70ea-4095-b26a-e787d60bf751\") " Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.135570 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/fa4e27ff-70ea-4095-b26a-e787d60bf751-utilities" (OuterVolumeSpecName: "utilities") pod "fa4e27ff-70ea-4095-b26a-e787d60bf751" (UID: "fa4e27ff-70ea-4095-b26a-e787d60bf751"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.142717 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa4e27ff-70ea-4095-b26a-e787d60bf751-kube-api-access-968hh" (OuterVolumeSpecName: "kube-api-access-968hh") pod "fa4e27ff-70ea-4095-b26a-e787d60bf751" (UID: "fa4e27ff-70ea-4095-b26a-e787d60bf751"). InnerVolumeSpecName "kube-api-access-968hh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.173793 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-76vrx"] Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.208879 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa4e27ff-70ea-4095-b26a-e787d60bf751-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa4e27ff-70ea-4095-b26a-e787d60bf751" (UID: "fa4e27ff-70ea-4095-b26a-e787d60bf751"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.236298 4715 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa4e27ff-70ea-4095-b26a-e787d60bf751-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.236661 4715 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa4e27ff-70ea-4095-b26a-e787d60bf751-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.236672 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-968hh\" (UniqueName: \"kubernetes.io/projected/fa4e27ff-70ea-4095-b26a-e787d60bf751-kube-api-access-968hh\") on node \"crc\" DevicePath \"\"" Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.307592 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7v8fg" Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.404853 4715 generic.go:334] "Generic (PLEG): container finished" podID="856c1393-8838-4544-9769-a1055c252169" containerID="af63b9d3208d9abc29204f468248de977ebd1e714906fa0f73e11c5356c3573d" exitCode=0 Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.404932 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7v8fg" event={"ID":"856c1393-8838-4544-9769-a1055c252169","Type":"ContainerDied","Data":"af63b9d3208d9abc29204f468248de977ebd1e714906fa0f73e11c5356c3573d"} Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.404937 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7v8fg" Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.404965 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7v8fg" event={"ID":"856c1393-8838-4544-9769-a1055c252169","Type":"ContainerDied","Data":"37b2512def9399ffa9b26fe6d7d92f71f5a7c33ae649ebf59ce4e28389995932"} Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.405009 4715 scope.go:117] "RemoveContainer" containerID="af63b9d3208d9abc29204f468248de977ebd1e714906fa0f73e11c5356c3573d" Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.408730 4715 generic.go:334] "Generic (PLEG): container finished" podID="fa4e27ff-70ea-4095-b26a-e787d60bf751" containerID="38c1e184369902f83222a66403c7201080ae16da200c4d035149f5811c1a7408" exitCode=0 Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.408791 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7f9jj" Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.408806 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7f9jj" event={"ID":"fa4e27ff-70ea-4095-b26a-e787d60bf751","Type":"ContainerDied","Data":"38c1e184369902f83222a66403c7201080ae16da200c4d035149f5811c1a7408"} Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.408865 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7f9jj" event={"ID":"fa4e27ff-70ea-4095-b26a-e787d60bf751","Type":"ContainerDied","Data":"90eb69fe03b364fa70cf270f631500412f60824b10660aa32e847ba1a29a5bce"} Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.409544 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-76vrx" podUID="aa605d89-538d-40f1-8aea-e397b0667ff9" containerName="registry-server" 
containerID="cri-o://8ae12f04584d818ecc24243b04998e63035a3c11a9a6bb8de62319a2db906ae7" gracePeriod=2 Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.422589 4715 scope.go:117] "RemoveContainer" containerID="2834732226a40c99e19f1df8abdf840c6bbfced2d34d1ee37d8891c1afd35207" Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.439128 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/856c1393-8838-4544-9769-a1055c252169-catalog-content\") pod \"856c1393-8838-4544-9769-a1055c252169\" (UID: \"856c1393-8838-4544-9769-a1055c252169\") " Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.439198 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/856c1393-8838-4544-9769-a1055c252169-utilities\") pod \"856c1393-8838-4544-9769-a1055c252169\" (UID: \"856c1393-8838-4544-9769-a1055c252169\") " Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.439334 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvl7x\" (UniqueName: \"kubernetes.io/projected/856c1393-8838-4544-9769-a1055c252169-kube-api-access-qvl7x\") pod \"856c1393-8838-4544-9769-a1055c252169\" (UID: \"856c1393-8838-4544-9769-a1055c252169\") " Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.441508 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7f9jj"] Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.444590 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/856c1393-8838-4544-9769-a1055c252169-utilities" (OuterVolumeSpecName: "utilities") pod "856c1393-8838-4544-9769-a1055c252169" (UID: "856c1393-8838-4544-9769-a1055c252169"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.445802 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7f9jj"] Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.460935 4715 scope.go:117] "RemoveContainer" containerID="bc18cd608efd241915200069b1f390a8a628de5e97bbb807e06d3bfeb76f934b" Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.467545 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/856c1393-8838-4544-9769-a1055c252169-kube-api-access-qvl7x" (OuterVolumeSpecName: "kube-api-access-qvl7x") pod "856c1393-8838-4544-9769-a1055c252169" (UID: "856c1393-8838-4544-9769-a1055c252169"). InnerVolumeSpecName "kube-api-access-qvl7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.474554 4715 scope.go:117] "RemoveContainer" containerID="af63b9d3208d9abc29204f468248de977ebd1e714906fa0f73e11c5356c3573d" Oct 09 07:50:24 crc kubenswrapper[4715]: E1009 07:50:24.474893 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af63b9d3208d9abc29204f468248de977ebd1e714906fa0f73e11c5356c3573d\": container with ID starting with af63b9d3208d9abc29204f468248de977ebd1e714906fa0f73e11c5356c3573d not found: ID does not exist" containerID="af63b9d3208d9abc29204f468248de977ebd1e714906fa0f73e11c5356c3573d" Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.474928 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af63b9d3208d9abc29204f468248de977ebd1e714906fa0f73e11c5356c3573d"} err="failed to get container status \"af63b9d3208d9abc29204f468248de977ebd1e714906fa0f73e11c5356c3573d\": rpc error: code = NotFound desc = could not find container \"af63b9d3208d9abc29204f468248de977ebd1e714906fa0f73e11c5356c3573d\": container with ID 
starting with af63b9d3208d9abc29204f468248de977ebd1e714906fa0f73e11c5356c3573d not found: ID does not exist" Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.474955 4715 scope.go:117] "RemoveContainer" containerID="2834732226a40c99e19f1df8abdf840c6bbfced2d34d1ee37d8891c1afd35207" Oct 09 07:50:24 crc kubenswrapper[4715]: E1009 07:50:24.475342 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2834732226a40c99e19f1df8abdf840c6bbfced2d34d1ee37d8891c1afd35207\": container with ID starting with 2834732226a40c99e19f1df8abdf840c6bbfced2d34d1ee37d8891c1afd35207 not found: ID does not exist" containerID="2834732226a40c99e19f1df8abdf840c6bbfced2d34d1ee37d8891c1afd35207" Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.475364 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2834732226a40c99e19f1df8abdf840c6bbfced2d34d1ee37d8891c1afd35207"} err="failed to get container status \"2834732226a40c99e19f1df8abdf840c6bbfced2d34d1ee37d8891c1afd35207\": rpc error: code = NotFound desc = could not find container \"2834732226a40c99e19f1df8abdf840c6bbfced2d34d1ee37d8891c1afd35207\": container with ID starting with 2834732226a40c99e19f1df8abdf840c6bbfced2d34d1ee37d8891c1afd35207 not found: ID does not exist" Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.475379 4715 scope.go:117] "RemoveContainer" containerID="bc18cd608efd241915200069b1f390a8a628de5e97bbb807e06d3bfeb76f934b" Oct 09 07:50:24 crc kubenswrapper[4715]: E1009 07:50:24.475714 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc18cd608efd241915200069b1f390a8a628de5e97bbb807e06d3bfeb76f934b\": container with ID starting with bc18cd608efd241915200069b1f390a8a628de5e97bbb807e06d3bfeb76f934b not found: ID does not exist" containerID="bc18cd608efd241915200069b1f390a8a628de5e97bbb807e06d3bfeb76f934b" Oct 09 
07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.475737 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc18cd608efd241915200069b1f390a8a628de5e97bbb807e06d3bfeb76f934b"} err="failed to get container status \"bc18cd608efd241915200069b1f390a8a628de5e97bbb807e06d3bfeb76f934b\": rpc error: code = NotFound desc = could not find container \"bc18cd608efd241915200069b1f390a8a628de5e97bbb807e06d3bfeb76f934b\": container with ID starting with bc18cd608efd241915200069b1f390a8a628de5e97bbb807e06d3bfeb76f934b not found: ID does not exist" Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.475753 4715 scope.go:117] "RemoveContainer" containerID="38c1e184369902f83222a66403c7201080ae16da200c4d035149f5811c1a7408" Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.490224 4715 scope.go:117] "RemoveContainer" containerID="70aad7d05991a3f0d0755418c97419a4bb46e4ecffe8c6a40a66816c1806b7e9" Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.490340 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/856c1393-8838-4544-9769-a1055c252169-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "856c1393-8838-4544-9769-a1055c252169" (UID: "856c1393-8838-4544-9769-a1055c252169"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.511239 4715 scope.go:117] "RemoveContainer" containerID="30b37245d901143b13f79d14c73f6d653a1450d86730a4369a8e78d497021cd0" Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.541501 4715 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/856c1393-8838-4544-9769-a1055c252169-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.541527 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvl7x\" (UniqueName: \"kubernetes.io/projected/856c1393-8838-4544-9769-a1055c252169-kube-api-access-qvl7x\") on node \"crc\" DevicePath \"\"" Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.541539 4715 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/856c1393-8838-4544-9769-a1055c252169-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.542714 4715 scope.go:117] "RemoveContainer" containerID="38c1e184369902f83222a66403c7201080ae16da200c4d035149f5811c1a7408" Oct 09 07:50:24 crc kubenswrapper[4715]: E1009 07:50:24.543043 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38c1e184369902f83222a66403c7201080ae16da200c4d035149f5811c1a7408\": container with ID starting with 38c1e184369902f83222a66403c7201080ae16da200c4d035149f5811c1a7408 not found: ID does not exist" containerID="38c1e184369902f83222a66403c7201080ae16da200c4d035149f5811c1a7408" Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.543071 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38c1e184369902f83222a66403c7201080ae16da200c4d035149f5811c1a7408"} err="failed to get container status 
\"38c1e184369902f83222a66403c7201080ae16da200c4d035149f5811c1a7408\": rpc error: code = NotFound desc = could not find container \"38c1e184369902f83222a66403c7201080ae16da200c4d035149f5811c1a7408\": container with ID starting with 38c1e184369902f83222a66403c7201080ae16da200c4d035149f5811c1a7408 not found: ID does not exist" Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.543092 4715 scope.go:117] "RemoveContainer" containerID="70aad7d05991a3f0d0755418c97419a4bb46e4ecffe8c6a40a66816c1806b7e9" Oct 09 07:50:24 crc kubenswrapper[4715]: E1009 07:50:24.543303 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70aad7d05991a3f0d0755418c97419a4bb46e4ecffe8c6a40a66816c1806b7e9\": container with ID starting with 70aad7d05991a3f0d0755418c97419a4bb46e4ecffe8c6a40a66816c1806b7e9 not found: ID does not exist" containerID="70aad7d05991a3f0d0755418c97419a4bb46e4ecffe8c6a40a66816c1806b7e9" Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.543324 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70aad7d05991a3f0d0755418c97419a4bb46e4ecffe8c6a40a66816c1806b7e9"} err="failed to get container status \"70aad7d05991a3f0d0755418c97419a4bb46e4ecffe8c6a40a66816c1806b7e9\": rpc error: code = NotFound desc = could not find container \"70aad7d05991a3f0d0755418c97419a4bb46e4ecffe8c6a40a66816c1806b7e9\": container with ID starting with 70aad7d05991a3f0d0755418c97419a4bb46e4ecffe8c6a40a66816c1806b7e9 not found: ID does not exist" Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.543340 4715 scope.go:117] "RemoveContainer" containerID="30b37245d901143b13f79d14c73f6d653a1450d86730a4369a8e78d497021cd0" Oct 09 07:50:24 crc kubenswrapper[4715]: E1009 07:50:24.543605 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"30b37245d901143b13f79d14c73f6d653a1450d86730a4369a8e78d497021cd0\": container with ID starting with 30b37245d901143b13f79d14c73f6d653a1450d86730a4369a8e78d497021cd0 not found: ID does not exist" containerID="30b37245d901143b13f79d14c73f6d653a1450d86730a4369a8e78d497021cd0" Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.543628 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30b37245d901143b13f79d14c73f6d653a1450d86730a4369a8e78d497021cd0"} err="failed to get container status \"30b37245d901143b13f79d14c73f6d653a1450d86730a4369a8e78d497021cd0\": rpc error: code = NotFound desc = could not find container \"30b37245d901143b13f79d14c73f6d653a1450d86730a4369a8e78d497021cd0\": container with ID starting with 30b37245d901143b13f79d14c73f6d653a1450d86730a4369a8e78d497021cd0 not found: ID does not exist" Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.738738 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7v8fg"] Oct 09 07:50:24 crc kubenswrapper[4715]: I1009 07:50:24.744786 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7v8fg"] Oct 09 07:50:25 crc kubenswrapper[4715]: I1009 07:50:25.256992 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-76vrx" Oct 09 07:50:25 crc kubenswrapper[4715]: I1009 07:50:25.419465 4715 generic.go:334] "Generic (PLEG): container finished" podID="aa605d89-538d-40f1-8aea-e397b0667ff9" containerID="8ae12f04584d818ecc24243b04998e63035a3c11a9a6bb8de62319a2db906ae7" exitCode=0 Oct 09 07:50:25 crc kubenswrapper[4715]: I1009 07:50:25.419512 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-76vrx" event={"ID":"aa605d89-538d-40f1-8aea-e397b0667ff9","Type":"ContainerDied","Data":"8ae12f04584d818ecc24243b04998e63035a3c11a9a6bb8de62319a2db906ae7"} Oct 09 07:50:25 crc kubenswrapper[4715]: I1009 07:50:25.419543 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-76vrx" event={"ID":"aa605d89-538d-40f1-8aea-e397b0667ff9","Type":"ContainerDied","Data":"24bfd38fd7078eb56f11a55aac23381f2bbc7a5397c01da2f14c6b28b4db894b"} Oct 09 07:50:25 crc kubenswrapper[4715]: I1009 07:50:25.419549 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-76vrx" Oct 09 07:50:25 crc kubenswrapper[4715]: I1009 07:50:25.419564 4715 scope.go:117] "RemoveContainer" containerID="8ae12f04584d818ecc24243b04998e63035a3c11a9a6bb8de62319a2db906ae7" Oct 09 07:50:25 crc kubenswrapper[4715]: I1009 07:50:25.436781 4715 scope.go:117] "RemoveContainer" containerID="07e0edca48e6107bb47b8fb440e9999f0be140ef9c9ccbff3916e3bb9852f3dd" Oct 09 07:50:25 crc kubenswrapper[4715]: I1009 07:50:25.452706 4715 scope.go:117] "RemoveContainer" containerID="c9fda2d11e521a81d4fe6d308e5fc0cca944c8bfbf27acb5878e72c36f91a812" Oct 09 07:50:25 crc kubenswrapper[4715]: I1009 07:50:25.453877 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa605d89-538d-40f1-8aea-e397b0667ff9-utilities\") pod \"aa605d89-538d-40f1-8aea-e397b0667ff9\" (UID: \"aa605d89-538d-40f1-8aea-e397b0667ff9\") " Oct 09 07:50:25 crc kubenswrapper[4715]: I1009 07:50:25.453987 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa605d89-538d-40f1-8aea-e397b0667ff9-catalog-content\") pod \"aa605d89-538d-40f1-8aea-e397b0667ff9\" (UID: \"aa605d89-538d-40f1-8aea-e397b0667ff9\") " Oct 09 07:50:25 crc kubenswrapper[4715]: I1009 07:50:25.454172 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zdzk\" (UniqueName: \"kubernetes.io/projected/aa605d89-538d-40f1-8aea-e397b0667ff9-kube-api-access-4zdzk\") pod \"aa605d89-538d-40f1-8aea-e397b0667ff9\" (UID: \"aa605d89-538d-40f1-8aea-e397b0667ff9\") " Oct 09 07:50:25 crc kubenswrapper[4715]: I1009 07:50:25.454794 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa605d89-538d-40f1-8aea-e397b0667ff9-utilities" (OuterVolumeSpecName: "utilities") pod "aa605d89-538d-40f1-8aea-e397b0667ff9" (UID: 
"aa605d89-538d-40f1-8aea-e397b0667ff9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 07:50:25 crc kubenswrapper[4715]: I1009 07:50:25.455283 4715 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa605d89-538d-40f1-8aea-e397b0667ff9-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 07:50:25 crc kubenswrapper[4715]: I1009 07:50:25.460900 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa605d89-538d-40f1-8aea-e397b0667ff9-kube-api-access-4zdzk" (OuterVolumeSpecName: "kube-api-access-4zdzk") pod "aa605d89-538d-40f1-8aea-e397b0667ff9" (UID: "aa605d89-538d-40f1-8aea-e397b0667ff9"). InnerVolumeSpecName "kube-api-access-4zdzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:50:25 crc kubenswrapper[4715]: I1009 07:50:25.470571 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa605d89-538d-40f1-8aea-e397b0667ff9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aa605d89-538d-40f1-8aea-e397b0667ff9" (UID: "aa605d89-538d-40f1-8aea-e397b0667ff9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 07:50:25 crc kubenswrapper[4715]: I1009 07:50:25.503996 4715 scope.go:117] "RemoveContainer" containerID="8ae12f04584d818ecc24243b04998e63035a3c11a9a6bb8de62319a2db906ae7" Oct 09 07:50:25 crc kubenswrapper[4715]: E1009 07:50:25.504704 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ae12f04584d818ecc24243b04998e63035a3c11a9a6bb8de62319a2db906ae7\": container with ID starting with 8ae12f04584d818ecc24243b04998e63035a3c11a9a6bb8de62319a2db906ae7 not found: ID does not exist" containerID="8ae12f04584d818ecc24243b04998e63035a3c11a9a6bb8de62319a2db906ae7" Oct 09 07:50:25 crc kubenswrapper[4715]: I1009 07:50:25.504749 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ae12f04584d818ecc24243b04998e63035a3c11a9a6bb8de62319a2db906ae7"} err="failed to get container status \"8ae12f04584d818ecc24243b04998e63035a3c11a9a6bb8de62319a2db906ae7\": rpc error: code = NotFound desc = could not find container \"8ae12f04584d818ecc24243b04998e63035a3c11a9a6bb8de62319a2db906ae7\": container with ID starting with 8ae12f04584d818ecc24243b04998e63035a3c11a9a6bb8de62319a2db906ae7 not found: ID does not exist" Oct 09 07:50:25 crc kubenswrapper[4715]: I1009 07:50:25.504777 4715 scope.go:117] "RemoveContainer" containerID="07e0edca48e6107bb47b8fb440e9999f0be140ef9c9ccbff3916e3bb9852f3dd" Oct 09 07:50:25 crc kubenswrapper[4715]: E1009 07:50:25.505312 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07e0edca48e6107bb47b8fb440e9999f0be140ef9c9ccbff3916e3bb9852f3dd\": container with ID starting with 07e0edca48e6107bb47b8fb440e9999f0be140ef9c9ccbff3916e3bb9852f3dd not found: ID does not exist" containerID="07e0edca48e6107bb47b8fb440e9999f0be140ef9c9ccbff3916e3bb9852f3dd" Oct 09 07:50:25 crc kubenswrapper[4715]: I1009 07:50:25.505411 
4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07e0edca48e6107bb47b8fb440e9999f0be140ef9c9ccbff3916e3bb9852f3dd"} err="failed to get container status \"07e0edca48e6107bb47b8fb440e9999f0be140ef9c9ccbff3916e3bb9852f3dd\": rpc error: code = NotFound desc = could not find container \"07e0edca48e6107bb47b8fb440e9999f0be140ef9c9ccbff3916e3bb9852f3dd\": container with ID starting with 07e0edca48e6107bb47b8fb440e9999f0be140ef9c9ccbff3916e3bb9852f3dd not found: ID does not exist" Oct 09 07:50:25 crc kubenswrapper[4715]: I1009 07:50:25.505496 4715 scope.go:117] "RemoveContainer" containerID="c9fda2d11e521a81d4fe6d308e5fc0cca944c8bfbf27acb5878e72c36f91a812" Oct 09 07:50:25 crc kubenswrapper[4715]: E1009 07:50:25.505975 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9fda2d11e521a81d4fe6d308e5fc0cca944c8bfbf27acb5878e72c36f91a812\": container with ID starting with c9fda2d11e521a81d4fe6d308e5fc0cca944c8bfbf27acb5878e72c36f91a812 not found: ID does not exist" containerID="c9fda2d11e521a81d4fe6d308e5fc0cca944c8bfbf27acb5878e72c36f91a812" Oct 09 07:50:25 crc kubenswrapper[4715]: I1009 07:50:25.506034 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9fda2d11e521a81d4fe6d308e5fc0cca944c8bfbf27acb5878e72c36f91a812"} err="failed to get container status \"c9fda2d11e521a81d4fe6d308e5fc0cca944c8bfbf27acb5878e72c36f91a812\": rpc error: code = NotFound desc = could not find container \"c9fda2d11e521a81d4fe6d308e5fc0cca944c8bfbf27acb5878e72c36f91a812\": container with ID starting with c9fda2d11e521a81d4fe6d308e5fc0cca944c8bfbf27acb5878e72c36f91a812 not found: ID does not exist" Oct 09 07:50:25 crc kubenswrapper[4715]: I1009 07:50:25.556378 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zdzk\" (UniqueName: 
\"kubernetes.io/projected/aa605d89-538d-40f1-8aea-e397b0667ff9-kube-api-access-4zdzk\") on node \"crc\" DevicePath \"\"" Oct 09 07:50:25 crc kubenswrapper[4715]: I1009 07:50:25.556464 4715 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa605d89-538d-40f1-8aea-e397b0667ff9-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 07:50:25 crc kubenswrapper[4715]: I1009 07:50:25.758791 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-76vrx"] Oct 09 07:50:25 crc kubenswrapper[4715]: I1009 07:50:25.764026 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-76vrx"] Oct 09 07:50:26 crc kubenswrapper[4715]: I1009 07:50:26.149491 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="856c1393-8838-4544-9769-a1055c252169" path="/var/lib/kubelet/pods/856c1393-8838-4544-9769-a1055c252169/volumes" Oct 09 07:50:26 crc kubenswrapper[4715]: I1009 07:50:26.151053 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa605d89-538d-40f1-8aea-e397b0667ff9" path="/var/lib/kubelet/pods/aa605d89-538d-40f1-8aea-e397b0667ff9/volumes" Oct 09 07:50:26 crc kubenswrapper[4715]: I1009 07:50:26.152278 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa4e27ff-70ea-4095-b26a-e787d60bf751" path="/var/lib/kubelet/pods/fa4e27ff-70ea-4095-b26a-e787d60bf751/volumes" Oct 09 07:50:49 crc kubenswrapper[4715]: I1009 07:50:49.805717 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-rl4r4"] Oct 09 07:50:49 crc kubenswrapper[4715]: E1009 07:50:49.806540 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9da3b25-553e-4eb7-83b3-f5a74f6a1320" containerName="pruner" Oct 09 07:50:49 crc kubenswrapper[4715]: I1009 07:50:49.806554 4715 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f9da3b25-553e-4eb7-83b3-f5a74f6a1320" containerName="pruner" Oct 09 07:50:49 crc kubenswrapper[4715]: E1009 07:50:49.806563 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa605d89-538d-40f1-8aea-e397b0667ff9" containerName="extract-content" Oct 09 07:50:49 crc kubenswrapper[4715]: I1009 07:50:49.806569 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa605d89-538d-40f1-8aea-e397b0667ff9" containerName="extract-content" Oct 09 07:50:49 crc kubenswrapper[4715]: E1009 07:50:49.806581 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa4e27ff-70ea-4095-b26a-e787d60bf751" containerName="extract-content" Oct 09 07:50:49 crc kubenswrapper[4715]: I1009 07:50:49.806588 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa4e27ff-70ea-4095-b26a-e787d60bf751" containerName="extract-content" Oct 09 07:50:49 crc kubenswrapper[4715]: E1009 07:50:49.806594 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa605d89-538d-40f1-8aea-e397b0667ff9" containerName="registry-server" Oct 09 07:50:49 crc kubenswrapper[4715]: I1009 07:50:49.806605 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa605d89-538d-40f1-8aea-e397b0667ff9" containerName="registry-server" Oct 09 07:50:49 crc kubenswrapper[4715]: E1009 07:50:49.806613 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdfa926f-39ff-4fbf-8265-2ebf3bc796bb" containerName="pruner" Oct 09 07:50:49 crc kubenswrapper[4715]: I1009 07:50:49.806619 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdfa926f-39ff-4fbf-8265-2ebf3bc796bb" containerName="pruner" Oct 09 07:50:49 crc kubenswrapper[4715]: E1009 07:50:49.806626 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa4e27ff-70ea-4095-b26a-e787d60bf751" containerName="extract-utilities" Oct 09 07:50:49 crc kubenswrapper[4715]: I1009 07:50:49.806633 4715 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fa4e27ff-70ea-4095-b26a-e787d60bf751" containerName="extract-utilities" Oct 09 07:50:49 crc kubenswrapper[4715]: E1009 07:50:49.806641 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="856c1393-8838-4544-9769-a1055c252169" containerName="extract-content" Oct 09 07:50:49 crc kubenswrapper[4715]: I1009 07:50:49.806648 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="856c1393-8838-4544-9769-a1055c252169" containerName="extract-content" Oct 09 07:50:49 crc kubenswrapper[4715]: E1009 07:50:49.806660 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="856c1393-8838-4544-9769-a1055c252169" containerName="registry-server" Oct 09 07:50:49 crc kubenswrapper[4715]: I1009 07:50:49.806668 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="856c1393-8838-4544-9769-a1055c252169" containerName="registry-server" Oct 09 07:50:49 crc kubenswrapper[4715]: E1009 07:50:49.806680 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86ea1b53-8025-4b37-9b58-a1a03ddbbfaa" containerName="registry-server" Oct 09 07:50:49 crc kubenswrapper[4715]: I1009 07:50:49.806687 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="86ea1b53-8025-4b37-9b58-a1a03ddbbfaa" containerName="registry-server" Oct 09 07:50:49 crc kubenswrapper[4715]: E1009 07:50:49.806696 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa4e27ff-70ea-4095-b26a-e787d60bf751" containerName="registry-server" Oct 09 07:50:49 crc kubenswrapper[4715]: I1009 07:50:49.806703 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa4e27ff-70ea-4095-b26a-e787d60bf751" containerName="registry-server" Oct 09 07:50:49 crc kubenswrapper[4715]: E1009 07:50:49.806716 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86ea1b53-8025-4b37-9b58-a1a03ddbbfaa" containerName="extract-content" Oct 09 07:50:49 crc kubenswrapper[4715]: I1009 07:50:49.806724 4715 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="86ea1b53-8025-4b37-9b58-a1a03ddbbfaa" containerName="extract-content" Oct 09 07:50:49 crc kubenswrapper[4715]: E1009 07:50:49.806735 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="856c1393-8838-4544-9769-a1055c252169" containerName="extract-utilities" Oct 09 07:50:49 crc kubenswrapper[4715]: I1009 07:50:49.806742 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="856c1393-8838-4544-9769-a1055c252169" containerName="extract-utilities" Oct 09 07:50:49 crc kubenswrapper[4715]: E1009 07:50:49.806750 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86ea1b53-8025-4b37-9b58-a1a03ddbbfaa" containerName="extract-utilities" Oct 09 07:50:49 crc kubenswrapper[4715]: I1009 07:50:49.806757 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="86ea1b53-8025-4b37-9b58-a1a03ddbbfaa" containerName="extract-utilities" Oct 09 07:50:49 crc kubenswrapper[4715]: E1009 07:50:49.806771 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa605d89-538d-40f1-8aea-e397b0667ff9" containerName="extract-utilities" Oct 09 07:50:49 crc kubenswrapper[4715]: I1009 07:50:49.806777 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa605d89-538d-40f1-8aea-e397b0667ff9" containerName="extract-utilities" Oct 09 07:50:49 crc kubenswrapper[4715]: I1009 07:50:49.806871 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa605d89-538d-40f1-8aea-e397b0667ff9" containerName="registry-server" Oct 09 07:50:49 crc kubenswrapper[4715]: I1009 07:50:49.806880 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="86ea1b53-8025-4b37-9b58-a1a03ddbbfaa" containerName="registry-server" Oct 09 07:50:49 crc kubenswrapper[4715]: I1009 07:50:49.806888 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa4e27ff-70ea-4095-b26a-e787d60bf751" containerName="registry-server" Oct 09 07:50:49 crc kubenswrapper[4715]: I1009 07:50:49.806906 4715 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="cdfa926f-39ff-4fbf-8265-2ebf3bc796bb" containerName="pruner" Oct 09 07:50:49 crc kubenswrapper[4715]: I1009 07:50:49.806916 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="856c1393-8838-4544-9769-a1055c252169" containerName="registry-server" Oct 09 07:50:49 crc kubenswrapper[4715]: I1009 07:50:49.806924 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9da3b25-553e-4eb7-83b3-f5a74f6a1320" containerName="pruner" Oct 09 07:50:49 crc kubenswrapper[4715]: I1009 07:50:49.807315 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-rl4r4" Oct 09 07:50:49 crc kubenswrapper[4715]: I1009 07:50:49.827764 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-rl4r4"] Oct 09 07:50:49 crc kubenswrapper[4715]: I1009 07:50:49.897937 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6c4c91b5-91f3-47ea-bc7d-591099f6e51a-bound-sa-token\") pod \"image-registry-66df7c8f76-rl4r4\" (UID: \"6c4c91b5-91f3-47ea-bc7d-591099f6e51a\") " pod="openshift-image-registry/image-registry-66df7c8f76-rl4r4" Oct 09 07:50:49 crc kubenswrapper[4715]: I1009 07:50:49.898291 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxjjj\" (UniqueName: \"kubernetes.io/projected/6c4c91b5-91f3-47ea-bc7d-591099f6e51a-kube-api-access-kxjjj\") pod \"image-registry-66df7c8f76-rl4r4\" (UID: \"6c4c91b5-91f3-47ea-bc7d-591099f6e51a\") " pod="openshift-image-registry/image-registry-66df7c8f76-rl4r4" Oct 09 07:50:49 crc kubenswrapper[4715]: I1009 07:50:49.898443 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/6c4c91b5-91f3-47ea-bc7d-591099f6e51a-installation-pull-secrets\") pod \"image-registry-66df7c8f76-rl4r4\" (UID: \"6c4c91b5-91f3-47ea-bc7d-591099f6e51a\") " pod="openshift-image-registry/image-registry-66df7c8f76-rl4r4" Oct 09 07:50:49 crc kubenswrapper[4715]: I1009 07:50:49.898581 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-rl4r4\" (UID: \"6c4c91b5-91f3-47ea-bc7d-591099f6e51a\") " pod="openshift-image-registry/image-registry-66df7c8f76-rl4r4" Oct 09 07:50:49 crc kubenswrapper[4715]: I1009 07:50:49.898701 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6c4c91b5-91f3-47ea-bc7d-591099f6e51a-ca-trust-extracted\") pod \"image-registry-66df7c8f76-rl4r4\" (UID: \"6c4c91b5-91f3-47ea-bc7d-591099f6e51a\") " pod="openshift-image-registry/image-registry-66df7c8f76-rl4r4" Oct 09 07:50:49 crc kubenswrapper[4715]: I1009 07:50:49.898797 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6c4c91b5-91f3-47ea-bc7d-591099f6e51a-registry-tls\") pod \"image-registry-66df7c8f76-rl4r4\" (UID: \"6c4c91b5-91f3-47ea-bc7d-591099f6e51a\") " pod="openshift-image-registry/image-registry-66df7c8f76-rl4r4" Oct 09 07:50:49 crc kubenswrapper[4715]: I1009 07:50:49.898878 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6c4c91b5-91f3-47ea-bc7d-591099f6e51a-registry-certificates\") pod \"image-registry-66df7c8f76-rl4r4\" (UID: \"6c4c91b5-91f3-47ea-bc7d-591099f6e51a\") " pod="openshift-image-registry/image-registry-66df7c8f76-rl4r4" 
Oct 09 07:50:49 crc kubenswrapper[4715]: I1009 07:50:49.898956 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6c4c91b5-91f3-47ea-bc7d-591099f6e51a-trusted-ca\") pod \"image-registry-66df7c8f76-rl4r4\" (UID: \"6c4c91b5-91f3-47ea-bc7d-591099f6e51a\") " pod="openshift-image-registry/image-registry-66df7c8f76-rl4r4" Oct 09 07:50:49 crc kubenswrapper[4715]: I1009 07:50:49.928228 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-rl4r4\" (UID: \"6c4c91b5-91f3-47ea-bc7d-591099f6e51a\") " pod="openshift-image-registry/image-registry-66df7c8f76-rl4r4" Oct 09 07:50:50 crc kubenswrapper[4715]: I1009 07:50:50.000314 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6c4c91b5-91f3-47ea-bc7d-591099f6e51a-bound-sa-token\") pod \"image-registry-66df7c8f76-rl4r4\" (UID: \"6c4c91b5-91f3-47ea-bc7d-591099f6e51a\") " pod="openshift-image-registry/image-registry-66df7c8f76-rl4r4" Oct 09 07:50:50 crc kubenswrapper[4715]: I1009 07:50:50.000725 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxjjj\" (UniqueName: \"kubernetes.io/projected/6c4c91b5-91f3-47ea-bc7d-591099f6e51a-kube-api-access-kxjjj\") pod \"image-registry-66df7c8f76-rl4r4\" (UID: \"6c4c91b5-91f3-47ea-bc7d-591099f6e51a\") " pod="openshift-image-registry/image-registry-66df7c8f76-rl4r4" Oct 09 07:50:50 crc kubenswrapper[4715]: I1009 07:50:50.000841 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6c4c91b5-91f3-47ea-bc7d-591099f6e51a-installation-pull-secrets\") pod 
\"image-registry-66df7c8f76-rl4r4\" (UID: \"6c4c91b5-91f3-47ea-bc7d-591099f6e51a\") " pod="openshift-image-registry/image-registry-66df7c8f76-rl4r4" Oct 09 07:50:50 crc kubenswrapper[4715]: I1009 07:50:50.000939 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6c4c91b5-91f3-47ea-bc7d-591099f6e51a-ca-trust-extracted\") pod \"image-registry-66df7c8f76-rl4r4\" (UID: \"6c4c91b5-91f3-47ea-bc7d-591099f6e51a\") " pod="openshift-image-registry/image-registry-66df7c8f76-rl4r4" Oct 09 07:50:50 crc kubenswrapper[4715]: I1009 07:50:50.001063 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6c4c91b5-91f3-47ea-bc7d-591099f6e51a-registry-tls\") pod \"image-registry-66df7c8f76-rl4r4\" (UID: \"6c4c91b5-91f3-47ea-bc7d-591099f6e51a\") " pod="openshift-image-registry/image-registry-66df7c8f76-rl4r4" Oct 09 07:50:50 crc kubenswrapper[4715]: I1009 07:50:50.001146 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6c4c91b5-91f3-47ea-bc7d-591099f6e51a-registry-certificates\") pod \"image-registry-66df7c8f76-rl4r4\" (UID: \"6c4c91b5-91f3-47ea-bc7d-591099f6e51a\") " pod="openshift-image-registry/image-registry-66df7c8f76-rl4r4" Oct 09 07:50:50 crc kubenswrapper[4715]: I1009 07:50:50.001229 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6c4c91b5-91f3-47ea-bc7d-591099f6e51a-trusted-ca\") pod \"image-registry-66df7c8f76-rl4r4\" (UID: \"6c4c91b5-91f3-47ea-bc7d-591099f6e51a\") " pod="openshift-image-registry/image-registry-66df7c8f76-rl4r4" Oct 09 07:50:50 crc kubenswrapper[4715]: I1009 07:50:50.002645 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/6c4c91b5-91f3-47ea-bc7d-591099f6e51a-trusted-ca\") pod \"image-registry-66df7c8f76-rl4r4\" (UID: \"6c4c91b5-91f3-47ea-bc7d-591099f6e51a\") " pod="openshift-image-registry/image-registry-66df7c8f76-rl4r4" Oct 09 07:50:50 crc kubenswrapper[4715]: I1009 07:50:50.004197 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6c4c91b5-91f3-47ea-bc7d-591099f6e51a-ca-trust-extracted\") pod \"image-registry-66df7c8f76-rl4r4\" (UID: \"6c4c91b5-91f3-47ea-bc7d-591099f6e51a\") " pod="openshift-image-registry/image-registry-66df7c8f76-rl4r4" Oct 09 07:50:50 crc kubenswrapper[4715]: I1009 07:50:50.004385 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6c4c91b5-91f3-47ea-bc7d-591099f6e51a-registry-certificates\") pod \"image-registry-66df7c8f76-rl4r4\" (UID: \"6c4c91b5-91f3-47ea-bc7d-591099f6e51a\") " pod="openshift-image-registry/image-registry-66df7c8f76-rl4r4" Oct 09 07:50:50 crc kubenswrapper[4715]: I1009 07:50:50.014857 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6c4c91b5-91f3-47ea-bc7d-591099f6e51a-installation-pull-secrets\") pod \"image-registry-66df7c8f76-rl4r4\" (UID: \"6c4c91b5-91f3-47ea-bc7d-591099f6e51a\") " pod="openshift-image-registry/image-registry-66df7c8f76-rl4r4" Oct 09 07:50:50 crc kubenswrapper[4715]: I1009 07:50:50.017038 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxjjj\" (UniqueName: \"kubernetes.io/projected/6c4c91b5-91f3-47ea-bc7d-591099f6e51a-kube-api-access-kxjjj\") pod \"image-registry-66df7c8f76-rl4r4\" (UID: \"6c4c91b5-91f3-47ea-bc7d-591099f6e51a\") " pod="openshift-image-registry/image-registry-66df7c8f76-rl4r4" Oct 09 07:50:50 crc kubenswrapper[4715]: I1009 07:50:50.017988 4715 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6c4c91b5-91f3-47ea-bc7d-591099f6e51a-registry-tls\") pod \"image-registry-66df7c8f76-rl4r4\" (UID: \"6c4c91b5-91f3-47ea-bc7d-591099f6e51a\") " pod="openshift-image-registry/image-registry-66df7c8f76-rl4r4" Oct 09 07:50:50 crc kubenswrapper[4715]: I1009 07:50:50.018568 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6c4c91b5-91f3-47ea-bc7d-591099f6e51a-bound-sa-token\") pod \"image-registry-66df7c8f76-rl4r4\" (UID: \"6c4c91b5-91f3-47ea-bc7d-591099f6e51a\") " pod="openshift-image-registry/image-registry-66df7c8f76-rl4r4" Oct 09 07:50:50 crc kubenswrapper[4715]: I1009 07:50:50.124187 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-rl4r4" Oct 09 07:50:50 crc kubenswrapper[4715]: I1009 07:50:50.552746 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-rl4r4"] Oct 09 07:50:51 crc kubenswrapper[4715]: I1009 07:50:51.563231 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-rl4r4" event={"ID":"6c4c91b5-91f3-47ea-bc7d-591099f6e51a","Type":"ContainerStarted","Data":"546f25fc6218f38c5e6d2d6e5baa6c15522b87ddc477346bde6fa8738f7745e7"} Oct 09 07:50:51 crc kubenswrapper[4715]: I1009 07:50:51.564563 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-rl4r4" event={"ID":"6c4c91b5-91f3-47ea-bc7d-591099f6e51a","Type":"ContainerStarted","Data":"d60f875f1b5907a7748a04173dc1ed8fe70aa5c7ef7e8a2c270816fd15a60172"} Oct 09 07:50:51 crc kubenswrapper[4715]: I1009 07:50:51.564625 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-rl4r4" Oct 09 07:50:51 crc kubenswrapper[4715]: I1009 
07:50:51.587432 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-rl4r4" podStartSLOduration=2.587388028 podStartE2EDuration="2.587388028s" podCreationTimestamp="2025-10-09 07:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:50:51.584803909 +0000 UTC m=+282.277607917" watchObservedRunningTime="2025-10-09 07:50:51.587388028 +0000 UTC m=+282.280192036" Oct 09 07:50:58 crc kubenswrapper[4715]: I1009 07:50:58.436648 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xqkhk"] Oct 09 07:50:58 crc kubenswrapper[4715]: I1009 07:50:58.437610 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xqkhk" podUID="4e13129f-063f-400f-b483-537273d66d74" containerName="registry-server" containerID="cri-o://8d15dd648f2957c53eb0f27a4aee79622f359088743a80716816ee672033c9a3" gracePeriod=30 Oct 09 07:50:58 crc kubenswrapper[4715]: I1009 07:50:58.448972 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6kpjv"] Oct 09 07:50:58 crc kubenswrapper[4715]: I1009 07:50:58.449303 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6kpjv" podUID="8ba28cf2-dac1-47a3-9efe-727e793c7afd" containerName="registry-server" containerID="cri-o://88ba48ad03f3927426b92e850bf8c38b6acfa5076feb46eb783cf7a6184b8b2f" gracePeriod=30 Oct 09 07:50:58 crc kubenswrapper[4715]: I1009 07:50:58.467440 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hzbn9"] Oct 09 07:50:58 crc kubenswrapper[4715]: I1009 07:50:58.467774 4715 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/marketplace-operator-79b997595-hzbn9" podUID="9040c858-1f6b-4900-b34e-c8b0b0c4c1ec" containerName="marketplace-operator" containerID="cri-o://8bc7e98303a37386f38b308d46945b4d4e2702ff4ae3a09783fb2ea83ef8a388" gracePeriod=30 Oct 09 07:50:58 crc kubenswrapper[4715]: I1009 07:50:58.469269 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vgjp2"] Oct 09 07:50:58 crc kubenswrapper[4715]: I1009 07:50:58.474274 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vgjp2" podUID="fcafd988-9f86-4c66-8fa7-ead624b101a0" containerName="registry-server" containerID="cri-o://083d3650e43b7aa974b486d20c238ed42790835ba3d96cab2f028693f8def3e8" gracePeriod=30 Oct 09 07:50:58 crc kubenswrapper[4715]: I1009 07:50:58.483431 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5njnt"] Oct 09 07:50:58 crc kubenswrapper[4715]: I1009 07:50:58.483760 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5njnt" podUID="77b33330-3f7f-4bae-96c1-6558143109f2" containerName="registry-server" containerID="cri-o://35b065ce82f1810b009f356ebe77a8c2532fc494e9912891bfae08163e3a0d5c" gracePeriod=30 Oct 09 07:50:58 crc kubenswrapper[4715]: I1009 07:50:58.487152 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rpvsg"] Oct 09 07:50:58 crc kubenswrapper[4715]: I1009 07:50:58.488101 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rpvsg" Oct 09 07:50:58 crc kubenswrapper[4715]: I1009 07:50:58.494788 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rpvsg"] Oct 09 07:50:58 crc kubenswrapper[4715]: I1009 07:50:58.606514 4715 generic.go:334] "Generic (PLEG): container finished" podID="9040c858-1f6b-4900-b34e-c8b0b0c4c1ec" containerID="8bc7e98303a37386f38b308d46945b4d4e2702ff4ae3a09783fb2ea83ef8a388" exitCode=0 Oct 09 07:50:58 crc kubenswrapper[4715]: I1009 07:50:58.606617 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hzbn9" event={"ID":"9040c858-1f6b-4900-b34e-c8b0b0c4c1ec","Type":"ContainerDied","Data":"8bc7e98303a37386f38b308d46945b4d4e2702ff4ae3a09783fb2ea83ef8a388"} Oct 09 07:50:58 crc kubenswrapper[4715]: I1009 07:50:58.611863 4715 generic.go:334] "Generic (PLEG): container finished" podID="4e13129f-063f-400f-b483-537273d66d74" containerID="8d15dd648f2957c53eb0f27a4aee79622f359088743a80716816ee672033c9a3" exitCode=0 Oct 09 07:50:58 crc kubenswrapper[4715]: I1009 07:50:58.611949 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xqkhk" event={"ID":"4e13129f-063f-400f-b483-537273d66d74","Type":"ContainerDied","Data":"8d15dd648f2957c53eb0f27a4aee79622f359088743a80716816ee672033c9a3"} Oct 09 07:50:58 crc kubenswrapper[4715]: I1009 07:50:58.618735 4715 generic.go:334] "Generic (PLEG): container finished" podID="fcafd988-9f86-4c66-8fa7-ead624b101a0" containerID="083d3650e43b7aa974b486d20c238ed42790835ba3d96cab2f028693f8def3e8" exitCode=0 Oct 09 07:50:58 crc kubenswrapper[4715]: I1009 07:50:58.618985 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vgjp2" 
event={"ID":"fcafd988-9f86-4c66-8fa7-ead624b101a0","Type":"ContainerDied","Data":"083d3650e43b7aa974b486d20c238ed42790835ba3d96cab2f028693f8def3e8"} Oct 09 07:50:58 crc kubenswrapper[4715]: I1009 07:50:58.623057 4715 generic.go:334] "Generic (PLEG): container finished" podID="8ba28cf2-dac1-47a3-9efe-727e793c7afd" containerID="88ba48ad03f3927426b92e850bf8c38b6acfa5076feb46eb783cf7a6184b8b2f" exitCode=0 Oct 09 07:50:58 crc kubenswrapper[4715]: I1009 07:50:58.623109 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6kpjv" event={"ID":"8ba28cf2-dac1-47a3-9efe-727e793c7afd","Type":"ContainerDied","Data":"88ba48ad03f3927426b92e850bf8c38b6acfa5076feb46eb783cf7a6184b8b2f"} Oct 09 07:50:58 crc kubenswrapper[4715]: I1009 07:50:58.637560 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/06c9829f-1dca-4ef6-a34f-a5380dfd729c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rpvsg\" (UID: \"06c9829f-1dca-4ef6-a34f-a5380dfd729c\") " pod="openshift-marketplace/marketplace-operator-79b997595-rpvsg" Oct 09 07:50:58 crc kubenswrapper[4715]: I1009 07:50:58.637792 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/06c9829f-1dca-4ef6-a34f-a5380dfd729c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rpvsg\" (UID: \"06c9829f-1dca-4ef6-a34f-a5380dfd729c\") " pod="openshift-marketplace/marketplace-operator-79b997595-rpvsg" Oct 09 07:50:58 crc kubenswrapper[4715]: I1009 07:50:58.637941 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqtpp\" (UniqueName: \"kubernetes.io/projected/06c9829f-1dca-4ef6-a34f-a5380dfd729c-kube-api-access-kqtpp\") pod \"marketplace-operator-79b997595-rpvsg\" (UID: 
\"06c9829f-1dca-4ef6-a34f-a5380dfd729c\") " pod="openshift-marketplace/marketplace-operator-79b997595-rpvsg" Oct 09 07:50:58 crc kubenswrapper[4715]: I1009 07:50:58.741751 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/06c9829f-1dca-4ef6-a34f-a5380dfd729c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rpvsg\" (UID: \"06c9829f-1dca-4ef6-a34f-a5380dfd729c\") " pod="openshift-marketplace/marketplace-operator-79b997595-rpvsg" Oct 09 07:50:58 crc kubenswrapper[4715]: I1009 07:50:58.741814 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqtpp\" (UniqueName: \"kubernetes.io/projected/06c9829f-1dca-4ef6-a34f-a5380dfd729c-kube-api-access-kqtpp\") pod \"marketplace-operator-79b997595-rpvsg\" (UID: \"06c9829f-1dca-4ef6-a34f-a5380dfd729c\") " pod="openshift-marketplace/marketplace-operator-79b997595-rpvsg" Oct 09 07:50:58 crc kubenswrapper[4715]: I1009 07:50:58.741871 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/06c9829f-1dca-4ef6-a34f-a5380dfd729c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rpvsg\" (UID: \"06c9829f-1dca-4ef6-a34f-a5380dfd729c\") " pod="openshift-marketplace/marketplace-operator-79b997595-rpvsg" Oct 09 07:50:58 crc kubenswrapper[4715]: I1009 07:50:58.744015 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/06c9829f-1dca-4ef6-a34f-a5380dfd729c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rpvsg\" (UID: \"06c9829f-1dca-4ef6-a34f-a5380dfd729c\") " pod="openshift-marketplace/marketplace-operator-79b997595-rpvsg" Oct 09 07:50:58 crc kubenswrapper[4715]: I1009 07:50:58.755669 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/06c9829f-1dca-4ef6-a34f-a5380dfd729c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rpvsg\" (UID: \"06c9829f-1dca-4ef6-a34f-a5380dfd729c\") " pod="openshift-marketplace/marketplace-operator-79b997595-rpvsg" Oct 09 07:50:58 crc kubenswrapper[4715]: I1009 07:50:58.765796 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqtpp\" (UniqueName: \"kubernetes.io/projected/06c9829f-1dca-4ef6-a34f-a5380dfd729c-kube-api-access-kqtpp\") pod \"marketplace-operator-79b997595-rpvsg\" (UID: \"06c9829f-1dca-4ef6-a34f-a5380dfd729c\") " pod="openshift-marketplace/marketplace-operator-79b997595-rpvsg" Oct 09 07:50:58 crc kubenswrapper[4715]: I1009 07:50:58.812333 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rpvsg" Oct 09 07:50:58 crc kubenswrapper[4715]: I1009 07:50:58.941883 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xqkhk" Oct 09 07:50:58 crc kubenswrapper[4715]: I1009 07:50:58.947408 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6kpjv" Oct 09 07:50:58 crc kubenswrapper[4715]: I1009 07:50:58.947909 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vgjp2" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.005206 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5njnt" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.017935 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hzbn9" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.045051 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rpvsg"] Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.048141 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wqvs\" (UniqueName: \"kubernetes.io/projected/fcafd988-9f86-4c66-8fa7-ead624b101a0-kube-api-access-7wqvs\") pod \"fcafd988-9f86-4c66-8fa7-ead624b101a0\" (UID: \"fcafd988-9f86-4c66-8fa7-ead624b101a0\") " Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.048197 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e13129f-063f-400f-b483-537273d66d74-utilities\") pod \"4e13129f-063f-400f-b483-537273d66d74\" (UID: \"4e13129f-063f-400f-b483-537273d66d74\") " Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.048317 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ba28cf2-dac1-47a3-9efe-727e793c7afd-utilities\") pod \"8ba28cf2-dac1-47a3-9efe-727e793c7afd\" (UID: \"8ba28cf2-dac1-47a3-9efe-727e793c7afd\") " Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.048348 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcafd988-9f86-4c66-8fa7-ead624b101a0-utilities\") pod \"fcafd988-9f86-4c66-8fa7-ead624b101a0\" (UID: \"fcafd988-9f86-4c66-8fa7-ead624b101a0\") " Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.048368 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcafd988-9f86-4c66-8fa7-ead624b101a0-catalog-content\") pod \"fcafd988-9f86-4c66-8fa7-ead624b101a0\" (UID: 
\"fcafd988-9f86-4c66-8fa7-ead624b101a0\") " Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.048388 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ba28cf2-dac1-47a3-9efe-727e793c7afd-catalog-content\") pod \"8ba28cf2-dac1-47a3-9efe-727e793c7afd\" (UID: \"8ba28cf2-dac1-47a3-9efe-727e793c7afd\") " Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.048467 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vs9sm\" (UniqueName: \"kubernetes.io/projected/4e13129f-063f-400f-b483-537273d66d74-kube-api-access-vs9sm\") pod \"4e13129f-063f-400f-b483-537273d66d74\" (UID: \"4e13129f-063f-400f-b483-537273d66d74\") " Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.048493 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5dfz\" (UniqueName: \"kubernetes.io/projected/8ba28cf2-dac1-47a3-9efe-727e793c7afd-kube-api-access-w5dfz\") pod \"8ba28cf2-dac1-47a3-9efe-727e793c7afd\" (UID: \"8ba28cf2-dac1-47a3-9efe-727e793c7afd\") " Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.048515 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e13129f-063f-400f-b483-537273d66d74-catalog-content\") pod \"4e13129f-063f-400f-b483-537273d66d74\" (UID: \"4e13129f-063f-400f-b483-537273d66d74\") " Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.051307 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcafd988-9f86-4c66-8fa7-ead624b101a0-utilities" (OuterVolumeSpecName: "utilities") pod "fcafd988-9f86-4c66-8fa7-ead624b101a0" (UID: "fcafd988-9f86-4c66-8fa7-ead624b101a0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.051601 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ba28cf2-dac1-47a3-9efe-727e793c7afd-utilities" (OuterVolumeSpecName: "utilities") pod "8ba28cf2-dac1-47a3-9efe-727e793c7afd" (UID: "8ba28cf2-dac1-47a3-9efe-727e793c7afd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.054487 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcafd988-9f86-4c66-8fa7-ead624b101a0-kube-api-access-7wqvs" (OuterVolumeSpecName: "kube-api-access-7wqvs") pod "fcafd988-9f86-4c66-8fa7-ead624b101a0" (UID: "fcafd988-9f86-4c66-8fa7-ead624b101a0"). InnerVolumeSpecName "kube-api-access-7wqvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.057625 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ba28cf2-dac1-47a3-9efe-727e793c7afd-kube-api-access-w5dfz" (OuterVolumeSpecName: "kube-api-access-w5dfz") pod "8ba28cf2-dac1-47a3-9efe-727e793c7afd" (UID: "8ba28cf2-dac1-47a3-9efe-727e793c7afd"). InnerVolumeSpecName "kube-api-access-w5dfz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.059910 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e13129f-063f-400f-b483-537273d66d74-kube-api-access-vs9sm" (OuterVolumeSpecName: "kube-api-access-vs9sm") pod "4e13129f-063f-400f-b483-537273d66d74" (UID: "4e13129f-063f-400f-b483-537273d66d74"). InnerVolumeSpecName "kube-api-access-vs9sm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.065072 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e13129f-063f-400f-b483-537273d66d74-utilities" (OuterVolumeSpecName: "utilities") pod "4e13129f-063f-400f-b483-537273d66d74" (UID: "4e13129f-063f-400f-b483-537273d66d74"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.085962 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcafd988-9f86-4c66-8fa7-ead624b101a0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fcafd988-9f86-4c66-8fa7-ead624b101a0" (UID: "fcafd988-9f86-4c66-8fa7-ead624b101a0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.110309 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ba28cf2-dac1-47a3-9efe-727e793c7afd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8ba28cf2-dac1-47a3-9efe-727e793c7afd" (UID: "8ba28cf2-dac1-47a3-9efe-727e793c7afd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.118374 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e13129f-063f-400f-b483-537273d66d74-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4e13129f-063f-400f-b483-537273d66d74" (UID: "4e13129f-063f-400f-b483-537273d66d74"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.150932 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhv8c\" (UniqueName: \"kubernetes.io/projected/77b33330-3f7f-4bae-96c1-6558143109f2-kube-api-access-qhv8c\") pod \"77b33330-3f7f-4bae-96c1-6558143109f2\" (UID: \"77b33330-3f7f-4bae-96c1-6558143109f2\") " Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.150991 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72t4q\" (UniqueName: \"kubernetes.io/projected/9040c858-1f6b-4900-b34e-c8b0b0c4c1ec-kube-api-access-72t4q\") pod \"9040c858-1f6b-4900-b34e-c8b0b0c4c1ec\" (UID: \"9040c858-1f6b-4900-b34e-c8b0b0c4c1ec\") " Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.151032 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9040c858-1f6b-4900-b34e-c8b0b0c4c1ec-marketplace-trusted-ca\") pod \"9040c858-1f6b-4900-b34e-c8b0b0c4c1ec\" (UID: \"9040c858-1f6b-4900-b34e-c8b0b0c4c1ec\") " Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.151061 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9040c858-1f6b-4900-b34e-c8b0b0c4c1ec-marketplace-operator-metrics\") pod \"9040c858-1f6b-4900-b34e-c8b0b0c4c1ec\" (UID: \"9040c858-1f6b-4900-b34e-c8b0b0c4c1ec\") " Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.151107 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77b33330-3f7f-4bae-96c1-6558143109f2-catalog-content\") pod \"77b33330-3f7f-4bae-96c1-6558143109f2\" (UID: \"77b33330-3f7f-4bae-96c1-6558143109f2\") " Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.151147 4715 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77b33330-3f7f-4bae-96c1-6558143109f2-utilities\") pod \"77b33330-3f7f-4bae-96c1-6558143109f2\" (UID: \"77b33330-3f7f-4bae-96c1-6558143109f2\") " Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.151363 4715 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e13129f-063f-400f-b483-537273d66d74-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.151382 4715 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ba28cf2-dac1-47a3-9efe-727e793c7afd-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.151391 4715 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcafd988-9f86-4c66-8fa7-ead624b101a0-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.151399 4715 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcafd988-9f86-4c66-8fa7-ead624b101a0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.151412 4715 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ba28cf2-dac1-47a3-9efe-727e793c7afd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.151444 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vs9sm\" (UniqueName: \"kubernetes.io/projected/4e13129f-063f-400f-b483-537273d66d74-kube-api-access-vs9sm\") on node \"crc\" DevicePath \"\"" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.151453 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5dfz\" 
(UniqueName: \"kubernetes.io/projected/8ba28cf2-dac1-47a3-9efe-727e793c7afd-kube-api-access-w5dfz\") on node \"crc\" DevicePath \"\"" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.151461 4715 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e13129f-063f-400f-b483-537273d66d74-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.151469 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wqvs\" (UniqueName: \"kubernetes.io/projected/fcafd988-9f86-4c66-8fa7-ead624b101a0-kube-api-access-7wqvs\") on node \"crc\" DevicePath \"\"" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.152203 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77b33330-3f7f-4bae-96c1-6558143109f2-utilities" (OuterVolumeSpecName: "utilities") pod "77b33330-3f7f-4bae-96c1-6558143109f2" (UID: "77b33330-3f7f-4bae-96c1-6558143109f2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.152339 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9040c858-1f6b-4900-b34e-c8b0b0c4c1ec-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "9040c858-1f6b-4900-b34e-c8b0b0c4c1ec" (UID: "9040c858-1f6b-4900-b34e-c8b0b0c4c1ec"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.155972 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9040c858-1f6b-4900-b34e-c8b0b0c4c1ec-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "9040c858-1f6b-4900-b34e-c8b0b0c4c1ec" (UID: "9040c858-1f6b-4900-b34e-c8b0b0c4c1ec"). 
InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.156369 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77b33330-3f7f-4bae-96c1-6558143109f2-kube-api-access-qhv8c" (OuterVolumeSpecName: "kube-api-access-qhv8c") pod "77b33330-3f7f-4bae-96c1-6558143109f2" (UID: "77b33330-3f7f-4bae-96c1-6558143109f2"). InnerVolumeSpecName "kube-api-access-qhv8c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.157480 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9040c858-1f6b-4900-b34e-c8b0b0c4c1ec-kube-api-access-72t4q" (OuterVolumeSpecName: "kube-api-access-72t4q") pod "9040c858-1f6b-4900-b34e-c8b0b0c4c1ec" (UID: "9040c858-1f6b-4900-b34e-c8b0b0c4c1ec"). InnerVolumeSpecName "kube-api-access-72t4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.243101 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77b33330-3f7f-4bae-96c1-6558143109f2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "77b33330-3f7f-4bae-96c1-6558143109f2" (UID: "77b33330-3f7f-4bae-96c1-6558143109f2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.253285 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhv8c\" (UniqueName: \"kubernetes.io/projected/77b33330-3f7f-4bae-96c1-6558143109f2-kube-api-access-qhv8c\") on node \"crc\" DevicePath \"\"" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.253327 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72t4q\" (UniqueName: \"kubernetes.io/projected/9040c858-1f6b-4900-b34e-c8b0b0c4c1ec-kube-api-access-72t4q\") on node \"crc\" DevicePath \"\"" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.253343 4715 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9040c858-1f6b-4900-b34e-c8b0b0c4c1ec-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.253355 4715 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9040c858-1f6b-4900-b34e-c8b0b0c4c1ec-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.253369 4715 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77b33330-3f7f-4bae-96c1-6558143109f2-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.253381 4715 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77b33330-3f7f-4bae-96c1-6558143109f2-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.629676 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vgjp2" 
event={"ID":"fcafd988-9f86-4c66-8fa7-ead624b101a0","Type":"ContainerDied","Data":"3c92cb8fabe6c9319f79637e4a239ac745b9de0618e35e41618c6601dfe94f7f"} Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.629702 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vgjp2" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.630028 4715 scope.go:117] "RemoveContainer" containerID="083d3650e43b7aa974b486d20c238ed42790835ba3d96cab2f028693f8def3e8" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.631157 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rpvsg" event={"ID":"06c9829f-1dca-4ef6-a34f-a5380dfd729c","Type":"ContainerStarted","Data":"bebcc8cea90ab76da3d51c6ae43591e7b93e58bb44991ffa2dc964a9c59c6302"} Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.631198 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rpvsg" event={"ID":"06c9829f-1dca-4ef6-a34f-a5380dfd729c","Type":"ContainerStarted","Data":"d401a5463f372fb5ac33d580997fef6978cef9f95cfc52942a363bbea714bc54"} Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.631869 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-rpvsg" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.635717 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6kpjv" event={"ID":"8ba28cf2-dac1-47a3-9efe-727e793c7afd","Type":"ContainerDied","Data":"100b2666857442dca3fa066d82574c5817fbfd95da32bafc9b64d23152a16224"} Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.635788 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6kpjv" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.638178 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-rpvsg" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.642186 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hzbn9" event={"ID":"9040c858-1f6b-4900-b34e-c8b0b0c4c1ec","Type":"ContainerDied","Data":"c8339ebe5a987752bad6b497baef860041697998111487689f00920b5a239b8f"} Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.642231 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hzbn9" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.646365 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xqkhk" event={"ID":"4e13129f-063f-400f-b483-537273d66d74","Type":"ContainerDied","Data":"4414765f584e593c671cb30ecfb9bc8ccc3419ffd34d557357de41b5bd296a57"} Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.646387 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xqkhk" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.651834 4715 generic.go:334] "Generic (PLEG): container finished" podID="77b33330-3f7f-4bae-96c1-6558143109f2" containerID="35b065ce82f1810b009f356ebe77a8c2532fc494e9912891bfae08163e3a0d5c" exitCode=0 Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.651909 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5njnt" event={"ID":"77b33330-3f7f-4bae-96c1-6558143109f2","Type":"ContainerDied","Data":"35b065ce82f1810b009f356ebe77a8c2532fc494e9912891bfae08163e3a0d5c"} Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.651943 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5njnt" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.651951 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5njnt" event={"ID":"77b33330-3f7f-4bae-96c1-6558143109f2","Type":"ContainerDied","Data":"a9025a13547ce8d208f04323eacdb28ee8c4459fe15a2bcf3c079a6030908717"} Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.658099 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-rpvsg" podStartSLOduration=1.658045178 podStartE2EDuration="1.658045178s" podCreationTimestamp="2025-10-09 07:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:50:59.651463376 +0000 UTC m=+290.344267384" watchObservedRunningTime="2025-10-09 07:50:59.658045178 +0000 UTC m=+290.350849186" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.658171 4715 scope.go:117] "RemoveContainer" containerID="3c2a0dd3159768b16ad47f45543bf3ff2111c3881c561ca93d6f414ec3d85096" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.675387 
4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vgjp2"] Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.685719 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vgjp2"] Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.699455 4715 scope.go:117] "RemoveContainer" containerID="8d0bcc910edbf39872066b8e4cbd48854839d61c031d0d6563cdca4128e80b0b" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.764924 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xqkhk"] Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.767588 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xqkhk"] Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.775096 4715 scope.go:117] "RemoveContainer" containerID="88ba48ad03f3927426b92e850bf8c38b6acfa5076feb46eb783cf7a6184b8b2f" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.785327 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6kpjv"] Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.791892 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6kpjv"] Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.795634 4715 scope.go:117] "RemoveContainer" containerID="85ced30e17aadf4d7bfbd0d2948c385586622483cbc166902b93f3374a478114" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.799972 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5njnt"] Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.803318 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5njnt"] Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.815471 4715 scope.go:117] "RemoveContainer" 
containerID="ca226250f6e8d32c2d578041e357084df7d6f07d373c89e1e85f8c8886ed1061" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.822225 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hzbn9"] Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.824553 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hzbn9"] Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.828674 4715 scope.go:117] "RemoveContainer" containerID="8bc7e98303a37386f38b308d46945b4d4e2702ff4ae3a09783fb2ea83ef8a388" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.861432 4715 scope.go:117] "RemoveContainer" containerID="8d15dd648f2957c53eb0f27a4aee79622f359088743a80716816ee672033c9a3" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.878532 4715 scope.go:117] "RemoveContainer" containerID="82de1add9382f5967971a4d58b3b56863a31d7aa54143cc80fb657399f530fa1" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.897386 4715 scope.go:117] "RemoveContainer" containerID="2035ba782adda00452929c08cd4d85be1179f0fc7ffbe7d0882cc125801e87a0" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.912897 4715 scope.go:117] "RemoveContainer" containerID="35b065ce82f1810b009f356ebe77a8c2532fc494e9912891bfae08163e3a0d5c" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.924131 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-q5ck7"] Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.931135 4715 scope.go:117] "RemoveContainer" containerID="dce4bd73a6b71b68dfe32b20d46ef06afc024e40d582aa447f26c7d21c37af2c" Oct 09 07:50:59 crc kubenswrapper[4715]: I1009 07:50:59.979316 4715 scope.go:117] "RemoveContainer" containerID="e24f7b935f43b984724c31319f2feed9313d37aee23fafe6b7c2e7d44c118b66" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.012648 4715 scope.go:117] "RemoveContainer" 
containerID="35b065ce82f1810b009f356ebe77a8c2532fc494e9912891bfae08163e3a0d5c" Oct 09 07:51:00 crc kubenswrapper[4715]: E1009 07:51:00.015060 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35b065ce82f1810b009f356ebe77a8c2532fc494e9912891bfae08163e3a0d5c\": container with ID starting with 35b065ce82f1810b009f356ebe77a8c2532fc494e9912891bfae08163e3a0d5c not found: ID does not exist" containerID="35b065ce82f1810b009f356ebe77a8c2532fc494e9912891bfae08163e3a0d5c" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.015110 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35b065ce82f1810b009f356ebe77a8c2532fc494e9912891bfae08163e3a0d5c"} err="failed to get container status \"35b065ce82f1810b009f356ebe77a8c2532fc494e9912891bfae08163e3a0d5c\": rpc error: code = NotFound desc = could not find container \"35b065ce82f1810b009f356ebe77a8c2532fc494e9912891bfae08163e3a0d5c\": container with ID starting with 35b065ce82f1810b009f356ebe77a8c2532fc494e9912891bfae08163e3a0d5c not found: ID does not exist" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.015141 4715 scope.go:117] "RemoveContainer" containerID="dce4bd73a6b71b68dfe32b20d46ef06afc024e40d582aa447f26c7d21c37af2c" Oct 09 07:51:00 crc kubenswrapper[4715]: E1009 07:51:00.015437 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dce4bd73a6b71b68dfe32b20d46ef06afc024e40d582aa447f26c7d21c37af2c\": container with ID starting with dce4bd73a6b71b68dfe32b20d46ef06afc024e40d582aa447f26c7d21c37af2c not found: ID does not exist" containerID="dce4bd73a6b71b68dfe32b20d46ef06afc024e40d582aa447f26c7d21c37af2c" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.015470 4715 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"dce4bd73a6b71b68dfe32b20d46ef06afc024e40d582aa447f26c7d21c37af2c"} err="failed to get container status \"dce4bd73a6b71b68dfe32b20d46ef06afc024e40d582aa447f26c7d21c37af2c\": rpc error: code = NotFound desc = could not find container \"dce4bd73a6b71b68dfe32b20d46ef06afc024e40d582aa447f26c7d21c37af2c\": container with ID starting with dce4bd73a6b71b68dfe32b20d46ef06afc024e40d582aa447f26c7d21c37af2c not found: ID does not exist" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.015486 4715 scope.go:117] "RemoveContainer" containerID="e24f7b935f43b984724c31319f2feed9313d37aee23fafe6b7c2e7d44c118b66" Oct 09 07:51:00 crc kubenswrapper[4715]: E1009 07:51:00.015921 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e24f7b935f43b984724c31319f2feed9313d37aee23fafe6b7c2e7d44c118b66\": container with ID starting with e24f7b935f43b984724c31319f2feed9313d37aee23fafe6b7c2e7d44c118b66 not found: ID does not exist" containerID="e24f7b935f43b984724c31319f2feed9313d37aee23fafe6b7c2e7d44c118b66" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.015971 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e24f7b935f43b984724c31319f2feed9313d37aee23fafe6b7c2e7d44c118b66"} err="failed to get container status \"e24f7b935f43b984724c31319f2feed9313d37aee23fafe6b7c2e7d44c118b66\": rpc error: code = NotFound desc = could not find container \"e24f7b935f43b984724c31319f2feed9313d37aee23fafe6b7c2e7d44c118b66\": container with ID starting with e24f7b935f43b984724c31319f2feed9313d37aee23fafe6b7c2e7d44c118b66 not found: ID does not exist" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.152355 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e13129f-063f-400f-b483-537273d66d74" path="/var/lib/kubelet/pods/4e13129f-063f-400f-b483-537273d66d74/volumes" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 
07:51:00.153053 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77b33330-3f7f-4bae-96c1-6558143109f2" path="/var/lib/kubelet/pods/77b33330-3f7f-4bae-96c1-6558143109f2/volumes" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.153726 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ba28cf2-dac1-47a3-9efe-727e793c7afd" path="/var/lib/kubelet/pods/8ba28cf2-dac1-47a3-9efe-727e793c7afd/volumes" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.154938 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9040c858-1f6b-4900-b34e-c8b0b0c4c1ec" path="/var/lib/kubelet/pods/9040c858-1f6b-4900-b34e-c8b0b0c4c1ec/volumes" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.155466 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcafd988-9f86-4c66-8fa7-ead624b101a0" path="/var/lib/kubelet/pods/fcafd988-9f86-4c66-8fa7-ead624b101a0/volumes" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.664618 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9vpkg"] Oct 09 07:51:00 crc kubenswrapper[4715]: E1009 07:51:00.665148 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcafd988-9f86-4c66-8fa7-ead624b101a0" containerName="registry-server" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.665162 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcafd988-9f86-4c66-8fa7-ead624b101a0" containerName="registry-server" Oct 09 07:51:00 crc kubenswrapper[4715]: E1009 07:51:00.665175 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e13129f-063f-400f-b483-537273d66d74" containerName="registry-server" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.665188 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e13129f-063f-400f-b483-537273d66d74" containerName="registry-server" Oct 09 07:51:00 crc kubenswrapper[4715]: E1009 07:51:00.665201 4715 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77b33330-3f7f-4bae-96c1-6558143109f2" containerName="extract-content" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.665210 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="77b33330-3f7f-4bae-96c1-6558143109f2" containerName="extract-content" Oct 09 07:51:00 crc kubenswrapper[4715]: E1009 07:51:00.665223 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e13129f-063f-400f-b483-537273d66d74" containerName="extract-utilities" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.665231 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e13129f-063f-400f-b483-537273d66d74" containerName="extract-utilities" Oct 09 07:51:00 crc kubenswrapper[4715]: E1009 07:51:00.665240 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77b33330-3f7f-4bae-96c1-6558143109f2" containerName="registry-server" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.665246 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="77b33330-3f7f-4bae-96c1-6558143109f2" containerName="registry-server" Oct 09 07:51:00 crc kubenswrapper[4715]: E1009 07:51:00.665257 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9040c858-1f6b-4900-b34e-c8b0b0c4c1ec" containerName="marketplace-operator" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.665263 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="9040c858-1f6b-4900-b34e-c8b0b0c4c1ec" containerName="marketplace-operator" Oct 09 07:51:00 crc kubenswrapper[4715]: E1009 07:51:00.665270 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ba28cf2-dac1-47a3-9efe-727e793c7afd" containerName="registry-server" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.665276 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ba28cf2-dac1-47a3-9efe-727e793c7afd" containerName="registry-server" Oct 09 07:51:00 crc kubenswrapper[4715]: E1009 07:51:00.665285 4715 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcafd988-9f86-4c66-8fa7-ead624b101a0" containerName="extract-content" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.665290 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcafd988-9f86-4c66-8fa7-ead624b101a0" containerName="extract-content" Oct 09 07:51:00 crc kubenswrapper[4715]: E1009 07:51:00.665300 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ba28cf2-dac1-47a3-9efe-727e793c7afd" containerName="extract-content" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.665305 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ba28cf2-dac1-47a3-9efe-727e793c7afd" containerName="extract-content" Oct 09 07:51:00 crc kubenswrapper[4715]: E1009 07:51:00.665313 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77b33330-3f7f-4bae-96c1-6558143109f2" containerName="extract-utilities" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.665319 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="77b33330-3f7f-4bae-96c1-6558143109f2" containerName="extract-utilities" Oct 09 07:51:00 crc kubenswrapper[4715]: E1009 07:51:00.665326 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcafd988-9f86-4c66-8fa7-ead624b101a0" containerName="extract-utilities" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.665332 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcafd988-9f86-4c66-8fa7-ead624b101a0" containerName="extract-utilities" Oct 09 07:51:00 crc kubenswrapper[4715]: E1009 07:51:00.665338 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e13129f-063f-400f-b483-537273d66d74" containerName="extract-content" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.665344 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e13129f-063f-400f-b483-537273d66d74" containerName="extract-content" Oct 09 07:51:00 crc kubenswrapper[4715]: E1009 07:51:00.665352 4715 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ba28cf2-dac1-47a3-9efe-727e793c7afd" containerName="extract-utilities" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.665358 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ba28cf2-dac1-47a3-9efe-727e793c7afd" containerName="extract-utilities" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.665527 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="9040c858-1f6b-4900-b34e-c8b0b0c4c1ec" containerName="marketplace-operator" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.665540 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcafd988-9f86-4c66-8fa7-ead624b101a0" containerName="registry-server" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.665548 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e13129f-063f-400f-b483-537273d66d74" containerName="registry-server" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.665560 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="77b33330-3f7f-4bae-96c1-6558143109f2" containerName="registry-server" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.665568 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ba28cf2-dac1-47a3-9efe-727e793c7afd" containerName="registry-server" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.666306 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9vpkg" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.668989 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.677893 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9vpkg"] Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.784701 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26a05949-7b09-4412-a6ae-004009c0c4bf-catalog-content\") pod \"redhat-marketplace-9vpkg\" (UID: \"26a05949-7b09-4412-a6ae-004009c0c4bf\") " pod="openshift-marketplace/redhat-marketplace-9vpkg" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.784755 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wzqn\" (UniqueName: \"kubernetes.io/projected/26a05949-7b09-4412-a6ae-004009c0c4bf-kube-api-access-4wzqn\") pod \"redhat-marketplace-9vpkg\" (UID: \"26a05949-7b09-4412-a6ae-004009c0c4bf\") " pod="openshift-marketplace/redhat-marketplace-9vpkg" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.784782 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26a05949-7b09-4412-a6ae-004009c0c4bf-utilities\") pod \"redhat-marketplace-9vpkg\" (UID: \"26a05949-7b09-4412-a6ae-004009c0c4bf\") " pod="openshift-marketplace/redhat-marketplace-9vpkg" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.858445 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6j97m"] Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.859572 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6j97m" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.861358 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.866869 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6j97m"] Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.885966 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26a05949-7b09-4412-a6ae-004009c0c4bf-utilities\") pod \"redhat-marketplace-9vpkg\" (UID: \"26a05949-7b09-4412-a6ae-004009c0c4bf\") " pod="openshift-marketplace/redhat-marketplace-9vpkg" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.886048 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26a05949-7b09-4412-a6ae-004009c0c4bf-catalog-content\") pod \"redhat-marketplace-9vpkg\" (UID: \"26a05949-7b09-4412-a6ae-004009c0c4bf\") " pod="openshift-marketplace/redhat-marketplace-9vpkg" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.886071 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wzqn\" (UniqueName: \"kubernetes.io/projected/26a05949-7b09-4412-a6ae-004009c0c4bf-kube-api-access-4wzqn\") pod \"redhat-marketplace-9vpkg\" (UID: \"26a05949-7b09-4412-a6ae-004009c0c4bf\") " pod="openshift-marketplace/redhat-marketplace-9vpkg" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.886731 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26a05949-7b09-4412-a6ae-004009c0c4bf-utilities\") pod \"redhat-marketplace-9vpkg\" (UID: \"26a05949-7b09-4412-a6ae-004009c0c4bf\") " 
pod="openshift-marketplace/redhat-marketplace-9vpkg" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.886937 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26a05949-7b09-4412-a6ae-004009c0c4bf-catalog-content\") pod \"redhat-marketplace-9vpkg\" (UID: \"26a05949-7b09-4412-a6ae-004009c0c4bf\") " pod="openshift-marketplace/redhat-marketplace-9vpkg" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.914351 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wzqn\" (UniqueName: \"kubernetes.io/projected/26a05949-7b09-4412-a6ae-004009c0c4bf-kube-api-access-4wzqn\") pod \"redhat-marketplace-9vpkg\" (UID: \"26a05949-7b09-4412-a6ae-004009c0c4bf\") " pod="openshift-marketplace/redhat-marketplace-9vpkg" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.987004 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c8e4f51-54cf-4545-8a89-8ccaf52c55fc-utilities\") pod \"community-operators-6j97m\" (UID: \"1c8e4f51-54cf-4545-8a89-8ccaf52c55fc\") " pod="openshift-marketplace/community-operators-6j97m" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.987060 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c2fr\" (UniqueName: \"kubernetes.io/projected/1c8e4f51-54cf-4545-8a89-8ccaf52c55fc-kube-api-access-5c2fr\") pod \"community-operators-6j97m\" (UID: \"1c8e4f51-54cf-4545-8a89-8ccaf52c55fc\") " pod="openshift-marketplace/community-operators-6j97m" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.987069 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9vpkg" Oct 09 07:51:00 crc kubenswrapper[4715]: I1009 07:51:00.987099 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c8e4f51-54cf-4545-8a89-8ccaf52c55fc-catalog-content\") pod \"community-operators-6j97m\" (UID: \"1c8e4f51-54cf-4545-8a89-8ccaf52c55fc\") " pod="openshift-marketplace/community-operators-6j97m" Oct 09 07:51:01 crc kubenswrapper[4715]: I1009 07:51:01.089387 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c8e4f51-54cf-4545-8a89-8ccaf52c55fc-utilities\") pod \"community-operators-6j97m\" (UID: \"1c8e4f51-54cf-4545-8a89-8ccaf52c55fc\") " pod="openshift-marketplace/community-operators-6j97m" Oct 09 07:51:01 crc kubenswrapper[4715]: I1009 07:51:01.089966 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c2fr\" (UniqueName: \"kubernetes.io/projected/1c8e4f51-54cf-4545-8a89-8ccaf52c55fc-kube-api-access-5c2fr\") pod \"community-operators-6j97m\" (UID: \"1c8e4f51-54cf-4545-8a89-8ccaf52c55fc\") " pod="openshift-marketplace/community-operators-6j97m" Oct 09 07:51:01 crc kubenswrapper[4715]: I1009 07:51:01.090005 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c8e4f51-54cf-4545-8a89-8ccaf52c55fc-catalog-content\") pod \"community-operators-6j97m\" (UID: \"1c8e4f51-54cf-4545-8a89-8ccaf52c55fc\") " pod="openshift-marketplace/community-operators-6j97m" Oct 09 07:51:01 crc kubenswrapper[4715]: I1009 07:51:01.090346 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c8e4f51-54cf-4545-8a89-8ccaf52c55fc-utilities\") pod \"community-operators-6j97m\" (UID: 
\"1c8e4f51-54cf-4545-8a89-8ccaf52c55fc\") " pod="openshift-marketplace/community-operators-6j97m" Oct 09 07:51:01 crc kubenswrapper[4715]: I1009 07:51:01.090833 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c8e4f51-54cf-4545-8a89-8ccaf52c55fc-catalog-content\") pod \"community-operators-6j97m\" (UID: \"1c8e4f51-54cf-4545-8a89-8ccaf52c55fc\") " pod="openshift-marketplace/community-operators-6j97m" Oct 09 07:51:01 crc kubenswrapper[4715]: I1009 07:51:01.112248 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c2fr\" (UniqueName: \"kubernetes.io/projected/1c8e4f51-54cf-4545-8a89-8ccaf52c55fc-kube-api-access-5c2fr\") pod \"community-operators-6j97m\" (UID: \"1c8e4f51-54cf-4545-8a89-8ccaf52c55fc\") " pod="openshift-marketplace/community-operators-6j97m" Oct 09 07:51:01 crc kubenswrapper[4715]: I1009 07:51:01.205907 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6j97m" Oct 09 07:51:01 crc kubenswrapper[4715]: I1009 07:51:01.383718 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6j97m"] Oct 09 07:51:01 crc kubenswrapper[4715]: W1009 07:51:01.395373 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c8e4f51_54cf_4545_8a89_8ccaf52c55fc.slice/crio-7116c5454446a17c6aaa39f2c2b56bd55bfbad467990125b083fba6b5cff7d66 WatchSource:0}: Error finding container 7116c5454446a17c6aaa39f2c2b56bd55bfbad467990125b083fba6b5cff7d66: Status 404 returned error can't find the container with id 7116c5454446a17c6aaa39f2c2b56bd55bfbad467990125b083fba6b5cff7d66 Oct 09 07:51:01 crc kubenswrapper[4715]: I1009 07:51:01.409726 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9vpkg"] Oct 09 07:51:01 crc kubenswrapper[4715]: I1009 07:51:01.691320 4715 generic.go:334] "Generic (PLEG): container finished" podID="1c8e4f51-54cf-4545-8a89-8ccaf52c55fc" containerID="d9f05b82877e8b873eac609ef086647d7f9266df3f4a0fba3b278b53387e0daa" exitCode=0 Oct 09 07:51:01 crc kubenswrapper[4715]: I1009 07:51:01.691450 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6j97m" event={"ID":"1c8e4f51-54cf-4545-8a89-8ccaf52c55fc","Type":"ContainerDied","Data":"d9f05b82877e8b873eac609ef086647d7f9266df3f4a0fba3b278b53387e0daa"} Oct 09 07:51:01 crc kubenswrapper[4715]: I1009 07:51:01.691879 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6j97m" event={"ID":"1c8e4f51-54cf-4545-8a89-8ccaf52c55fc","Type":"ContainerStarted","Data":"7116c5454446a17c6aaa39f2c2b56bd55bfbad467990125b083fba6b5cff7d66"} Oct 09 07:51:01 crc kubenswrapper[4715]: I1009 07:51:01.694898 4715 generic.go:334] "Generic (PLEG): container finished" 
podID="26a05949-7b09-4412-a6ae-004009c0c4bf" containerID="a61b371e7e68585ab1231a850ea1bd5cba60210c353a23f2fb860e2e28915057" exitCode=0 Oct 09 07:51:01 crc kubenswrapper[4715]: I1009 07:51:01.696631 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vpkg" event={"ID":"26a05949-7b09-4412-a6ae-004009c0c4bf","Type":"ContainerDied","Data":"a61b371e7e68585ab1231a850ea1bd5cba60210c353a23f2fb860e2e28915057"} Oct 09 07:51:01 crc kubenswrapper[4715]: I1009 07:51:01.696671 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vpkg" event={"ID":"26a05949-7b09-4412-a6ae-004009c0c4bf","Type":"ContainerStarted","Data":"e0c9890727536df2e55f59f7688bcb85c2eef777d06ea5eef02f3fb34dd93bf7"} Oct 09 07:51:02 crc kubenswrapper[4715]: I1009 07:51:02.706080 4715 generic.go:334] "Generic (PLEG): container finished" podID="26a05949-7b09-4412-a6ae-004009c0c4bf" containerID="ada271034126f1d054b58163b0c9dabf4999f5a85af93bd8c32a8494a72ec5a4" exitCode=0 Oct 09 07:51:02 crc kubenswrapper[4715]: I1009 07:51:02.706384 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vpkg" event={"ID":"26a05949-7b09-4412-a6ae-004009c0c4bf","Type":"ContainerDied","Data":"ada271034126f1d054b58163b0c9dabf4999f5a85af93bd8c32a8494a72ec5a4"} Oct 09 07:51:03 crc kubenswrapper[4715]: I1009 07:51:03.058635 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-g5kq2"] Oct 09 07:51:03 crc kubenswrapper[4715]: I1009 07:51:03.059899 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g5kq2" Oct 09 07:51:03 crc kubenswrapper[4715]: I1009 07:51:03.061824 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 09 07:51:03 crc kubenswrapper[4715]: I1009 07:51:03.072918 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g5kq2"] Oct 09 07:51:03 crc kubenswrapper[4715]: I1009 07:51:03.231521 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr2tk\" (UniqueName: \"kubernetes.io/projected/efd04f5e-635d-422b-ae2a-38096e0ecc44-kube-api-access-mr2tk\") pod \"redhat-operators-g5kq2\" (UID: \"efd04f5e-635d-422b-ae2a-38096e0ecc44\") " pod="openshift-marketplace/redhat-operators-g5kq2" Oct 09 07:51:03 crc kubenswrapper[4715]: I1009 07:51:03.231898 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efd04f5e-635d-422b-ae2a-38096e0ecc44-catalog-content\") pod \"redhat-operators-g5kq2\" (UID: \"efd04f5e-635d-422b-ae2a-38096e0ecc44\") " pod="openshift-marketplace/redhat-operators-g5kq2" Oct 09 07:51:03 crc kubenswrapper[4715]: I1009 07:51:03.231925 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efd04f5e-635d-422b-ae2a-38096e0ecc44-utilities\") pod \"redhat-operators-g5kq2\" (UID: \"efd04f5e-635d-422b-ae2a-38096e0ecc44\") " pod="openshift-marketplace/redhat-operators-g5kq2" Oct 09 07:51:03 crc kubenswrapper[4715]: I1009 07:51:03.256002 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-86kz5"] Oct 09 07:51:03 crc kubenswrapper[4715]: I1009 07:51:03.257021 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-86kz5" Oct 09 07:51:03 crc kubenswrapper[4715]: I1009 07:51:03.262099 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 09 07:51:03 crc kubenswrapper[4715]: I1009 07:51:03.269451 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-86kz5"] Oct 09 07:51:03 crc kubenswrapper[4715]: I1009 07:51:03.332645 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr2tk\" (UniqueName: \"kubernetes.io/projected/efd04f5e-635d-422b-ae2a-38096e0ecc44-kube-api-access-mr2tk\") pod \"redhat-operators-g5kq2\" (UID: \"efd04f5e-635d-422b-ae2a-38096e0ecc44\") " pod="openshift-marketplace/redhat-operators-g5kq2" Oct 09 07:51:03 crc kubenswrapper[4715]: I1009 07:51:03.332702 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efd04f5e-635d-422b-ae2a-38096e0ecc44-catalog-content\") pod \"redhat-operators-g5kq2\" (UID: \"efd04f5e-635d-422b-ae2a-38096e0ecc44\") " pod="openshift-marketplace/redhat-operators-g5kq2" Oct 09 07:51:03 crc kubenswrapper[4715]: I1009 07:51:03.332738 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efd04f5e-635d-422b-ae2a-38096e0ecc44-utilities\") pod \"redhat-operators-g5kq2\" (UID: \"efd04f5e-635d-422b-ae2a-38096e0ecc44\") " pod="openshift-marketplace/redhat-operators-g5kq2" Oct 09 07:51:03 crc kubenswrapper[4715]: I1009 07:51:03.333767 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efd04f5e-635d-422b-ae2a-38096e0ecc44-utilities\") pod \"redhat-operators-g5kq2\" (UID: \"efd04f5e-635d-422b-ae2a-38096e0ecc44\") " pod="openshift-marketplace/redhat-operators-g5kq2" Oct 09 
07:51:03 crc kubenswrapper[4715]: I1009 07:51:03.334154 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efd04f5e-635d-422b-ae2a-38096e0ecc44-catalog-content\") pod \"redhat-operators-g5kq2\" (UID: \"efd04f5e-635d-422b-ae2a-38096e0ecc44\") " pod="openshift-marketplace/redhat-operators-g5kq2" Oct 09 07:51:03 crc kubenswrapper[4715]: I1009 07:51:03.354868 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr2tk\" (UniqueName: \"kubernetes.io/projected/efd04f5e-635d-422b-ae2a-38096e0ecc44-kube-api-access-mr2tk\") pod \"redhat-operators-g5kq2\" (UID: \"efd04f5e-635d-422b-ae2a-38096e0ecc44\") " pod="openshift-marketplace/redhat-operators-g5kq2" Oct 09 07:51:03 crc kubenswrapper[4715]: I1009 07:51:03.387911 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g5kq2" Oct 09 07:51:03 crc kubenswrapper[4715]: I1009 07:51:03.434662 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/944f4e67-23a1-4024-a5c2-180f17dea29c-utilities\") pod \"certified-operators-86kz5\" (UID: \"944f4e67-23a1-4024-a5c2-180f17dea29c\") " pod="openshift-marketplace/certified-operators-86kz5" Oct 09 07:51:03 crc kubenswrapper[4715]: I1009 07:51:03.434725 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr5ks\" (UniqueName: \"kubernetes.io/projected/944f4e67-23a1-4024-a5c2-180f17dea29c-kube-api-access-lr5ks\") pod \"certified-operators-86kz5\" (UID: \"944f4e67-23a1-4024-a5c2-180f17dea29c\") " pod="openshift-marketplace/certified-operators-86kz5" Oct 09 07:51:03 crc kubenswrapper[4715]: I1009 07:51:03.434812 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/944f4e67-23a1-4024-a5c2-180f17dea29c-catalog-content\") pod \"certified-operators-86kz5\" (UID: \"944f4e67-23a1-4024-a5c2-180f17dea29c\") " pod="openshift-marketplace/certified-operators-86kz5" Oct 09 07:51:03 crc kubenswrapper[4715]: I1009 07:51:03.536575 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/944f4e67-23a1-4024-a5c2-180f17dea29c-catalog-content\") pod \"certified-operators-86kz5\" (UID: \"944f4e67-23a1-4024-a5c2-180f17dea29c\") " pod="openshift-marketplace/certified-operators-86kz5" Oct 09 07:51:03 crc kubenswrapper[4715]: I1009 07:51:03.537073 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/944f4e67-23a1-4024-a5c2-180f17dea29c-utilities\") pod \"certified-operators-86kz5\" (UID: \"944f4e67-23a1-4024-a5c2-180f17dea29c\") " pod="openshift-marketplace/certified-operators-86kz5" Oct 09 07:51:03 crc kubenswrapper[4715]: I1009 07:51:03.537106 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr5ks\" (UniqueName: \"kubernetes.io/projected/944f4e67-23a1-4024-a5c2-180f17dea29c-kube-api-access-lr5ks\") pod \"certified-operators-86kz5\" (UID: \"944f4e67-23a1-4024-a5c2-180f17dea29c\") " pod="openshift-marketplace/certified-operators-86kz5" Oct 09 07:51:03 crc kubenswrapper[4715]: I1009 07:51:03.538340 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/944f4e67-23a1-4024-a5c2-180f17dea29c-catalog-content\") pod \"certified-operators-86kz5\" (UID: \"944f4e67-23a1-4024-a5c2-180f17dea29c\") " pod="openshift-marketplace/certified-operators-86kz5" Oct 09 07:51:03 crc kubenswrapper[4715]: I1009 07:51:03.538615 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/944f4e67-23a1-4024-a5c2-180f17dea29c-utilities\") pod \"certified-operators-86kz5\" (UID: \"944f4e67-23a1-4024-a5c2-180f17dea29c\") " pod="openshift-marketplace/certified-operators-86kz5" Oct 09 07:51:03 crc kubenswrapper[4715]: I1009 07:51:03.558288 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr5ks\" (UniqueName: \"kubernetes.io/projected/944f4e67-23a1-4024-a5c2-180f17dea29c-kube-api-access-lr5ks\") pod \"certified-operators-86kz5\" (UID: \"944f4e67-23a1-4024-a5c2-180f17dea29c\") " pod="openshift-marketplace/certified-operators-86kz5" Oct 09 07:51:03 crc kubenswrapper[4715]: I1009 07:51:03.608576 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g5kq2"] Oct 09 07:51:03 crc kubenswrapper[4715]: I1009 07:51:03.632350 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-86kz5" Oct 09 07:51:03 crc kubenswrapper[4715]: I1009 07:51:03.735158 4715 generic.go:334] "Generic (PLEG): container finished" podID="1c8e4f51-54cf-4545-8a89-8ccaf52c55fc" containerID="a93a1798efc23c7ed731e7b0d49af56fc72b8918f96a842b7be9ea6c5453c1fc" exitCode=0 Oct 09 07:51:03 crc kubenswrapper[4715]: I1009 07:51:03.737301 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6j97m" event={"ID":"1c8e4f51-54cf-4545-8a89-8ccaf52c55fc","Type":"ContainerDied","Data":"a93a1798efc23c7ed731e7b0d49af56fc72b8918f96a842b7be9ea6c5453c1fc"} Oct 09 07:51:03 crc kubenswrapper[4715]: I1009 07:51:03.747014 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vpkg" event={"ID":"26a05949-7b09-4412-a6ae-004009c0c4bf","Type":"ContainerStarted","Data":"7d621fa3c48a28400ebc7000eb6c20469aa688a8f04129045485ea0428d65553"} Oct 09 07:51:03 crc kubenswrapper[4715]: I1009 07:51:03.748983 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-g5kq2" event={"ID":"efd04f5e-635d-422b-ae2a-38096e0ecc44","Type":"ContainerStarted","Data":"a2bcee56b1a8f95e8de7ef34c00ef4a0a04cc080bafc810c13a5a7f4cd797835"} Oct 09 07:51:03 crc kubenswrapper[4715]: I1009 07:51:03.799106 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9vpkg" podStartSLOduration=2.240626099 podStartE2EDuration="3.799076515s" podCreationTimestamp="2025-10-09 07:51:00 +0000 UTC" firstStartedPulling="2025-10-09 07:51:01.698297153 +0000 UTC m=+292.391101161" lastFinishedPulling="2025-10-09 07:51:03.256747569 +0000 UTC m=+293.949551577" observedRunningTime="2025-10-09 07:51:03.794484734 +0000 UTC m=+294.487288742" watchObservedRunningTime="2025-10-09 07:51:03.799076515 +0000 UTC m=+294.491880523" Oct 09 07:51:03 crc kubenswrapper[4715]: I1009 07:51:03.872709 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-86kz5"] Oct 09 07:51:03 crc kubenswrapper[4715]: W1009 07:51:03.887737 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod944f4e67_23a1_4024_a5c2_180f17dea29c.slice/crio-2ec7b4979f4f0f1c4ba2801b0c36dc86bcd3db9b8bf16d1a62d1597e283d2c72 WatchSource:0}: Error finding container 2ec7b4979f4f0f1c4ba2801b0c36dc86bcd3db9b8bf16d1a62d1597e283d2c72: Status 404 returned error can't find the container with id 2ec7b4979f4f0f1c4ba2801b0c36dc86bcd3db9b8bf16d1a62d1597e283d2c72 Oct 09 07:51:04 crc kubenswrapper[4715]: I1009 07:51:04.757795 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6j97m" event={"ID":"1c8e4f51-54cf-4545-8a89-8ccaf52c55fc","Type":"ContainerStarted","Data":"b3d8ad4835baf1af8b4d766c63e500089c218c2e67d9a3abdda9cefbce2e1be8"} Oct 09 07:51:04 crc kubenswrapper[4715]: I1009 07:51:04.759207 4715 generic.go:334] "Generic (PLEG): container finished" 
podID="944f4e67-23a1-4024-a5c2-180f17dea29c" containerID="ecaa3d2f8b3604e4635ce0ac9c8517076073357ae4d0b4dabfdae9c4ad8eda60" exitCode=0 Oct 09 07:51:04 crc kubenswrapper[4715]: I1009 07:51:04.759268 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-86kz5" event={"ID":"944f4e67-23a1-4024-a5c2-180f17dea29c","Type":"ContainerDied","Data":"ecaa3d2f8b3604e4635ce0ac9c8517076073357ae4d0b4dabfdae9c4ad8eda60"} Oct 09 07:51:04 crc kubenswrapper[4715]: I1009 07:51:04.759286 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-86kz5" event={"ID":"944f4e67-23a1-4024-a5c2-180f17dea29c","Type":"ContainerStarted","Data":"2ec7b4979f4f0f1c4ba2801b0c36dc86bcd3db9b8bf16d1a62d1597e283d2c72"} Oct 09 07:51:04 crc kubenswrapper[4715]: I1009 07:51:04.760661 4715 generic.go:334] "Generic (PLEG): container finished" podID="efd04f5e-635d-422b-ae2a-38096e0ecc44" containerID="3f27260f19f3329cc1b1e2b3610ac836166816aebddd2cccb756b8c9db8bcef3" exitCode=0 Oct 09 07:51:04 crc kubenswrapper[4715]: I1009 07:51:04.760736 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g5kq2" event={"ID":"efd04f5e-635d-422b-ae2a-38096e0ecc44","Type":"ContainerDied","Data":"3f27260f19f3329cc1b1e2b3610ac836166816aebddd2cccb756b8c9db8bcef3"} Oct 09 07:51:04 crc kubenswrapper[4715]: I1009 07:51:04.775784 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6j97m" podStartSLOduration=2.186171129 podStartE2EDuration="4.775756835s" podCreationTimestamp="2025-10-09 07:51:00 +0000 UTC" firstStartedPulling="2025-10-09 07:51:01.693777334 +0000 UTC m=+292.386581342" lastFinishedPulling="2025-10-09 07:51:04.28336304 +0000 UTC m=+294.976167048" observedRunningTime="2025-10-09 07:51:04.774017361 +0000 UTC m=+295.466821369" watchObservedRunningTime="2025-10-09 07:51:04.775756835 +0000 UTC m=+295.468560843" Oct 09 07:51:07 
crc kubenswrapper[4715]: I1009 07:51:07.783311 4715 generic.go:334] "Generic (PLEG): container finished" podID="944f4e67-23a1-4024-a5c2-180f17dea29c" containerID="a0ef7662c49030a85f868b105fa072cbe092e359f381ff693d403884cdd9bfb1" exitCode=0 Oct 09 07:51:07 crc kubenswrapper[4715]: I1009 07:51:07.783980 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-86kz5" event={"ID":"944f4e67-23a1-4024-a5c2-180f17dea29c","Type":"ContainerDied","Data":"a0ef7662c49030a85f868b105fa072cbe092e359f381ff693d403884cdd9bfb1"} Oct 09 07:51:07 crc kubenswrapper[4715]: I1009 07:51:07.786986 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g5kq2" event={"ID":"efd04f5e-635d-422b-ae2a-38096e0ecc44","Type":"ContainerStarted","Data":"e4055b2ec15832224dfe5a0f4f5d25e586facec07bfd4056288af7ba7338c4ae"} Oct 09 07:51:08 crc kubenswrapper[4715]: I1009 07:51:08.803130 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-86kz5" event={"ID":"944f4e67-23a1-4024-a5c2-180f17dea29c","Type":"ContainerStarted","Data":"0a88e0c4892ff3ca11c179f9744e78563a89bbe6e22f2a920618fe54e40cbc8e"} Oct 09 07:51:08 crc kubenswrapper[4715]: I1009 07:51:08.805348 4715 generic.go:334] "Generic (PLEG): container finished" podID="efd04f5e-635d-422b-ae2a-38096e0ecc44" containerID="e4055b2ec15832224dfe5a0f4f5d25e586facec07bfd4056288af7ba7338c4ae" exitCode=0 Oct 09 07:51:08 crc kubenswrapper[4715]: I1009 07:51:08.805378 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g5kq2" event={"ID":"efd04f5e-635d-422b-ae2a-38096e0ecc44","Type":"ContainerDied","Data":"e4055b2ec15832224dfe5a0f4f5d25e586facec07bfd4056288af7ba7338c4ae"} Oct 09 07:51:09 crc kubenswrapper[4715]: I1009 07:51:09.827954 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-86kz5" 
podStartSLOduration=2.966695617 podStartE2EDuration="6.827925392s" podCreationTimestamp="2025-10-09 07:51:03 +0000 UTC" firstStartedPulling="2025-10-09 07:51:04.761013552 +0000 UTC m=+295.453817560" lastFinishedPulling="2025-10-09 07:51:08.622243317 +0000 UTC m=+299.315047335" observedRunningTime="2025-10-09 07:51:09.827733856 +0000 UTC m=+300.520537864" watchObservedRunningTime="2025-10-09 07:51:09.827925392 +0000 UTC m=+300.520729400" Oct 09 07:51:10 crc kubenswrapper[4715]: I1009 07:51:10.129954 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-rl4r4" Oct 09 07:51:10 crc kubenswrapper[4715]: I1009 07:51:10.191371 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-txj6v"] Oct 09 07:51:10 crc kubenswrapper[4715]: I1009 07:51:10.816650 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g5kq2" event={"ID":"efd04f5e-635d-422b-ae2a-38096e0ecc44","Type":"ContainerStarted","Data":"791b40899898c52ae90dc98e205ddcc05efe048f1e1b60f06cc0850a33c1fb86"} Oct 09 07:51:10 crc kubenswrapper[4715]: I1009 07:51:10.838381 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-g5kq2" podStartSLOduration=2.954522143 podStartE2EDuration="7.838354736s" podCreationTimestamp="2025-10-09 07:51:03 +0000 UTC" firstStartedPulling="2025-10-09 07:51:04.761815887 +0000 UTC m=+295.454619895" lastFinishedPulling="2025-10-09 07:51:09.64564848 +0000 UTC m=+300.338452488" observedRunningTime="2025-10-09 07:51:10.837200611 +0000 UTC m=+301.530004619" watchObservedRunningTime="2025-10-09 07:51:10.838354736 +0000 UTC m=+301.531158744" Oct 09 07:51:10 crc kubenswrapper[4715]: I1009 07:51:10.988204 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9vpkg" Oct 09 07:51:10 crc kubenswrapper[4715]: 
I1009 07:51:10.988258 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9vpkg" Oct 09 07:51:11 crc kubenswrapper[4715]: I1009 07:51:11.031274 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9vpkg" Oct 09 07:51:11 crc kubenswrapper[4715]: I1009 07:51:11.207609 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6j97m" Oct 09 07:51:11 crc kubenswrapper[4715]: I1009 07:51:11.208038 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6j97m" Oct 09 07:51:11 crc kubenswrapper[4715]: I1009 07:51:11.257323 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6j97m" Oct 09 07:51:11 crc kubenswrapper[4715]: I1009 07:51:11.861286 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9vpkg" Oct 09 07:51:11 crc kubenswrapper[4715]: I1009 07:51:11.863089 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6j97m" Oct 09 07:51:13 crc kubenswrapper[4715]: I1009 07:51:13.388715 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-g5kq2" Oct 09 07:51:13 crc kubenswrapper[4715]: I1009 07:51:13.389081 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-g5kq2" Oct 09 07:51:13 crc kubenswrapper[4715]: I1009 07:51:13.633147 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-86kz5" Oct 09 07:51:13 crc kubenswrapper[4715]: I1009 07:51:13.633222 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-86kz5" Oct 09 07:51:13 crc kubenswrapper[4715]: I1009 07:51:13.671282 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-86kz5" Oct 09 07:51:13 crc kubenswrapper[4715]: I1009 07:51:13.870437 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-86kz5" Oct 09 07:51:14 crc kubenswrapper[4715]: I1009 07:51:14.430160 4715 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-g5kq2" podUID="efd04f5e-635d-422b-ae2a-38096e0ecc44" containerName="registry-server" probeResult="failure" output=< Oct 09 07:51:14 crc kubenswrapper[4715]: timeout: failed to connect service ":50051" within 1s Oct 09 07:51:14 crc kubenswrapper[4715]: > Oct 09 07:51:23 crc kubenswrapper[4715]: I1009 07:51:23.453575 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-g5kq2" Oct 09 07:51:23 crc kubenswrapper[4715]: I1009 07:51:23.509348 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-g5kq2" Oct 09 07:51:24 crc kubenswrapper[4715]: I1009 07:51:24.972722 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" podUID="b24ee722-8046-4655-a354-4a25a9b16b6a" containerName="oauth-openshift" containerID="cri-o://05d59aff999dd0ffe788c77e9aa8159ca10b4c47090b11f9b8d124f62c16d491" gracePeriod=15 Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.374349 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.424062 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj"] Oct 09 07:51:25 crc kubenswrapper[4715]: E1009 07:51:25.424334 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b24ee722-8046-4655-a354-4a25a9b16b6a" containerName="oauth-openshift" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.424348 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="b24ee722-8046-4655-a354-4a25a9b16b6a" containerName="oauth-openshift" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.424526 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="b24ee722-8046-4655-a354-4a25a9b16b6a" containerName="oauth-openshift" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.425002 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.454363 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj"] Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.481876 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkfcp\" (UniqueName: \"kubernetes.io/projected/b24ee722-8046-4655-a354-4a25a9b16b6a-kube-api-access-tkfcp\") pod \"b24ee722-8046-4655-a354-4a25a9b16b6a\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.481930 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-system-serving-cert\") pod \"b24ee722-8046-4655-a354-4a25a9b16b6a\" (UID: 
\"b24ee722-8046-4655-a354-4a25a9b16b6a\") " Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.481960 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-system-session\") pod \"b24ee722-8046-4655-a354-4a25a9b16b6a\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.481980 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-user-template-error\") pod \"b24ee722-8046-4655-a354-4a25a9b16b6a\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.482033 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-system-trusted-ca-bundle\") pod \"b24ee722-8046-4655-a354-4a25a9b16b6a\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.482058 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-system-router-certs\") pod \"b24ee722-8046-4655-a354-4a25a9b16b6a\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.482095 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-user-template-provider-selection\") pod \"b24ee722-8046-4655-a354-4a25a9b16b6a\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " 
Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.482122 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-system-service-ca\") pod \"b24ee722-8046-4655-a354-4a25a9b16b6a\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.482166 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b24ee722-8046-4655-a354-4a25a9b16b6a-audit-policies\") pod \"b24ee722-8046-4655-a354-4a25a9b16b6a\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.482190 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-system-cliconfig\") pod \"b24ee722-8046-4655-a354-4a25a9b16b6a\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.482215 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-user-template-login\") pod \"b24ee722-8046-4655-a354-4a25a9b16b6a\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.482244 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-user-idp-0-file-data\") pod \"b24ee722-8046-4655-a354-4a25a9b16b6a\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.482270 4715 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-system-ocp-branding-template\") pod \"b24ee722-8046-4655-a354-4a25a9b16b6a\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.482289 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b24ee722-8046-4655-a354-4a25a9b16b6a-audit-dir\") pod \"b24ee722-8046-4655-a354-4a25a9b16b6a\" (UID: \"b24ee722-8046-4655-a354-4a25a9b16b6a\") " Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.482545 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b24ee722-8046-4655-a354-4a25a9b16b6a-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "b24ee722-8046-4655-a354-4a25a9b16b6a" (UID: "b24ee722-8046-4655-a354-4a25a9b16b6a"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.483081 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "b24ee722-8046-4655-a354-4a25a9b16b6a" (UID: "b24ee722-8046-4655-a354-4a25a9b16b6a"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.483454 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "b24ee722-8046-4655-a354-4a25a9b16b6a" (UID: "b24ee722-8046-4655-a354-4a25a9b16b6a"). 
InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.483491 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "b24ee722-8046-4655-a354-4a25a9b16b6a" (UID: "b24ee722-8046-4655-a354-4a25a9b16b6a"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.483886 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b24ee722-8046-4655-a354-4a25a9b16b6a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "b24ee722-8046-4655-a354-4a25a9b16b6a" (UID: "b24ee722-8046-4655-a354-4a25a9b16b6a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.490544 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "b24ee722-8046-4655-a354-4a25a9b16b6a" (UID: "b24ee722-8046-4655-a354-4a25a9b16b6a"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.495589 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "b24ee722-8046-4655-a354-4a25a9b16b6a" (UID: "b24ee722-8046-4655-a354-4a25a9b16b6a"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.497008 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "b24ee722-8046-4655-a354-4a25a9b16b6a" (UID: "b24ee722-8046-4655-a354-4a25a9b16b6a"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.497235 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b24ee722-8046-4655-a354-4a25a9b16b6a-kube-api-access-tkfcp" (OuterVolumeSpecName: "kube-api-access-tkfcp") pod "b24ee722-8046-4655-a354-4a25a9b16b6a" (UID: "b24ee722-8046-4655-a354-4a25a9b16b6a"). InnerVolumeSpecName "kube-api-access-tkfcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.497238 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "b24ee722-8046-4655-a354-4a25a9b16b6a" (UID: "b24ee722-8046-4655-a354-4a25a9b16b6a"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.505751 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "b24ee722-8046-4655-a354-4a25a9b16b6a" (UID: "b24ee722-8046-4655-a354-4a25a9b16b6a"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.509926 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "b24ee722-8046-4655-a354-4a25a9b16b6a" (UID: "b24ee722-8046-4655-a354-4a25a9b16b6a"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.510300 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "b24ee722-8046-4655-a354-4a25a9b16b6a" (UID: "b24ee722-8046-4655-a354-4a25a9b16b6a"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.510508 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "b24ee722-8046-4655-a354-4a25a9b16b6a" (UID: "b24ee722-8046-4655-a354-4a25a9b16b6a"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.583515 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bfeba4f5-a026-474b-bfd8-0149da263cd9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7dbc47cf4b-gkctj\" (UID: \"bfeba4f5-a026-474b-bfd8-0149da263cd9\") " pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.583608 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bfeba4f5-a026-474b-bfd8-0149da263cd9-v4-0-config-user-template-login\") pod \"oauth-openshift-7dbc47cf4b-gkctj\" (UID: \"bfeba4f5-a026-474b-bfd8-0149da263cd9\") " pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.583635 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bfeba4f5-a026-474b-bfd8-0149da263cd9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7dbc47cf4b-gkctj\" (UID: \"bfeba4f5-a026-474b-bfd8-0149da263cd9\") " pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.583676 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bfeba4f5-a026-474b-bfd8-0149da263cd9-audit-dir\") pod \"oauth-openshift-7dbc47cf4b-gkctj\" (UID: \"bfeba4f5-a026-474b-bfd8-0149da263cd9\") " pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.583862 4715 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bfeba4f5-a026-474b-bfd8-0149da263cd9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7dbc47cf4b-gkctj\" (UID: \"bfeba4f5-a026-474b-bfd8-0149da263cd9\") " pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.583981 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bfeba4f5-a026-474b-bfd8-0149da263cd9-v4-0-config-system-service-ca\") pod \"oauth-openshift-7dbc47cf4b-gkctj\" (UID: \"bfeba4f5-a026-474b-bfd8-0149da263cd9\") " pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.584130 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bfeba4f5-a026-474b-bfd8-0149da263cd9-v4-0-config-system-session\") pod \"oauth-openshift-7dbc47cf4b-gkctj\" (UID: \"bfeba4f5-a026-474b-bfd8-0149da263cd9\") " pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.584172 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bfeba4f5-a026-474b-bfd8-0149da263cd9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7dbc47cf4b-gkctj\" (UID: \"bfeba4f5-a026-474b-bfd8-0149da263cd9\") " pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.584193 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/bfeba4f5-a026-474b-bfd8-0149da263cd9-v4-0-config-system-router-certs\") pod \"oauth-openshift-7dbc47cf4b-gkctj\" (UID: \"bfeba4f5-a026-474b-bfd8-0149da263cd9\") " pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.584213 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfeba4f5-a026-474b-bfd8-0149da263cd9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7dbc47cf4b-gkctj\" (UID: \"bfeba4f5-a026-474b-bfd8-0149da263cd9\") " pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.584283 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bfeba4f5-a026-474b-bfd8-0149da263cd9-audit-policies\") pod \"oauth-openshift-7dbc47cf4b-gkctj\" (UID: \"bfeba4f5-a026-474b-bfd8-0149da263cd9\") " pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.584359 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x7b2\" (UniqueName: \"kubernetes.io/projected/bfeba4f5-a026-474b-bfd8-0149da263cd9-kube-api-access-2x7b2\") pod \"oauth-openshift-7dbc47cf4b-gkctj\" (UID: \"bfeba4f5-a026-474b-bfd8-0149da263cd9\") " pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.584532 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bfeba4f5-a026-474b-bfd8-0149da263cd9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7dbc47cf4b-gkctj\" (UID: \"bfeba4f5-a026-474b-bfd8-0149da263cd9\") 
" pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.584592 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bfeba4f5-a026-474b-bfd8-0149da263cd9-v4-0-config-user-template-error\") pod \"oauth-openshift-7dbc47cf4b-gkctj\" (UID: \"bfeba4f5-a026-474b-bfd8-0149da263cd9\") " pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.584804 4715 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b24ee722-8046-4655-a354-4a25a9b16b6a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.584838 4715 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.584853 4715 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.584866 4715 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.584878 4715 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-system-ocp-branding-template\") on node \"crc\" 
DevicePath \"\"" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.584889 4715 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b24ee722-8046-4655-a354-4a25a9b16b6a-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.584900 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkfcp\" (UniqueName: \"kubernetes.io/projected/b24ee722-8046-4655-a354-4a25a9b16b6a-kube-api-access-tkfcp\") on node \"crc\" DevicePath \"\"" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.584913 4715 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.584925 4715 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.584935 4715 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.584949 4715 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.584960 4715 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.584972 4715 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.584985 4715 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b24ee722-8046-4655-a354-4a25a9b16b6a-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.686589 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfeba4f5-a026-474b-bfd8-0149da263cd9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7dbc47cf4b-gkctj\" (UID: \"bfeba4f5-a026-474b-bfd8-0149da263cd9\") " pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.686644 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bfeba4f5-a026-474b-bfd8-0149da263cd9-v4-0-config-system-session\") pod \"oauth-openshift-7dbc47cf4b-gkctj\" (UID: \"bfeba4f5-a026-474b-bfd8-0149da263cd9\") " pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.686662 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bfeba4f5-a026-474b-bfd8-0149da263cd9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7dbc47cf4b-gkctj\" (UID: 
\"bfeba4f5-a026-474b-bfd8-0149da263cd9\") " pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.686684 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bfeba4f5-a026-474b-bfd8-0149da263cd9-v4-0-config-system-router-certs\") pod \"oauth-openshift-7dbc47cf4b-gkctj\" (UID: \"bfeba4f5-a026-474b-bfd8-0149da263cd9\") " pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.686719 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bfeba4f5-a026-474b-bfd8-0149da263cd9-audit-policies\") pod \"oauth-openshift-7dbc47cf4b-gkctj\" (UID: \"bfeba4f5-a026-474b-bfd8-0149da263cd9\") " pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.686744 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x7b2\" (UniqueName: \"kubernetes.io/projected/bfeba4f5-a026-474b-bfd8-0149da263cd9-kube-api-access-2x7b2\") pod \"oauth-openshift-7dbc47cf4b-gkctj\" (UID: \"bfeba4f5-a026-474b-bfd8-0149da263cd9\") " pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.686771 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bfeba4f5-a026-474b-bfd8-0149da263cd9-v4-0-config-user-template-error\") pod \"oauth-openshift-7dbc47cf4b-gkctj\" (UID: \"bfeba4f5-a026-474b-bfd8-0149da263cd9\") " pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.686785 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bfeba4f5-a026-474b-bfd8-0149da263cd9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7dbc47cf4b-gkctj\" (UID: \"bfeba4f5-a026-474b-bfd8-0149da263cd9\") " pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.686809 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bfeba4f5-a026-474b-bfd8-0149da263cd9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7dbc47cf4b-gkctj\" (UID: \"bfeba4f5-a026-474b-bfd8-0149da263cd9\") " pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.686838 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bfeba4f5-a026-474b-bfd8-0149da263cd9-v4-0-config-user-template-login\") pod \"oauth-openshift-7dbc47cf4b-gkctj\" (UID: \"bfeba4f5-a026-474b-bfd8-0149da263cd9\") " pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.686855 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bfeba4f5-a026-474b-bfd8-0149da263cd9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7dbc47cf4b-gkctj\" (UID: \"bfeba4f5-a026-474b-bfd8-0149da263cd9\") " pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.686880 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bfeba4f5-a026-474b-bfd8-0149da263cd9-audit-dir\") pod \"oauth-openshift-7dbc47cf4b-gkctj\" (UID: \"bfeba4f5-a026-474b-bfd8-0149da263cd9\") " 
pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.686898 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bfeba4f5-a026-474b-bfd8-0149da263cd9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7dbc47cf4b-gkctj\" (UID: \"bfeba4f5-a026-474b-bfd8-0149da263cd9\") " pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.686917 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bfeba4f5-a026-474b-bfd8-0149da263cd9-v4-0-config-system-service-ca\") pod \"oauth-openshift-7dbc47cf4b-gkctj\" (UID: \"bfeba4f5-a026-474b-bfd8-0149da263cd9\") " pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.688053 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bfeba4f5-a026-474b-bfd8-0149da263cd9-v4-0-config-system-service-ca\") pod \"oauth-openshift-7dbc47cf4b-gkctj\" (UID: \"bfeba4f5-a026-474b-bfd8-0149da263cd9\") " pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.688530 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfeba4f5-a026-474b-bfd8-0149da263cd9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7dbc47cf4b-gkctj\" (UID: \"bfeba4f5-a026-474b-bfd8-0149da263cd9\") " pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.690059 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bfeba4f5-a026-474b-bfd8-0149da263cd9-audit-policies\") pod \"oauth-openshift-7dbc47cf4b-gkctj\" (UID: \"bfeba4f5-a026-474b-bfd8-0149da263cd9\") " pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.690911 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bfeba4f5-a026-474b-bfd8-0149da263cd9-v4-0-config-system-session\") pod \"oauth-openshift-7dbc47cf4b-gkctj\" (UID: \"bfeba4f5-a026-474b-bfd8-0149da263cd9\") " pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.690913 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bfeba4f5-a026-474b-bfd8-0149da263cd9-v4-0-config-system-router-certs\") pod \"oauth-openshift-7dbc47cf4b-gkctj\" (UID: \"bfeba4f5-a026-474b-bfd8-0149da263cd9\") " pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.691022 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bfeba4f5-a026-474b-bfd8-0149da263cd9-audit-dir\") pod \"oauth-openshift-7dbc47cf4b-gkctj\" (UID: \"bfeba4f5-a026-474b-bfd8-0149da263cd9\") " pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.691538 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bfeba4f5-a026-474b-bfd8-0149da263cd9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7dbc47cf4b-gkctj\" (UID: \"bfeba4f5-a026-474b-bfd8-0149da263cd9\") " pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: 
I1009 07:51:25.691732 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bfeba4f5-a026-474b-bfd8-0149da263cd9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7dbc47cf4b-gkctj\" (UID: \"bfeba4f5-a026-474b-bfd8-0149da263cd9\") " pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.692128 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bfeba4f5-a026-474b-bfd8-0149da263cd9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7dbc47cf4b-gkctj\" (UID: \"bfeba4f5-a026-474b-bfd8-0149da263cd9\") " pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.696061 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bfeba4f5-a026-474b-bfd8-0149da263cd9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7dbc47cf4b-gkctj\" (UID: \"bfeba4f5-a026-474b-bfd8-0149da263cd9\") " pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.696221 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bfeba4f5-a026-474b-bfd8-0149da263cd9-v4-0-config-user-template-login\") pod \"oauth-openshift-7dbc47cf4b-gkctj\" (UID: \"bfeba4f5-a026-474b-bfd8-0149da263cd9\") " pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.696372 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/bfeba4f5-a026-474b-bfd8-0149da263cd9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7dbc47cf4b-gkctj\" (UID: \"bfeba4f5-a026-474b-bfd8-0149da263cd9\") " pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.698658 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bfeba4f5-a026-474b-bfd8-0149da263cd9-v4-0-config-user-template-error\") pod \"oauth-openshift-7dbc47cf4b-gkctj\" (UID: \"bfeba4f5-a026-474b-bfd8-0149da263cd9\") " pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.704951 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x7b2\" (UniqueName: \"kubernetes.io/projected/bfeba4f5-a026-474b-bfd8-0149da263cd9-kube-api-access-2x7b2\") pod \"oauth-openshift-7dbc47cf4b-gkctj\" (UID: \"bfeba4f5-a026-474b-bfd8-0149da263cd9\") " pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.739352 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.903768 4715 generic.go:334] "Generic (PLEG): container finished" podID="b24ee722-8046-4655-a354-4a25a9b16b6a" containerID="05d59aff999dd0ffe788c77e9aa8159ca10b4c47090b11f9b8d124f62c16d491" exitCode=0 Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.903966 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" event={"ID":"b24ee722-8046-4655-a354-4a25a9b16b6a","Type":"ContainerDied","Data":"05d59aff999dd0ffe788c77e9aa8159ca10b4c47090b11f9b8d124f62c16d491"} Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.904168 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" event={"ID":"b24ee722-8046-4655-a354-4a25a9b16b6a","Type":"ContainerDied","Data":"d1635420df7b0c74e79b0dd52c405e76917a494a39be592acb58e20267556b73"} Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.904103 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-q5ck7" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.904201 4715 scope.go:117] "RemoveContainer" containerID="05d59aff999dd0ffe788c77e9aa8159ca10b4c47090b11f9b8d124f62c16d491" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.951950 4715 scope.go:117] "RemoveContainer" containerID="05d59aff999dd0ffe788c77e9aa8159ca10b4c47090b11f9b8d124f62c16d491" Oct 09 07:51:25 crc kubenswrapper[4715]: E1009 07:51:25.953953 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05d59aff999dd0ffe788c77e9aa8159ca10b4c47090b11f9b8d124f62c16d491\": container with ID starting with 05d59aff999dd0ffe788c77e9aa8159ca10b4c47090b11f9b8d124f62c16d491 not found: ID does not exist" containerID="05d59aff999dd0ffe788c77e9aa8159ca10b4c47090b11f9b8d124f62c16d491" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.954009 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05d59aff999dd0ffe788c77e9aa8159ca10b4c47090b11f9b8d124f62c16d491"} err="failed to get container status \"05d59aff999dd0ffe788c77e9aa8159ca10b4c47090b11f9b8d124f62c16d491\": rpc error: code = NotFound desc = could not find container \"05d59aff999dd0ffe788c77e9aa8159ca10b4c47090b11f9b8d124f62c16d491\": container with ID starting with 05d59aff999dd0ffe788c77e9aa8159ca10b4c47090b11f9b8d124f62c16d491 not found: ID does not exist" Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.972900 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-q5ck7"] Oct 09 07:51:25 crc kubenswrapper[4715]: I1009 07:51:25.987105 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-q5ck7"] Oct 09 07:51:26 crc kubenswrapper[4715]: I1009 07:51:26.145403 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b24ee722-8046-4655-a354-4a25a9b16b6a" path="/var/lib/kubelet/pods/b24ee722-8046-4655-a354-4a25a9b16b6a/volumes" Oct 09 07:51:26 crc kubenswrapper[4715]: I1009 07:51:26.200401 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj"] Oct 09 07:51:26 crc kubenswrapper[4715]: I1009 07:51:26.912672 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" event={"ID":"bfeba4f5-a026-474b-bfd8-0149da263cd9","Type":"ContainerStarted","Data":"24b8cbb18f654c9dd35532b8ccae8529339b70459c3b570e210714c781dd6f9a"} Oct 09 07:51:26 crc kubenswrapper[4715]: I1009 07:51:26.913132 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" event={"ID":"bfeba4f5-a026-474b-bfd8-0149da263cd9","Type":"ContainerStarted","Data":"2544eb31b8f3de2e264a20f9ef2e28d4388d3926525a28bdd50b5262374a2842"} Oct 09 07:51:26 crc kubenswrapper[4715]: I1009 07:51:26.913492 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:26 crc kubenswrapper[4715]: I1009 07:51:26.919978 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" Oct 09 07:51:26 crc kubenswrapper[4715]: I1009 07:51:26.932261 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7dbc47cf4b-gkctj" podStartSLOduration=27.932245528 podStartE2EDuration="27.932245528s" podCreationTimestamp="2025-10-09 07:50:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:51:26.930781083 +0000 UTC m=+317.623585091" watchObservedRunningTime="2025-10-09 07:51:26.932245528 +0000 UTC m=+317.625049536" Oct 09 07:51:35 crc kubenswrapper[4715]: 
I1009 07:51:35.290210 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" podUID="915f6370-d5b2-4c9e-a1b1-c3146612b3ce" containerName="registry" containerID="cri-o://2f7edcb33eafe2e6058db98cb4fa8af63c0f54dfdef938cb82ee47e42789d67c" gracePeriod=30 Oct 09 07:51:35 crc kubenswrapper[4715]: I1009 07:51:35.705070 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:51:35 crc kubenswrapper[4715]: I1009 07:51:35.842241 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/915f6370-d5b2-4c9e-a1b1-c3146612b3ce-installation-pull-secrets\") pod \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " Oct 09 07:51:35 crc kubenswrapper[4715]: I1009 07:51:35.842303 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/915f6370-d5b2-4c9e-a1b1-c3146612b3ce-trusted-ca\") pod \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " Oct 09 07:51:35 crc kubenswrapper[4715]: I1009 07:51:35.842334 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/915f6370-d5b2-4c9e-a1b1-c3146612b3ce-registry-certificates\") pod \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " Oct 09 07:51:35 crc kubenswrapper[4715]: I1009 07:51:35.842625 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " 
Oct 09 07:51:35 crc kubenswrapper[4715]: I1009 07:51:35.842693 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/915f6370-d5b2-4c9e-a1b1-c3146612b3ce-registry-tls\") pod \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " Oct 09 07:51:35 crc kubenswrapper[4715]: I1009 07:51:35.842711 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/915f6370-d5b2-4c9e-a1b1-c3146612b3ce-ca-trust-extracted\") pod \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " Oct 09 07:51:35 crc kubenswrapper[4715]: I1009 07:51:35.842757 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wkk9\" (UniqueName: \"kubernetes.io/projected/915f6370-d5b2-4c9e-a1b1-c3146612b3ce-kube-api-access-4wkk9\") pod \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " Oct 09 07:51:35 crc kubenswrapper[4715]: I1009 07:51:35.842815 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/915f6370-d5b2-4c9e-a1b1-c3146612b3ce-bound-sa-token\") pod \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\" (UID: \"915f6370-d5b2-4c9e-a1b1-c3146612b3ce\") " Oct 09 07:51:35 crc kubenswrapper[4715]: I1009 07:51:35.843716 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/915f6370-d5b2-4c9e-a1b1-c3146612b3ce-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "915f6370-d5b2-4c9e-a1b1-c3146612b3ce" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:51:35 crc kubenswrapper[4715]: I1009 07:51:35.844646 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/915f6370-d5b2-4c9e-a1b1-c3146612b3ce-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "915f6370-d5b2-4c9e-a1b1-c3146612b3ce" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:51:35 crc kubenswrapper[4715]: I1009 07:51:35.850195 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/915f6370-d5b2-4c9e-a1b1-c3146612b3ce-kube-api-access-4wkk9" (OuterVolumeSpecName: "kube-api-access-4wkk9") pod "915f6370-d5b2-4c9e-a1b1-c3146612b3ce" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce"). InnerVolumeSpecName "kube-api-access-4wkk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:51:35 crc kubenswrapper[4715]: I1009 07:51:35.850221 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/915f6370-d5b2-4c9e-a1b1-c3146612b3ce-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "915f6370-d5b2-4c9e-a1b1-c3146612b3ce" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:51:35 crc kubenswrapper[4715]: I1009 07:51:35.851126 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/915f6370-d5b2-4c9e-a1b1-c3146612b3ce-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "915f6370-d5b2-4c9e-a1b1-c3146612b3ce" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:51:35 crc kubenswrapper[4715]: I1009 07:51:35.853024 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/915f6370-d5b2-4c9e-a1b1-c3146612b3ce-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "915f6370-d5b2-4c9e-a1b1-c3146612b3ce" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:51:35 crc kubenswrapper[4715]: I1009 07:51:35.861906 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/915f6370-d5b2-4c9e-a1b1-c3146612b3ce-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "915f6370-d5b2-4c9e-a1b1-c3146612b3ce" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 07:51:35 crc kubenswrapper[4715]: I1009 07:51:35.862190 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "915f6370-d5b2-4c9e-a1b1-c3146612b3ce" (UID: "915f6370-d5b2-4c9e-a1b1-c3146612b3ce"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 09 07:51:35 crc kubenswrapper[4715]: I1009 07:51:35.943875 4715 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/915f6370-d5b2-4c9e-a1b1-c3146612b3ce-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 09 07:51:35 crc kubenswrapper[4715]: I1009 07:51:35.943916 4715 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/915f6370-d5b2-4c9e-a1b1-c3146612b3ce-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 09 07:51:35 crc kubenswrapper[4715]: I1009 07:51:35.943929 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wkk9\" (UniqueName: \"kubernetes.io/projected/915f6370-d5b2-4c9e-a1b1-c3146612b3ce-kube-api-access-4wkk9\") on node \"crc\" DevicePath \"\"" Oct 09 07:51:35 crc kubenswrapper[4715]: I1009 07:51:35.943944 4715 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/915f6370-d5b2-4c9e-a1b1-c3146612b3ce-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 09 07:51:35 crc kubenswrapper[4715]: I1009 07:51:35.943956 4715 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/915f6370-d5b2-4c9e-a1b1-c3146612b3ce-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 09 07:51:35 crc kubenswrapper[4715]: I1009 07:51:35.943968 4715 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/915f6370-d5b2-4c9e-a1b1-c3146612b3ce-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 09 07:51:35 crc kubenswrapper[4715]: I1009 07:51:35.943982 4715 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/915f6370-d5b2-4c9e-a1b1-c3146612b3ce-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 09 07:51:35 crc 
kubenswrapper[4715]: I1009 07:51:35.968568 4715 generic.go:334] "Generic (PLEG): container finished" podID="915f6370-d5b2-4c9e-a1b1-c3146612b3ce" containerID="2f7edcb33eafe2e6058db98cb4fa8af63c0f54dfdef938cb82ee47e42789d67c" exitCode=0 Oct 09 07:51:35 crc kubenswrapper[4715]: I1009 07:51:35.968641 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" event={"ID":"915f6370-d5b2-4c9e-a1b1-c3146612b3ce","Type":"ContainerDied","Data":"2f7edcb33eafe2e6058db98cb4fa8af63c0f54dfdef938cb82ee47e42789d67c"} Oct 09 07:51:35 crc kubenswrapper[4715]: I1009 07:51:35.968656 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" Oct 09 07:51:35 crc kubenswrapper[4715]: I1009 07:51:35.968696 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-txj6v" event={"ID":"915f6370-d5b2-4c9e-a1b1-c3146612b3ce","Type":"ContainerDied","Data":"c91ec32804e7dd9724bc0abc250bfbd6877343c051977727a5ecc3e6ec7a7d9f"} Oct 09 07:51:35 crc kubenswrapper[4715]: I1009 07:51:35.968729 4715 scope.go:117] "RemoveContainer" containerID="2f7edcb33eafe2e6058db98cb4fa8af63c0f54dfdef938cb82ee47e42789d67c" Oct 09 07:51:35 crc kubenswrapper[4715]: I1009 07:51:35.989171 4715 scope.go:117] "RemoveContainer" containerID="2f7edcb33eafe2e6058db98cb4fa8af63c0f54dfdef938cb82ee47e42789d67c" Oct 09 07:51:35 crc kubenswrapper[4715]: E1009 07:51:35.989601 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f7edcb33eafe2e6058db98cb4fa8af63c0f54dfdef938cb82ee47e42789d67c\": container with ID starting with 2f7edcb33eafe2e6058db98cb4fa8af63c0f54dfdef938cb82ee47e42789d67c not found: ID does not exist" containerID="2f7edcb33eafe2e6058db98cb4fa8af63c0f54dfdef938cb82ee47e42789d67c" Oct 09 07:51:35 crc kubenswrapper[4715]: I1009 07:51:35.989634 4715 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f7edcb33eafe2e6058db98cb4fa8af63c0f54dfdef938cb82ee47e42789d67c"} err="failed to get container status \"2f7edcb33eafe2e6058db98cb4fa8af63c0f54dfdef938cb82ee47e42789d67c\": rpc error: code = NotFound desc = could not find container \"2f7edcb33eafe2e6058db98cb4fa8af63c0f54dfdef938cb82ee47e42789d67c\": container with ID starting with 2f7edcb33eafe2e6058db98cb4fa8af63c0f54dfdef938cb82ee47e42789d67c not found: ID does not exist" Oct 09 07:51:35 crc kubenswrapper[4715]: I1009 07:51:35.996796 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-txj6v"] Oct 09 07:51:36 crc kubenswrapper[4715]: I1009 07:51:36.000604 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-txj6v"] Oct 09 07:51:36 crc kubenswrapper[4715]: I1009 07:51:36.145180 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="915f6370-d5b2-4c9e-a1b1-c3146612b3ce" path="/var/lib/kubelet/pods/915f6370-d5b2-4c9e-a1b1-c3146612b3ce/volumes" Oct 09 07:52:16 crc kubenswrapper[4715]: I1009 07:52:16.753994 4715 patch_prober.go:28] interesting pod/machine-config-daemon-k7vwx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 07:52:16 crc kubenswrapper[4715]: I1009 07:52:16.754733 4715 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 07:52:46 crc kubenswrapper[4715]: I1009 07:52:46.754013 4715 patch_prober.go:28] interesting 
pod/machine-config-daemon-k7vwx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 07:52:46 crc kubenswrapper[4715]: I1009 07:52:46.754793 4715 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 07:53:16 crc kubenswrapper[4715]: I1009 07:53:16.754039 4715 patch_prober.go:28] interesting pod/machine-config-daemon-k7vwx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 07:53:16 crc kubenswrapper[4715]: I1009 07:53:16.754795 4715 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 07:53:16 crc kubenswrapper[4715]: I1009 07:53:16.754855 4715 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" Oct 09 07:53:16 crc kubenswrapper[4715]: I1009 07:53:16.755470 4715 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b010400ee7dba57a3343bec5cd3be68030f4519bc1714b489d54ec14a33cc803"} pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Oct 09 07:53:16 crc kubenswrapper[4715]: I1009 07:53:16.755546 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" containerID="cri-o://b010400ee7dba57a3343bec5cd3be68030f4519bc1714b489d54ec14a33cc803" gracePeriod=600 Oct 09 07:53:17 crc kubenswrapper[4715]: I1009 07:53:17.583828 4715 generic.go:334] "Generic (PLEG): container finished" podID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerID="b010400ee7dba57a3343bec5cd3be68030f4519bc1714b489d54ec14a33cc803" exitCode=0 Oct 09 07:53:17 crc kubenswrapper[4715]: I1009 07:53:17.583915 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" event={"ID":"acafd807-8875-4b4f-aba9-4f807ca336e7","Type":"ContainerDied","Data":"b010400ee7dba57a3343bec5cd3be68030f4519bc1714b489d54ec14a33cc803"} Oct 09 07:53:17 crc kubenswrapper[4715]: I1009 07:53:17.583981 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" event={"ID":"acafd807-8875-4b4f-aba9-4f807ca336e7","Type":"ContainerStarted","Data":"a6cfe3d63903269fab164da1154df39ba0aa750858dad3414bb1690252e4ef7d"} Oct 09 07:53:17 crc kubenswrapper[4715]: I1009 07:53:17.584020 4715 scope.go:117] "RemoveContainer" containerID="4eab9be18db2c21136a797167f3282bba0639147e04085d9c930fe113cd5bc94" Oct 09 07:55:46 crc kubenswrapper[4715]: I1009 07:55:46.753537 4715 patch_prober.go:28] interesting pod/machine-config-daemon-k7vwx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 07:55:46 crc kubenswrapper[4715]: I1009 07:55:46.754197 4715 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 07:56:16 crc kubenswrapper[4715]: I1009 07:56:16.753653 4715 patch_prober.go:28] interesting pod/machine-config-daemon-k7vwx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 07:56:16 crc kubenswrapper[4715]: I1009 07:56:16.754493 4715 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 07:56:31 crc kubenswrapper[4715]: I1009 07:56:31.496758 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-4sgtl"] Oct 09 07:56:31 crc kubenswrapper[4715]: E1009 07:56:31.497600 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="915f6370-d5b2-4c9e-a1b1-c3146612b3ce" containerName="registry" Oct 09 07:56:31 crc kubenswrapper[4715]: I1009 07:56:31.497614 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="915f6370-d5b2-4c9e-a1b1-c3146612b3ce" containerName="registry" Oct 09 07:56:31 crc kubenswrapper[4715]: I1009 07:56:31.497705 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="915f6370-d5b2-4c9e-a1b1-c3146612b3ce" containerName="registry" Oct 09 07:56:31 crc kubenswrapper[4715]: I1009 07:56:31.498101 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-4sgtl" Oct 09 07:56:31 crc kubenswrapper[4715]: I1009 07:56:31.500533 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-vxwbp"] Oct 09 07:56:31 crc kubenswrapper[4715]: I1009 07:56:31.501163 4715 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-4dd7v" Oct 09 07:56:31 crc kubenswrapper[4715]: I1009 07:56:31.501168 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 09 07:56:31 crc kubenswrapper[4715]: I1009 07:56:31.501407 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-vxwbp" Oct 09 07:56:31 crc kubenswrapper[4715]: I1009 07:56:31.505219 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 09 07:56:31 crc kubenswrapper[4715]: I1009 07:56:31.507040 4715 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-58tvb" Oct 09 07:56:31 crc kubenswrapper[4715]: I1009 07:56:31.515357 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-4sgtl"] Oct 09 07:56:31 crc kubenswrapper[4715]: I1009 07:56:31.519865 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-vxwbp"] Oct 09 07:56:31 crc kubenswrapper[4715]: I1009 07:56:31.552314 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-27vx6"] Oct 09 07:56:31 crc kubenswrapper[4715]: I1009 07:56:31.553174 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-27vx6" Oct 09 07:56:31 crc kubenswrapper[4715]: I1009 07:56:31.555569 4715 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-pjjhj" Oct 09 07:56:31 crc kubenswrapper[4715]: I1009 07:56:31.557846 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-27vx6"] Oct 09 07:56:31 crc kubenswrapper[4715]: I1009 07:56:31.605147 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzm2p\" (UniqueName: \"kubernetes.io/projected/b46c8c22-ef65-4617-90a9-bcef0954a010-kube-api-access-bzm2p\") pod \"cert-manager-5b446d88c5-vxwbp\" (UID: \"b46c8c22-ef65-4617-90a9-bcef0954a010\") " pod="cert-manager/cert-manager-5b446d88c5-vxwbp" Oct 09 07:56:31 crc kubenswrapper[4715]: I1009 07:56:31.605334 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8pdh\" (UniqueName: \"kubernetes.io/projected/9c7c75d0-8444-4edb-b653-5bc079b11d51-kube-api-access-q8pdh\") pod \"cert-manager-cainjector-7f985d654d-4sgtl\" (UID: \"9c7c75d0-8444-4edb-b653-5bc079b11d51\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-4sgtl" Oct 09 07:56:31 crc kubenswrapper[4715]: I1009 07:56:31.706975 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqkml\" (UniqueName: \"kubernetes.io/projected/acf4388b-06e6-4576-ae4d-b67ccda0c1ac-kube-api-access-gqkml\") pod \"cert-manager-webhook-5655c58dd6-27vx6\" (UID: \"acf4388b-06e6-4576-ae4d-b67ccda0c1ac\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-27vx6" Oct 09 07:56:31 crc kubenswrapper[4715]: I1009 07:56:31.707344 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8pdh\" (UniqueName: 
\"kubernetes.io/projected/9c7c75d0-8444-4edb-b653-5bc079b11d51-kube-api-access-q8pdh\") pod \"cert-manager-cainjector-7f985d654d-4sgtl\" (UID: \"9c7c75d0-8444-4edb-b653-5bc079b11d51\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-4sgtl" Oct 09 07:56:31 crc kubenswrapper[4715]: I1009 07:56:31.707376 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzm2p\" (UniqueName: \"kubernetes.io/projected/b46c8c22-ef65-4617-90a9-bcef0954a010-kube-api-access-bzm2p\") pod \"cert-manager-5b446d88c5-vxwbp\" (UID: \"b46c8c22-ef65-4617-90a9-bcef0954a010\") " pod="cert-manager/cert-manager-5b446d88c5-vxwbp" Oct 09 07:56:31 crc kubenswrapper[4715]: I1009 07:56:31.733588 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8pdh\" (UniqueName: \"kubernetes.io/projected/9c7c75d0-8444-4edb-b653-5bc079b11d51-kube-api-access-q8pdh\") pod \"cert-manager-cainjector-7f985d654d-4sgtl\" (UID: \"9c7c75d0-8444-4edb-b653-5bc079b11d51\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-4sgtl" Oct 09 07:56:31 crc kubenswrapper[4715]: I1009 07:56:31.735489 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzm2p\" (UniqueName: \"kubernetes.io/projected/b46c8c22-ef65-4617-90a9-bcef0954a010-kube-api-access-bzm2p\") pod \"cert-manager-5b446d88c5-vxwbp\" (UID: \"b46c8c22-ef65-4617-90a9-bcef0954a010\") " pod="cert-manager/cert-manager-5b446d88c5-vxwbp" Oct 09 07:56:31 crc kubenswrapper[4715]: I1009 07:56:31.808732 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqkml\" (UniqueName: \"kubernetes.io/projected/acf4388b-06e6-4576-ae4d-b67ccda0c1ac-kube-api-access-gqkml\") pod \"cert-manager-webhook-5655c58dd6-27vx6\" (UID: \"acf4388b-06e6-4576-ae4d-b67ccda0c1ac\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-27vx6" Oct 09 07:56:31 crc kubenswrapper[4715]: I1009 07:56:31.815666 4715 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-4sgtl" Oct 09 07:56:31 crc kubenswrapper[4715]: I1009 07:56:31.823074 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-vxwbp" Oct 09 07:56:31 crc kubenswrapper[4715]: I1009 07:56:31.827513 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqkml\" (UniqueName: \"kubernetes.io/projected/acf4388b-06e6-4576-ae4d-b67ccda0c1ac-kube-api-access-gqkml\") pod \"cert-manager-webhook-5655c58dd6-27vx6\" (UID: \"acf4388b-06e6-4576-ae4d-b67ccda0c1ac\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-27vx6" Oct 09 07:56:31 crc kubenswrapper[4715]: I1009 07:56:31.871321 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-27vx6" Oct 09 07:56:32 crc kubenswrapper[4715]: I1009 07:56:32.013085 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-vxwbp"] Oct 09 07:56:32 crc kubenswrapper[4715]: I1009 07:56:32.030795 4715 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 07:56:32 crc kubenswrapper[4715]: I1009 07:56:32.069735 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-4sgtl"] Oct 09 07:56:32 crc kubenswrapper[4715]: W1009 07:56:32.083611 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c7c75d0_8444_4edb_b653_5bc079b11d51.slice/crio-17c5a6b61323000115e758a388c25d64f3be746976aa70223e2961673d1d2e22 WatchSource:0}: Error finding container 17c5a6b61323000115e758a388c25d64f3be746976aa70223e2961673d1d2e22: Status 404 returned error can't find the container with id 17c5a6b61323000115e758a388c25d64f3be746976aa70223e2961673d1d2e22 Oct 09 07:56:32 crc kubenswrapper[4715]: 
I1009 07:56:32.114656 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-27vx6"] Oct 09 07:56:32 crc kubenswrapper[4715]: W1009 07:56:32.119043 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podacf4388b_06e6_4576_ae4d_b67ccda0c1ac.slice/crio-44a9eb1561db167805eed0df98c11307cec0e7af84599816605818232152873c WatchSource:0}: Error finding container 44a9eb1561db167805eed0df98c11307cec0e7af84599816605818232152873c: Status 404 returned error can't find the container with id 44a9eb1561db167805eed0df98c11307cec0e7af84599816605818232152873c Oct 09 07:56:32 crc kubenswrapper[4715]: I1009 07:56:32.761784 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-27vx6" event={"ID":"acf4388b-06e6-4576-ae4d-b67ccda0c1ac","Type":"ContainerStarted","Data":"44a9eb1561db167805eed0df98c11307cec0e7af84599816605818232152873c"} Oct 09 07:56:32 crc kubenswrapper[4715]: I1009 07:56:32.765256 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-vxwbp" event={"ID":"b46c8c22-ef65-4617-90a9-bcef0954a010","Type":"ContainerStarted","Data":"a3a14929c59eb1173524d01984aa3ecd038f4943b1d1d49d31e59918e09ecd77"} Oct 09 07:56:32 crc kubenswrapper[4715]: I1009 07:56:32.769815 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-4sgtl" event={"ID":"9c7c75d0-8444-4edb-b653-5bc079b11d51","Type":"ContainerStarted","Data":"17c5a6b61323000115e758a388c25d64f3be746976aa70223e2961673d1d2e22"} Oct 09 07:56:35 crc kubenswrapper[4715]: I1009 07:56:35.786270 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-4sgtl" event={"ID":"9c7c75d0-8444-4edb-b653-5bc079b11d51","Type":"ContainerStarted","Data":"df2534ba059b7b97139c2eabeec70f48384d724636a2bbf33a2a1d631e9fe43e"} Oct 09 07:56:35 
crc kubenswrapper[4715]: I1009 07:56:35.788547 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-27vx6" event={"ID":"acf4388b-06e6-4576-ae4d-b67ccda0c1ac","Type":"ContainerStarted","Data":"444e67eb6576182ec9dd5b79d0d16700cd003780e09aeb01e0b74298ad54dfe6"} Oct 09 07:56:35 crc kubenswrapper[4715]: I1009 07:56:35.789048 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-27vx6" Oct 09 07:56:35 crc kubenswrapper[4715]: I1009 07:56:35.809131 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-4sgtl" podStartSLOduration=1.712206568 podStartE2EDuration="4.809108176s" podCreationTimestamp="2025-10-09 07:56:31 +0000 UTC" firstStartedPulling="2025-10-09 07:56:32.090725487 +0000 UTC m=+622.783529505" lastFinishedPulling="2025-10-09 07:56:35.187627105 +0000 UTC m=+625.880431113" observedRunningTime="2025-10-09 07:56:35.805690086 +0000 UTC m=+626.498494104" watchObservedRunningTime="2025-10-09 07:56:35.809108176 +0000 UTC m=+626.501912174" Oct 09 07:56:35 crc kubenswrapper[4715]: I1009 07:56:35.822584 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-27vx6" podStartSLOduration=1.740945246 podStartE2EDuration="4.822561908s" podCreationTimestamp="2025-10-09 07:56:31 +0000 UTC" firstStartedPulling="2025-10-09 07:56:32.120625229 +0000 UTC m=+622.813429237" lastFinishedPulling="2025-10-09 07:56:35.202241891 +0000 UTC m=+625.895045899" observedRunningTime="2025-10-09 07:56:35.819684494 +0000 UTC m=+626.512488502" watchObservedRunningTime="2025-10-09 07:56:35.822561908 +0000 UTC m=+626.515365916" Oct 09 07:56:36 crc kubenswrapper[4715]: I1009 07:56:36.796738 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-vxwbp" 
event={"ID":"b46c8c22-ef65-4617-90a9-bcef0954a010","Type":"ContainerStarted","Data":"803264c16bf3ae356bfd87ef4a1b218cdcf3fb6bdbc192b50f1fcc63fac0cc11"} Oct 09 07:56:36 crc kubenswrapper[4715]: I1009 07:56:36.817450 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-vxwbp" podStartSLOduration=1.557519847 podStartE2EDuration="5.817396194s" podCreationTimestamp="2025-10-09 07:56:31 +0000 UTC" firstStartedPulling="2025-10-09 07:56:32.030525952 +0000 UTC m=+622.723329960" lastFinishedPulling="2025-10-09 07:56:36.290402299 +0000 UTC m=+626.983206307" observedRunningTime="2025-10-09 07:56:36.817352303 +0000 UTC m=+627.510156341" watchObservedRunningTime="2025-10-09 07:56:36.817396194 +0000 UTC m=+627.510200202" Oct 09 07:56:41 crc kubenswrapper[4715]: I1009 07:56:41.874277 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-27vx6" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.098369 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-z9ztn"] Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.099419 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="ovn-controller" containerID="cri-o://e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab" gracePeriod=30 Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.099539 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="nbdb" containerID="cri-o://b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5" gracePeriod=30 Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.099633 4715 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="kube-rbac-proxy-node" containerID="cri-o://76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b" gracePeriod=30 Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.099725 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="northd" containerID="cri-o://80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff" gracePeriod=30 Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.099720 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="ovn-acl-logging" containerID="cri-o://1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb" gracePeriod=30 Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.099731 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="sbdb" containerID="cri-o://85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2" gracePeriod=30 Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.099995 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4" gracePeriod=30 Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.131801 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="ovnkube-controller" 
containerID="cri-o://85d731fb7590b113c847a40e343c9b81d2da112b32c0bf11cfef3b06302ba95e" gracePeriod=30 Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.456041 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z9ztn_1d6cb14a-7329-4a80-aff2-acd9142558d3/ovnkube-controller/3.log" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.458393 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z9ztn_1d6cb14a-7329-4a80-aff2-acd9142558d3/ovn-acl-logging/0.log" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.458889 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z9ztn_1d6cb14a-7329-4a80-aff2-acd9142558d3/ovn-controller/0.log" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.459341 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.520058 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zzlss"] Oct 09 07:56:42 crc kubenswrapper[4715]: E1009 07:56:42.520363 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="ovn-controller" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.520385 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="ovn-controller" Oct 09 07:56:42 crc kubenswrapper[4715]: E1009 07:56:42.520399 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="sbdb" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.520410 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="sbdb" Oct 09 07:56:42 crc kubenswrapper[4715]: E1009 07:56:42.520437 4715 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="ovnkube-controller" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.520447 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="ovnkube-controller" Oct 09 07:56:42 crc kubenswrapper[4715]: E1009 07:56:42.520455 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="ovnkube-controller" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.520462 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="ovnkube-controller" Oct 09 07:56:42 crc kubenswrapper[4715]: E1009 07:56:42.520475 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="northd" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.520482 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="northd" Oct 09 07:56:42 crc kubenswrapper[4715]: E1009 07:56:42.520493 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="ovnkube-controller" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.520500 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="ovnkube-controller" Oct 09 07:56:42 crc kubenswrapper[4715]: E1009 07:56:42.520508 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="ovn-acl-logging" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.520514 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="ovn-acl-logging" Oct 09 07:56:42 crc kubenswrapper[4715]: E1009 07:56:42.520523 4715 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="kubecfg-setup" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.520530 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="kubecfg-setup" Oct 09 07:56:42 crc kubenswrapper[4715]: E1009 07:56:42.520545 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="kube-rbac-proxy-ovn-metrics" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.520553 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="kube-rbac-proxy-ovn-metrics" Oct 09 07:56:42 crc kubenswrapper[4715]: E1009 07:56:42.520565 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="nbdb" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.520572 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="nbdb" Oct 09 07:56:42 crc kubenswrapper[4715]: E1009 07:56:42.520586 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="kube-rbac-proxy-node" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.520591 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="kube-rbac-proxy-node" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.520701 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="ovnkube-controller" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.520711 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="ovnkube-controller" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 
07:56:42.520717 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="ovnkube-controller" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.520727 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="kube-rbac-proxy-ovn-metrics" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.520735 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="ovn-acl-logging" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.520743 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="kube-rbac-proxy-node" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.520754 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="ovn-controller" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.520761 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="sbdb" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.520769 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="northd" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.520776 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="nbdb" Oct 09 07:56:42 crc kubenswrapper[4715]: E1009 07:56:42.520860 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="ovnkube-controller" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.520867 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="ovnkube-controller" Oct 09 07:56:42 crc 
kubenswrapper[4715]: E1009 07:56:42.520877 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="ovnkube-controller"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.520884 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="ovnkube-controller"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.520967 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="ovnkube-controller"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.521136 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerName="ovnkube-controller"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.522734 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.648561 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-log-socket\") pod \"1d6cb14a-7329-4a80-aff2-acd9142558d3\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") "
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.648803 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-node-log\") pod \"1d6cb14a-7329-4a80-aff2-acd9142558d3\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") "
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.648899 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-host-kubelet\") pod \"1d6cb14a-7329-4a80-aff2-acd9142558d3\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") "
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.648674 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-log-socket" (OuterVolumeSpecName: "log-socket") pod "1d6cb14a-7329-4a80-aff2-acd9142558d3" (UID: "1d6cb14a-7329-4a80-aff2-acd9142558d3"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.648822 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-node-log" (OuterVolumeSpecName: "node-log") pod "1d6cb14a-7329-4a80-aff2-acd9142558d3" (UID: "1d6cb14a-7329-4a80-aff2-acd9142558d3"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.648998 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1d6cb14a-7329-4a80-aff2-acd9142558d3-env-overrides\") pod \"1d6cb14a-7329-4a80-aff2-acd9142558d3\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") "
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.649075 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "1d6cb14a-7329-4a80-aff2-acd9142558d3" (UID: "1d6cb14a-7329-4a80-aff2-acd9142558d3"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.649235 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4tdb\" (UniqueName: \"kubernetes.io/projected/1d6cb14a-7329-4a80-aff2-acd9142558d3-kube-api-access-k4tdb\") pod \"1d6cb14a-7329-4a80-aff2-acd9142558d3\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") "
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.649291 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-run-openvswitch\") pod \"1d6cb14a-7329-4a80-aff2-acd9142558d3\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") "
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.649343 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-run-ovn\") pod \"1d6cb14a-7329-4a80-aff2-acd9142558d3\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") "
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.649394 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "1d6cb14a-7329-4a80-aff2-acd9142558d3" (UID: "1d6cb14a-7329-4a80-aff2-acd9142558d3"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.649463 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1d6cb14a-7329-4a80-aff2-acd9142558d3-ovnkube-script-lib\") pod \"1d6cb14a-7329-4a80-aff2-acd9142558d3\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") "
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.649488 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "1d6cb14a-7329-4a80-aff2-acd9142558d3" (UID: "1d6cb14a-7329-4a80-aff2-acd9142558d3"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.649533 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-systemd-units\") pod \"1d6cb14a-7329-4a80-aff2-acd9142558d3\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") "
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.649618 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "1d6cb14a-7329-4a80-aff2-acd9142558d3" (UID: "1d6cb14a-7329-4a80-aff2-acd9142558d3"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.649622 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-host-cni-netd\") pod \"1d6cb14a-7329-4a80-aff2-acd9142558d3\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") "
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.649759 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-var-lib-openvswitch\") pod \"1d6cb14a-7329-4a80-aff2-acd9142558d3\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") "
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.649843 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1d6cb14a-7329-4a80-aff2-acd9142558d3-ovn-node-metrics-cert\") pod \"1d6cb14a-7329-4a80-aff2-acd9142558d3\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") "
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.649920 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-host-run-ovn-kubernetes\") pod \"1d6cb14a-7329-4a80-aff2-acd9142558d3\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") "
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.649983 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-host-cni-bin\") pod \"1d6cb14a-7329-4a80-aff2-acd9142558d3\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") "
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.649710 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "1d6cb14a-7329-4a80-aff2-acd9142558d3" (UID: "1d6cb14a-7329-4a80-aff2-acd9142558d3"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.649809 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "1d6cb14a-7329-4a80-aff2-acd9142558d3" (UID: "1d6cb14a-7329-4a80-aff2-acd9142558d3"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.649962 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "1d6cb14a-7329-4a80-aff2-acd9142558d3" (UID: "1d6cb14a-7329-4a80-aff2-acd9142558d3"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.650021 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "1d6cb14a-7329-4a80-aff2-acd9142558d3" (UID: "1d6cb14a-7329-4a80-aff2-acd9142558d3"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.650030 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d6cb14a-7329-4a80-aff2-acd9142558d3-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "1d6cb14a-7329-4a80-aff2-acd9142558d3" (UID: "1d6cb14a-7329-4a80-aff2-acd9142558d3"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.650057 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"1d6cb14a-7329-4a80-aff2-acd9142558d3\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") "
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.650163 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-run-systemd\") pod \"1d6cb14a-7329-4a80-aff2-acd9142558d3\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") "
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.650201 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1d6cb14a-7329-4a80-aff2-acd9142558d3-ovnkube-config\") pod \"1d6cb14a-7329-4a80-aff2-acd9142558d3\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") "
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.650253 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-host-run-netns\") pod \"1d6cb14a-7329-4a80-aff2-acd9142558d3\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") "
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.650290 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-host-slash\") pod \"1d6cb14a-7329-4a80-aff2-acd9142558d3\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") "
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.650326 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-etc-openvswitch\") pod \"1d6cb14a-7329-4a80-aff2-acd9142558d3\" (UID: \"1d6cb14a-7329-4a80-aff2-acd9142558d3\") "
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.650441 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-host-slash" (OuterVolumeSpecName: "host-slash") pod "1d6cb14a-7329-4a80-aff2-acd9142558d3" (UID: "1d6cb14a-7329-4a80-aff2-acd9142558d3"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.650440 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "1d6cb14a-7329-4a80-aff2-acd9142558d3" (UID: "1d6cb14a-7329-4a80-aff2-acd9142558d3"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.650468 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "1d6cb14a-7329-4a80-aff2-acd9142558d3" (UID: "1d6cb14a-7329-4a80-aff2-acd9142558d3"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.650629 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "1d6cb14a-7329-4a80-aff2-acd9142558d3" (UID: "1d6cb14a-7329-4a80-aff2-acd9142558d3"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.650723 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-host-kubelet\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.650747 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d6cb14a-7329-4a80-aff2-acd9142558d3-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "1d6cb14a-7329-4a80-aff2-acd9142558d3" (UID: "1d6cb14a-7329-4a80-aff2-acd9142558d3"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.650821 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-etc-openvswitch\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.650847 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/37b34064-7c0e-43b5-8b74-023ac1da933d-ovn-node-metrics-cert\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.650882 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-log-socket\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.650912 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz2w6\" (UniqueName: \"kubernetes.io/projected/37b34064-7c0e-43b5-8b74-023ac1da933d-kube-api-access-xz2w6\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.650940 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/37b34064-7c0e-43b5-8b74-023ac1da933d-ovnkube-script-lib\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.651009 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-host-cni-bin\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.651045 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/37b34064-7c0e-43b5-8b74-023ac1da933d-ovnkube-config\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.651068 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/37b34064-7c0e-43b5-8b74-023ac1da933d-env-overrides\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.651200 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-host-slash\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.651336 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-node-log\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.651453 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-run-systemd\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.651542 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-host-run-netns\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.651631 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-host-run-ovn-kubernetes\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.651702 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.651782 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-run-openvswitch\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.651842 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-systemd-units\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.651908 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-var-lib-openvswitch\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.651994 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-run-ovn\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.652101 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-host-cni-netd\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.652448 4715 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-var-lib-openvswitch\") on node \"crc\" DevicePath \"\""
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.652484 4715 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-host-cni-netd\") on node \"crc\" DevicePath \"\""
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.652505 4715 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.652525 4715 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-host-cni-bin\") on node \"crc\" DevicePath \"\""
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.652543 4715 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.652562 4715 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1d6cb14a-7329-4a80-aff2-acd9142558d3-ovnkube-config\") on node \"crc\" DevicePath \"\""
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.652580 4715 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-host-run-netns\") on node \"crc\" DevicePath \"\""
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.652597 4715 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-host-slash\") on node \"crc\" DevicePath \"\""
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.652621 4715 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-etc-openvswitch\") on node \"crc\" DevicePath \"\""
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.652646 4715 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-log-socket\") on node \"crc\" DevicePath \"\""
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.652669 4715 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-node-log\") on node \"crc\" DevicePath \"\""
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.652697 4715 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-host-kubelet\") on node \"crc\" DevicePath \"\""
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.652723 4715 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-run-openvswitch\") on node \"crc\" DevicePath \"\""
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.652740 4715 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-run-ovn\") on node \"crc\" DevicePath \"\""
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.652756 4715 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1d6cb14a-7329-4a80-aff2-acd9142558d3-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.652772 4715 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-systemd-units\") on node \"crc\" DevicePath \"\""
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.653035 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d6cb14a-7329-4a80-aff2-acd9142558d3-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "1d6cb14a-7329-4a80-aff2-acd9142558d3" (UID: "1d6cb14a-7329-4a80-aff2-acd9142558d3"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.656680 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d6cb14a-7329-4a80-aff2-acd9142558d3-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "1d6cb14a-7329-4a80-aff2-acd9142558d3" (UID: "1d6cb14a-7329-4a80-aff2-acd9142558d3"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.656834 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d6cb14a-7329-4a80-aff2-acd9142558d3-kube-api-access-k4tdb" (OuterVolumeSpecName: "kube-api-access-k4tdb") pod "1d6cb14a-7329-4a80-aff2-acd9142558d3" (UID: "1d6cb14a-7329-4a80-aff2-acd9142558d3"). InnerVolumeSpecName "kube-api-access-k4tdb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.668071 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "1d6cb14a-7329-4a80-aff2-acd9142558d3" (UID: "1d6cb14a-7329-4a80-aff2-acd9142558d3"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.753676 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/37b34064-7c0e-43b5-8b74-023ac1da933d-ovn-node-metrics-cert\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.753764 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-log-socket\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.753812 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz2w6\" (UniqueName: \"kubernetes.io/projected/37b34064-7c0e-43b5-8b74-023ac1da933d-kube-api-access-xz2w6\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.753855 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/37b34064-7c0e-43b5-8b74-023ac1da933d-ovnkube-script-lib\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.753893 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-host-cni-bin\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.753937 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/37b34064-7c0e-43b5-8b74-023ac1da933d-ovnkube-config\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.754002 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/37b34064-7c0e-43b5-8b74-023ac1da933d-env-overrides\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.754055 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-host-slash\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.754085 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-host-cni-bin\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.754115 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-node-log\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.754148 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-run-systemd\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.754181 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-host-run-netns\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.754220 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-host-run-ovn-kubernetes\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.754256 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.754297 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-run-openvswitch\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.754333 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-systemd-units\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.754363 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-var-lib-openvswitch\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.754398 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-run-ovn\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.754404 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-host-run-netns\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.754464 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-host-cni-netd\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.754483 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-host-slash\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.754513 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-node-log\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.754510 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-host-kubelet\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.754577 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-host-kubelet\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.754601 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-etc-openvswitch\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.754578 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-etc-openvswitch\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.754634 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-run-systemd\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.754691 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.754712 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-var-lib-openvswitch\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.754728 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-run-openvswitch\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss"
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.754777 4715 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1d6cb14a-7329-4a80-aff2-acd9142558d3-env-overrides\") on node \"crc\" DevicePath \"\""
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.754802 4715 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-host-cni-netd\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.754823 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-run-ovn\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.754830 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-host-run-ovn-kubernetes\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.754847 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-systemd-units\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.754864 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4tdb\" (UniqueName: \"kubernetes.io/projected/1d6cb14a-7329-4a80-aff2-acd9142558d3-kube-api-access-k4tdb\") on node \"crc\" DevicePath \"\"" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.754877 4715 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1d6cb14a-7329-4a80-aff2-acd9142558d3-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" 
Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.754886 4715 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1d6cb14a-7329-4a80-aff2-acd9142558d3-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.754976 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/37b34064-7c0e-43b5-8b74-023ac1da933d-env-overrides\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.755088 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/37b34064-7c0e-43b5-8b74-023ac1da933d-log-socket\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.755623 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/37b34064-7c0e-43b5-8b74-023ac1da933d-ovnkube-script-lib\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.755821 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/37b34064-7c0e-43b5-8b74-023ac1da933d-ovnkube-config\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.759894 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/37b34064-7c0e-43b5-8b74-023ac1da933d-ovn-node-metrics-cert\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.771044 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz2w6\" (UniqueName: \"kubernetes.io/projected/37b34064-7c0e-43b5-8b74-023ac1da933d-kube-api-access-xz2w6\") pod \"ovnkube-node-zzlss\" (UID: \"37b34064-7c0e-43b5-8b74-023ac1da933d\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzlss" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.833926 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6vp75_6e61f2cb-cd6d-46d6-bbb6-dd99919b893d/kube-multus/2.log" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.834640 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6vp75_6e61f2cb-cd6d-46d6-bbb6-dd99919b893d/kube-multus/1.log" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.834708 4715 generic.go:334] "Generic (PLEG): container finished" podID="6e61f2cb-cd6d-46d6-bbb6-dd99919b893d" containerID="e5bd879138998ee05f837dc613e67592e7abe570a8c55d71b8fa18446d82d746" exitCode=2 Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.834811 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6vp75" event={"ID":"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d","Type":"ContainerDied","Data":"e5bd879138998ee05f837dc613e67592e7abe570a8c55d71b8fa18446d82d746"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.834885 4715 scope.go:117] "RemoveContainer" containerID="4e02a5b9a040e142c2a3f8ca488f0de0e42b0e01fff8a9987ea1ee5c354b1e31" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.835333 4715 scope.go:117] "RemoveContainer" containerID="e5bd879138998ee05f837dc613e67592e7abe570a8c55d71b8fa18446d82d746" Oct 09 07:56:42 crc kubenswrapper[4715]: E1009 
07:56:42.835529 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-6vp75_openshift-multus(6e61f2cb-cd6d-46d6-bbb6-dd99919b893d)\"" pod="openshift-multus/multus-6vp75" podUID="6e61f2cb-cd6d-46d6-bbb6-dd99919b893d" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.836672 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zzlss" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.838509 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z9ztn_1d6cb14a-7329-4a80-aff2-acd9142558d3/ovnkube-controller/3.log" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.841771 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z9ztn_1d6cb14a-7329-4a80-aff2-acd9142558d3/ovn-acl-logging/0.log" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842167 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z9ztn_1d6cb14a-7329-4a80-aff2-acd9142558d3/ovn-controller/0.log" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842493 4715 generic.go:334] "Generic (PLEG): container finished" podID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerID="85d731fb7590b113c847a40e343c9b81d2da112b32c0bf11cfef3b06302ba95e" exitCode=0 Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842517 4715 generic.go:334] "Generic (PLEG): container finished" podID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerID="85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2" exitCode=0 Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842526 4715 generic.go:334] "Generic (PLEG): container finished" podID="1d6cb14a-7329-4a80-aff2-acd9142558d3" 
containerID="b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5" exitCode=0 Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842537 4715 generic.go:334] "Generic (PLEG): container finished" podID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerID="80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff" exitCode=0 Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842545 4715 generic.go:334] "Generic (PLEG): container finished" podID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerID="b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4" exitCode=0 Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842553 4715 generic.go:334] "Generic (PLEG): container finished" podID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerID="76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b" exitCode=0 Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842561 4715 generic.go:334] "Generic (PLEG): container finished" podID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerID="1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb" exitCode=143 Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842570 4715 generic.go:334] "Generic (PLEG): container finished" podID="1d6cb14a-7329-4a80-aff2-acd9142558d3" containerID="e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab" exitCode=143 Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842594 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" event={"ID":"1d6cb14a-7329-4a80-aff2-acd9142558d3","Type":"ContainerDied","Data":"85d731fb7590b113c847a40e343c9b81d2da112b32c0bf11cfef3b06302ba95e"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842627 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" 
event={"ID":"1d6cb14a-7329-4a80-aff2-acd9142558d3","Type":"ContainerDied","Data":"85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842640 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" event={"ID":"1d6cb14a-7329-4a80-aff2-acd9142558d3","Type":"ContainerDied","Data":"b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842652 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" event={"ID":"1d6cb14a-7329-4a80-aff2-acd9142558d3","Type":"ContainerDied","Data":"80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842663 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" event={"ID":"1d6cb14a-7329-4a80-aff2-acd9142558d3","Type":"ContainerDied","Data":"b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842674 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" event={"ID":"1d6cb14a-7329-4a80-aff2-acd9142558d3","Type":"ContainerDied","Data":"76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842724 4715 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"85d731fb7590b113c847a40e343c9b81d2da112b32c0bf11cfef3b06302ba95e"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842736 4715 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9b9653decfa58510f011f69cf54290119540ca7cad7a56eb6da5440c4ff5f9c"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842744 4715 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842751 4715 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842758 4715 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842764 4715 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842771 4715 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842779 4715 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842786 4715 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842793 4715 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842802 4715 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" event={"ID":"1d6cb14a-7329-4a80-aff2-acd9142558d3","Type":"ContainerDied","Data":"1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842812 4715 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"85d731fb7590b113c847a40e343c9b81d2da112b32c0bf11cfef3b06302ba95e"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842819 4715 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9b9653decfa58510f011f69cf54290119540ca7cad7a56eb6da5440c4ff5f9c"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842826 4715 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842833 4715 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842839 4715 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842844 4715 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842850 4715 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b"} Oct 09 
07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842855 4715 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842859 4715 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842864 4715 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842871 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" event={"ID":"1d6cb14a-7329-4a80-aff2-acd9142558d3","Type":"ContainerDied","Data":"e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842879 4715 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"85d731fb7590b113c847a40e343c9b81d2da112b32c0bf11cfef3b06302ba95e"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842885 4715 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9b9653decfa58510f011f69cf54290119540ca7cad7a56eb6da5440c4ff5f9c"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842890 4715 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842895 4715 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842900 4715 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842905 4715 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842911 4715 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842916 4715 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842921 4715 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842927 4715 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842934 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" event={"ID":"1d6cb14a-7329-4a80-aff2-acd9142558d3","Type":"ContainerDied","Data":"21634b6b9a5f51a41516485e45af2f2a5df4f2c3bcbcbe10016df9e5bad53916"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842941 4715 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"85d731fb7590b113c847a40e343c9b81d2da112b32c0bf11cfef3b06302ba95e"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842965 4715 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9b9653decfa58510f011f69cf54290119540ca7cad7a56eb6da5440c4ff5f9c"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842970 4715 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842976 4715 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842981 4715 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842987 4715 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842992 4715 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.842997 4715 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.843003 4715 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.843008 4715 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3"} Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.843086 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z9ztn" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.877752 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-z9ztn"] Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.881330 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-z9ztn"] Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.887249 4715 scope.go:117] "RemoveContainer" containerID="85d731fb7590b113c847a40e343c9b81d2da112b32c0bf11cfef3b06302ba95e" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.902926 4715 scope.go:117] "RemoveContainer" containerID="e9b9653decfa58510f011f69cf54290119540ca7cad7a56eb6da5440c4ff5f9c" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.922494 4715 scope.go:117] "RemoveContainer" containerID="85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.936939 4715 scope.go:117] "RemoveContainer" containerID="b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.961471 4715 scope.go:117] "RemoveContainer" containerID="80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.974327 4715 scope.go:117] "RemoveContainer" 
containerID="b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.985163 4715 scope.go:117] "RemoveContainer" containerID="76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b" Oct 09 07:56:42 crc kubenswrapper[4715]: I1009 07:56:42.996187 4715 scope.go:117] "RemoveContainer" containerID="1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.008571 4715 scope.go:117] "RemoveContainer" containerID="e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.082865 4715 scope.go:117] "RemoveContainer" containerID="ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.107574 4715 scope.go:117] "RemoveContainer" containerID="85d731fb7590b113c847a40e343c9b81d2da112b32c0bf11cfef3b06302ba95e" Oct 09 07:56:43 crc kubenswrapper[4715]: E1009 07:56:43.108370 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85d731fb7590b113c847a40e343c9b81d2da112b32c0bf11cfef3b06302ba95e\": container with ID starting with 85d731fb7590b113c847a40e343c9b81d2da112b32c0bf11cfef3b06302ba95e not found: ID does not exist" containerID="85d731fb7590b113c847a40e343c9b81d2da112b32c0bf11cfef3b06302ba95e" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.108464 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85d731fb7590b113c847a40e343c9b81d2da112b32c0bf11cfef3b06302ba95e"} err="failed to get container status \"85d731fb7590b113c847a40e343c9b81d2da112b32c0bf11cfef3b06302ba95e\": rpc error: code = NotFound desc = could not find container \"85d731fb7590b113c847a40e343c9b81d2da112b32c0bf11cfef3b06302ba95e\": container with ID starting with 
85d731fb7590b113c847a40e343c9b81d2da112b32c0bf11cfef3b06302ba95e not found: ID does not exist" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.108509 4715 scope.go:117] "RemoveContainer" containerID="e9b9653decfa58510f011f69cf54290119540ca7cad7a56eb6da5440c4ff5f9c" Oct 09 07:56:43 crc kubenswrapper[4715]: E1009 07:56:43.109059 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9b9653decfa58510f011f69cf54290119540ca7cad7a56eb6da5440c4ff5f9c\": container with ID starting with e9b9653decfa58510f011f69cf54290119540ca7cad7a56eb6da5440c4ff5f9c not found: ID does not exist" containerID="e9b9653decfa58510f011f69cf54290119540ca7cad7a56eb6da5440c4ff5f9c" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.109091 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9b9653decfa58510f011f69cf54290119540ca7cad7a56eb6da5440c4ff5f9c"} err="failed to get container status \"e9b9653decfa58510f011f69cf54290119540ca7cad7a56eb6da5440c4ff5f9c\": rpc error: code = NotFound desc = could not find container \"e9b9653decfa58510f011f69cf54290119540ca7cad7a56eb6da5440c4ff5f9c\": container with ID starting with e9b9653decfa58510f011f69cf54290119540ca7cad7a56eb6da5440c4ff5f9c not found: ID does not exist" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.109113 4715 scope.go:117] "RemoveContainer" containerID="85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2" Oct 09 07:56:43 crc kubenswrapper[4715]: E1009 07:56:43.109516 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2\": container with ID starting with 85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2 not found: ID does not exist" containerID="85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2" Oct 09 07:56:43 crc 
kubenswrapper[4715]: I1009 07:56:43.109536 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2"} err="failed to get container status \"85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2\": rpc error: code = NotFound desc = could not find container \"85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2\": container with ID starting with 85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2 not found: ID does not exist" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.109548 4715 scope.go:117] "RemoveContainer" containerID="b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5" Oct 09 07:56:43 crc kubenswrapper[4715]: E1009 07:56:43.109812 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5\": container with ID starting with b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5 not found: ID does not exist" containerID="b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.109870 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5"} err="failed to get container status \"b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5\": rpc error: code = NotFound desc = could not find container \"b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5\": container with ID starting with b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5 not found: ID does not exist" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.109920 4715 scope.go:117] "RemoveContainer" containerID="80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff" Oct 09 
07:56:43 crc kubenswrapper[4715]: E1009 07:56:43.110377 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff\": container with ID starting with 80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff not found: ID does not exist" containerID="80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.110402 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff"} err="failed to get container status \"80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff\": rpc error: code = NotFound desc = could not find container \"80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff\": container with ID starting with 80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff not found: ID does not exist" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.110437 4715 scope.go:117] "RemoveContainer" containerID="b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4" Oct 09 07:56:43 crc kubenswrapper[4715]: E1009 07:56:43.110792 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4\": container with ID starting with b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4 not found: ID does not exist" containerID="b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.110838 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4"} err="failed to get container status 
\"b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4\": rpc error: code = NotFound desc = could not find container \"b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4\": container with ID starting with b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4 not found: ID does not exist" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.110876 4715 scope.go:117] "RemoveContainer" containerID="76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b" Oct 09 07:56:43 crc kubenswrapper[4715]: E1009 07:56:43.111272 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b\": container with ID starting with 76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b not found: ID does not exist" containerID="76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.111303 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b"} err="failed to get container status \"76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b\": rpc error: code = NotFound desc = could not find container \"76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b\": container with ID starting with 76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b not found: ID does not exist" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.111326 4715 scope.go:117] "RemoveContainer" containerID="1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb" Oct 09 07:56:43 crc kubenswrapper[4715]: E1009 07:56:43.111617 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb\": container with ID starting with 1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb not found: ID does not exist" containerID="1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.111677 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb"} err="failed to get container status \"1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb\": rpc error: code = NotFound desc = could not find container \"1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb\": container with ID starting with 1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb not found: ID does not exist" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.111721 4715 scope.go:117] "RemoveContainer" containerID="e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab" Oct 09 07:56:43 crc kubenswrapper[4715]: E1009 07:56:43.112102 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab\": container with ID starting with e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab not found: ID does not exist" containerID="e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.112137 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab"} err="failed to get container status \"e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab\": rpc error: code = NotFound desc = could not find container \"e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab\": container with ID 
starting with e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab not found: ID does not exist" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.112156 4715 scope.go:117] "RemoveContainer" containerID="ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3" Oct 09 07:56:43 crc kubenswrapper[4715]: E1009 07:56:43.112583 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\": container with ID starting with ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3 not found: ID does not exist" containerID="ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.112693 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3"} err="failed to get container status \"ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\": rpc error: code = NotFound desc = could not find container \"ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\": container with ID starting with ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3 not found: ID does not exist" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.112838 4715 scope.go:117] "RemoveContainer" containerID="85d731fb7590b113c847a40e343c9b81d2da112b32c0bf11cfef3b06302ba95e" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.113340 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85d731fb7590b113c847a40e343c9b81d2da112b32c0bf11cfef3b06302ba95e"} err="failed to get container status \"85d731fb7590b113c847a40e343c9b81d2da112b32c0bf11cfef3b06302ba95e\": rpc error: code = NotFound desc = could not find container \"85d731fb7590b113c847a40e343c9b81d2da112b32c0bf11cfef3b06302ba95e\": 
container with ID starting with 85d731fb7590b113c847a40e343c9b81d2da112b32c0bf11cfef3b06302ba95e not found: ID does not exist" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.113366 4715 scope.go:117] "RemoveContainer" containerID="e9b9653decfa58510f011f69cf54290119540ca7cad7a56eb6da5440c4ff5f9c" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.113805 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9b9653decfa58510f011f69cf54290119540ca7cad7a56eb6da5440c4ff5f9c"} err="failed to get container status \"e9b9653decfa58510f011f69cf54290119540ca7cad7a56eb6da5440c4ff5f9c\": rpc error: code = NotFound desc = could not find container \"e9b9653decfa58510f011f69cf54290119540ca7cad7a56eb6da5440c4ff5f9c\": container with ID starting with e9b9653decfa58510f011f69cf54290119540ca7cad7a56eb6da5440c4ff5f9c not found: ID does not exist" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.113932 4715 scope.go:117] "RemoveContainer" containerID="85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.114504 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2"} err="failed to get container status \"85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2\": rpc error: code = NotFound desc = could not find container \"85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2\": container with ID starting with 85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2 not found: ID does not exist" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.114608 4715 scope.go:117] "RemoveContainer" containerID="b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.115033 4715 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5"} err="failed to get container status \"b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5\": rpc error: code = NotFound desc = could not find container \"b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5\": container with ID starting with b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5 not found: ID does not exist" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.115062 4715 scope.go:117] "RemoveContainer" containerID="80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.115472 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff"} err="failed to get container status \"80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff\": rpc error: code = NotFound desc = could not find container \"80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff\": container with ID starting with 80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff not found: ID does not exist" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.115509 4715 scope.go:117] "RemoveContainer" containerID="b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.117017 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4"} err="failed to get container status \"b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4\": rpc error: code = NotFound desc = could not find container \"b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4\": container with ID starting with b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4 not found: ID does not 
exist" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.117050 4715 scope.go:117] "RemoveContainer" containerID="76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.117340 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b"} err="failed to get container status \"76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b\": rpc error: code = NotFound desc = could not find container \"76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b\": container with ID starting with 76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b not found: ID does not exist" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.117366 4715 scope.go:117] "RemoveContainer" containerID="1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.117785 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb"} err="failed to get container status \"1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb\": rpc error: code = NotFound desc = could not find container \"1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb\": container with ID starting with 1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb not found: ID does not exist" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.117839 4715 scope.go:117] "RemoveContainer" containerID="e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.118231 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab"} err="failed to get container status 
\"e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab\": rpc error: code = NotFound desc = could not find container \"e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab\": container with ID starting with e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab not found: ID does not exist" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.118265 4715 scope.go:117] "RemoveContainer" containerID="ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.118625 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3"} err="failed to get container status \"ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\": rpc error: code = NotFound desc = could not find container \"ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\": container with ID starting with ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3 not found: ID does not exist" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.118646 4715 scope.go:117] "RemoveContainer" containerID="85d731fb7590b113c847a40e343c9b81d2da112b32c0bf11cfef3b06302ba95e" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.119050 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85d731fb7590b113c847a40e343c9b81d2da112b32c0bf11cfef3b06302ba95e"} err="failed to get container status \"85d731fb7590b113c847a40e343c9b81d2da112b32c0bf11cfef3b06302ba95e\": rpc error: code = NotFound desc = could not find container \"85d731fb7590b113c847a40e343c9b81d2da112b32c0bf11cfef3b06302ba95e\": container with ID starting with 85d731fb7590b113c847a40e343c9b81d2da112b32c0bf11cfef3b06302ba95e not found: ID does not exist" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.119079 4715 scope.go:117] "RemoveContainer" 
containerID="e9b9653decfa58510f011f69cf54290119540ca7cad7a56eb6da5440c4ff5f9c" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.119382 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9b9653decfa58510f011f69cf54290119540ca7cad7a56eb6da5440c4ff5f9c"} err="failed to get container status \"e9b9653decfa58510f011f69cf54290119540ca7cad7a56eb6da5440c4ff5f9c\": rpc error: code = NotFound desc = could not find container \"e9b9653decfa58510f011f69cf54290119540ca7cad7a56eb6da5440c4ff5f9c\": container with ID starting with e9b9653decfa58510f011f69cf54290119540ca7cad7a56eb6da5440c4ff5f9c not found: ID does not exist" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.119448 4715 scope.go:117] "RemoveContainer" containerID="85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.119798 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2"} err="failed to get container status \"85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2\": rpc error: code = NotFound desc = could not find container \"85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2\": container with ID starting with 85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2 not found: ID does not exist" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.119818 4715 scope.go:117] "RemoveContainer" containerID="b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.120086 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5"} err="failed to get container status \"b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5\": rpc error: code = NotFound desc = could 
not find container \"b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5\": container with ID starting with b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5 not found: ID does not exist" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.120116 4715 scope.go:117] "RemoveContainer" containerID="80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.120368 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff"} err="failed to get container status \"80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff\": rpc error: code = NotFound desc = could not find container \"80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff\": container with ID starting with 80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff not found: ID does not exist" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.120390 4715 scope.go:117] "RemoveContainer" containerID="b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.120899 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4"} err="failed to get container status \"b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4\": rpc error: code = NotFound desc = could not find container \"b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4\": container with ID starting with b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4 not found: ID does not exist" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.120943 4715 scope.go:117] "RemoveContainer" containerID="76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 
07:56:43.121450 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b"} err="failed to get container status \"76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b\": rpc error: code = NotFound desc = could not find container \"76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b\": container with ID starting with 76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b not found: ID does not exist" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.121481 4715 scope.go:117] "RemoveContainer" containerID="1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.121827 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb"} err="failed to get container status \"1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb\": rpc error: code = NotFound desc = could not find container \"1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb\": container with ID starting with 1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb not found: ID does not exist" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.121851 4715 scope.go:117] "RemoveContainer" containerID="e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.122207 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab"} err="failed to get container status \"e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab\": rpc error: code = NotFound desc = could not find container \"e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab\": container with ID starting with 
e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab not found: ID does not exist" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.122233 4715 scope.go:117] "RemoveContainer" containerID="ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.122541 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3"} err="failed to get container status \"ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\": rpc error: code = NotFound desc = could not find container \"ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\": container with ID starting with ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3 not found: ID does not exist" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.122567 4715 scope.go:117] "RemoveContainer" containerID="85d731fb7590b113c847a40e343c9b81d2da112b32c0bf11cfef3b06302ba95e" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.122810 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85d731fb7590b113c847a40e343c9b81d2da112b32c0bf11cfef3b06302ba95e"} err="failed to get container status \"85d731fb7590b113c847a40e343c9b81d2da112b32c0bf11cfef3b06302ba95e\": rpc error: code = NotFound desc = could not find container \"85d731fb7590b113c847a40e343c9b81d2da112b32c0bf11cfef3b06302ba95e\": container with ID starting with 85d731fb7590b113c847a40e343c9b81d2da112b32c0bf11cfef3b06302ba95e not found: ID does not exist" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.122839 4715 scope.go:117] "RemoveContainer" containerID="e9b9653decfa58510f011f69cf54290119540ca7cad7a56eb6da5440c4ff5f9c" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.123413 4715 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e9b9653decfa58510f011f69cf54290119540ca7cad7a56eb6da5440c4ff5f9c"} err="failed to get container status \"e9b9653decfa58510f011f69cf54290119540ca7cad7a56eb6da5440c4ff5f9c\": rpc error: code = NotFound desc = could not find container \"e9b9653decfa58510f011f69cf54290119540ca7cad7a56eb6da5440c4ff5f9c\": container with ID starting with e9b9653decfa58510f011f69cf54290119540ca7cad7a56eb6da5440c4ff5f9c not found: ID does not exist" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.123496 4715 scope.go:117] "RemoveContainer" containerID="85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.124011 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2"} err="failed to get container status \"85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2\": rpc error: code = NotFound desc = could not find container \"85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2\": container with ID starting with 85e72cf3afdc2a2dbf628cee337b3f44fc5007a6742e546d8dd83ae1e46715a2 not found: ID does not exist" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.124047 4715 scope.go:117] "RemoveContainer" containerID="b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.124283 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5"} err="failed to get container status \"b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5\": rpc error: code = NotFound desc = could not find container \"b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5\": container with ID starting with b349636e7681c8961e1e395b8bb418cc9a18b5c0bd5504fa11e6afe9455bc6d5 not found: ID does not 
exist" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.124319 4715 scope.go:117] "RemoveContainer" containerID="80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.124672 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff"} err="failed to get container status \"80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff\": rpc error: code = NotFound desc = could not find container \"80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff\": container with ID starting with 80476c6b7b054ffcafc531f1a4ef4fc0c9fcd5626b417a753ae9c9558e750cff not found: ID does not exist" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.124724 4715 scope.go:117] "RemoveContainer" containerID="b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.125058 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4"} err="failed to get container status \"b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4\": rpc error: code = NotFound desc = could not find container \"b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4\": container with ID starting with b3d4f61fd88aeff79864a3ff4ee838e3d0b9c99944204733de6d5382b35d0ba4 not found: ID does not exist" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.125093 4715 scope.go:117] "RemoveContainer" containerID="76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.125402 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b"} err="failed to get container status 
\"76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b\": rpc error: code = NotFound desc = could not find container \"76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b\": container with ID starting with 76b0502cbe64ff8445997dec0790d9f2e2184b0a844e63a01a9e2570ad79e79b not found: ID does not exist" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.125472 4715 scope.go:117] "RemoveContainer" containerID="1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.125797 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb"} err="failed to get container status \"1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb\": rpc error: code = NotFound desc = could not find container \"1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb\": container with ID starting with 1f42e50a213cd0d55d3ede97dcf5103203a9070939781c59123276f7ca4f66eb not found: ID does not exist" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.125855 4715 scope.go:117] "RemoveContainer" containerID="e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.126161 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab"} err="failed to get container status \"e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab\": rpc error: code = NotFound desc = could not find container \"e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab\": container with ID starting with e1b2a9d563b51fe1fa5dfa97c4e4800a9c7d21f5d7552cee872fd105c45357ab not found: ID does not exist" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.126204 4715 scope.go:117] "RemoveContainer" 
containerID="ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.126544 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3"} err="failed to get container status \"ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\": rpc error: code = NotFound desc = could not find container \"ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3\": container with ID starting with ddefe0c66097daf0cacf84cfc9a8fe00bf23fbd280760dd8c1b8a2f7ffa702a3 not found: ID does not exist" Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.851410 4715 generic.go:334] "Generic (PLEG): container finished" podID="37b34064-7c0e-43b5-8b74-023ac1da933d" containerID="87a269c83de2e8f7af4203b255cb79f7520546f7f71cab688fd12c743a4ee1ac" exitCode=0 Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.851487 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzlss" event={"ID":"37b34064-7c0e-43b5-8b74-023ac1da933d","Type":"ContainerDied","Data":"87a269c83de2e8f7af4203b255cb79f7520546f7f71cab688fd12c743a4ee1ac"} Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.851848 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzlss" event={"ID":"37b34064-7c0e-43b5-8b74-023ac1da933d","Type":"ContainerStarted","Data":"821956857f38e6cb7bd889f6db41fbd327248ab8a6bada36979a7728a8ccfe04"} Oct 09 07:56:43 crc kubenswrapper[4715]: I1009 07:56:43.856164 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6vp75_6e61f2cb-cd6d-46d6-bbb6-dd99919b893d/kube-multus/2.log" Oct 09 07:56:44 crc kubenswrapper[4715]: I1009 07:56:44.146128 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d6cb14a-7329-4a80-aff2-acd9142558d3" 
path="/var/lib/kubelet/pods/1d6cb14a-7329-4a80-aff2-acd9142558d3/volumes" Oct 09 07:56:44 crc kubenswrapper[4715]: I1009 07:56:44.865818 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzlss" event={"ID":"37b34064-7c0e-43b5-8b74-023ac1da933d","Type":"ContainerStarted","Data":"3428e95875fbdc942972c236dc0c4828f948206ea2cd0722f82a39936844e2d4"} Oct 09 07:56:44 crc kubenswrapper[4715]: I1009 07:56:44.866574 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzlss" event={"ID":"37b34064-7c0e-43b5-8b74-023ac1da933d","Type":"ContainerStarted","Data":"6de616aac593b74b3fd291b78e131a737309974cc91b6ff44c07206c4fc3f08b"} Oct 09 07:56:44 crc kubenswrapper[4715]: I1009 07:56:44.866599 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzlss" event={"ID":"37b34064-7c0e-43b5-8b74-023ac1da933d","Type":"ContainerStarted","Data":"0434b89221f51ef5ffe7b73d9f42185e847d41c1353179959b54f65cb27a60d1"} Oct 09 07:56:44 crc kubenswrapper[4715]: I1009 07:56:44.866608 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzlss" event={"ID":"37b34064-7c0e-43b5-8b74-023ac1da933d","Type":"ContainerStarted","Data":"b29d0bacab38b43892f60b542fcfbe783e790d32cd0f92cec797bfa81a8bf740"} Oct 09 07:56:44 crc kubenswrapper[4715]: I1009 07:56:44.866616 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzlss" event={"ID":"37b34064-7c0e-43b5-8b74-023ac1da933d","Type":"ContainerStarted","Data":"a7359aafee03aa29d48acd027617933ec71ae5b2903481aeaa447759fc10b4c2"} Oct 09 07:56:44 crc kubenswrapper[4715]: I1009 07:56:44.866623 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzlss" event={"ID":"37b34064-7c0e-43b5-8b74-023ac1da933d","Type":"ContainerStarted","Data":"652b657c03aae05449b6185ce7a38a33a8c79642c9b0811ddba981bcd8403910"} Oct 
09 07:56:46 crc kubenswrapper[4715]: I1009 07:56:46.754270 4715 patch_prober.go:28] interesting pod/machine-config-daemon-k7vwx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 07:56:46 crc kubenswrapper[4715]: I1009 07:56:46.754358 4715 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 07:56:46 crc kubenswrapper[4715]: I1009 07:56:46.754436 4715 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" Oct 09 07:56:46 crc kubenswrapper[4715]: I1009 07:56:46.755092 4715 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a6cfe3d63903269fab164da1154df39ba0aa750858dad3414bb1690252e4ef7d"} pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 07:56:46 crc kubenswrapper[4715]: I1009 07:56:46.755155 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" containerID="cri-o://a6cfe3d63903269fab164da1154df39ba0aa750858dad3414bb1690252e4ef7d" gracePeriod=600 Oct 09 07:56:46 crc kubenswrapper[4715]: I1009 07:56:46.883089 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzlss" 
event={"ID":"37b34064-7c0e-43b5-8b74-023ac1da933d","Type":"ContainerStarted","Data":"1c9539b91b6417b51596a2bc8c5d54cf5c063aafaab45bbe39c13b8dd4a209bb"} Oct 09 07:56:47 crc kubenswrapper[4715]: I1009 07:56:47.894135 4715 generic.go:334] "Generic (PLEG): container finished" podID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerID="a6cfe3d63903269fab164da1154df39ba0aa750858dad3414bb1690252e4ef7d" exitCode=0 Oct 09 07:56:47 crc kubenswrapper[4715]: I1009 07:56:47.894250 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" event={"ID":"acafd807-8875-4b4f-aba9-4f807ca336e7","Type":"ContainerDied","Data":"a6cfe3d63903269fab164da1154df39ba0aa750858dad3414bb1690252e4ef7d"} Oct 09 07:56:47 crc kubenswrapper[4715]: I1009 07:56:47.894700 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" event={"ID":"acafd807-8875-4b4f-aba9-4f807ca336e7","Type":"ContainerStarted","Data":"451f2195b7f62aab0dad39fca7efb143fc4db9dbd4af35f5a099bbb88635e621"} Oct 09 07:56:47 crc kubenswrapper[4715]: I1009 07:56:47.894735 4715 scope.go:117] "RemoveContainer" containerID="b010400ee7dba57a3343bec5cd3be68030f4519bc1714b489d54ec14a33cc803" Oct 09 07:56:49 crc kubenswrapper[4715]: I1009 07:56:49.911518 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzlss" event={"ID":"37b34064-7c0e-43b5-8b74-023ac1da933d","Type":"ContainerStarted","Data":"a2a54652adab1a91b25fb87b56b84cf3c14f74a8cba3f8d235261d96bf280106"} Oct 09 07:56:49 crc kubenswrapper[4715]: I1009 07:56:49.911979 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zzlss" Oct 09 07:56:49 crc kubenswrapper[4715]: I1009 07:56:49.911989 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zzlss" Oct 09 07:56:49 crc kubenswrapper[4715]: I1009 
07:56:49.911997 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zzlss" Oct 09 07:56:49 crc kubenswrapper[4715]: I1009 07:56:49.942468 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-zzlss" podStartSLOduration=7.942447618 podStartE2EDuration="7.942447618s" podCreationTimestamp="2025-10-09 07:56:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:56:49.938175054 +0000 UTC m=+640.630979062" watchObservedRunningTime="2025-10-09 07:56:49.942447618 +0000 UTC m=+640.635251626" Oct 09 07:56:49 crc kubenswrapper[4715]: I1009 07:56:49.944397 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zzlss" Oct 09 07:56:49 crc kubenswrapper[4715]: I1009 07:56:49.946788 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zzlss" Oct 09 07:56:58 crc kubenswrapper[4715]: I1009 07:56:58.136815 4715 scope.go:117] "RemoveContainer" containerID="e5bd879138998ee05f837dc613e67592e7abe570a8c55d71b8fa18446d82d746" Oct 09 07:56:58 crc kubenswrapper[4715]: E1009 07:56:58.137673 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-6vp75_openshift-multus(6e61f2cb-cd6d-46d6-bbb6-dd99919b893d)\"" pod="openshift-multus/multus-6vp75" podUID="6e61f2cb-cd6d-46d6-bbb6-dd99919b893d" Oct 09 07:57:10 crc kubenswrapper[4715]: I1009 07:57:10.141758 4715 scope.go:117] "RemoveContainer" containerID="e5bd879138998ee05f837dc613e67592e7abe570a8c55d71b8fa18446d82d746" Oct 09 07:57:11 crc kubenswrapper[4715]: I1009 07:57:11.047995 4715 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-6vp75_6e61f2cb-cd6d-46d6-bbb6-dd99919b893d/kube-multus/2.log" Oct 09 07:57:11 crc kubenswrapper[4715]: I1009 07:57:11.048604 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6vp75" event={"ID":"6e61f2cb-cd6d-46d6-bbb6-dd99919b893d","Type":"ContainerStarted","Data":"84e7ab93c446fb80675c595a0007c3459021f09a3a5f7aac5bd6f7f3eff4e2f1"} Oct 09 07:57:12 crc kubenswrapper[4715]: I1009 07:57:12.864212 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zzlss" Oct 09 07:57:20 crc kubenswrapper[4715]: I1009 07:57:20.526289 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf"] Oct 09 07:57:20 crc kubenswrapper[4715]: I1009 07:57:20.528166 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf" Oct 09 07:57:20 crc kubenswrapper[4715]: I1009 07:57:20.530611 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 09 07:57:20 crc kubenswrapper[4715]: I1009 07:57:20.535505 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf"] Oct 09 07:57:20 crc kubenswrapper[4715]: I1009 07:57:20.573393 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/44e9a431-3bec-4439-9df7-a7f12d65dad2-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf\" (UID: \"44e9a431-3bec-4439-9df7-a7f12d65dad2\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf" Oct 09 07:57:20 crc kubenswrapper[4715]: I1009 07:57:20.573483 4715 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49xbm\" (UniqueName: \"kubernetes.io/projected/44e9a431-3bec-4439-9df7-a7f12d65dad2-kube-api-access-49xbm\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf\" (UID: \"44e9a431-3bec-4439-9df7-a7f12d65dad2\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf" Oct 09 07:57:20 crc kubenswrapper[4715]: I1009 07:57:20.573520 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/44e9a431-3bec-4439-9df7-a7f12d65dad2-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf\" (UID: \"44e9a431-3bec-4439-9df7-a7f12d65dad2\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf" Oct 09 07:57:20 crc kubenswrapper[4715]: I1009 07:57:20.674136 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/44e9a431-3bec-4439-9df7-a7f12d65dad2-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf\" (UID: \"44e9a431-3bec-4439-9df7-a7f12d65dad2\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf" Oct 09 07:57:20 crc kubenswrapper[4715]: I1009 07:57:20.674270 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49xbm\" (UniqueName: \"kubernetes.io/projected/44e9a431-3bec-4439-9df7-a7f12d65dad2-kube-api-access-49xbm\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf\" (UID: \"44e9a431-3bec-4439-9df7-a7f12d65dad2\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf" Oct 09 07:57:20 crc kubenswrapper[4715]: I1009 07:57:20.674332 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/44e9a431-3bec-4439-9df7-a7f12d65dad2-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf\" (UID: \"44e9a431-3bec-4439-9df7-a7f12d65dad2\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf" Oct 09 07:57:20 crc kubenswrapper[4715]: I1009 07:57:20.675132 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/44e9a431-3bec-4439-9df7-a7f12d65dad2-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf\" (UID: \"44e9a431-3bec-4439-9df7-a7f12d65dad2\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf" Oct 09 07:57:20 crc kubenswrapper[4715]: I1009 07:57:20.675146 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/44e9a431-3bec-4439-9df7-a7f12d65dad2-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf\" (UID: \"44e9a431-3bec-4439-9df7-a7f12d65dad2\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf" Oct 09 07:57:20 crc kubenswrapper[4715]: I1009 07:57:20.696152 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49xbm\" (UniqueName: \"kubernetes.io/projected/44e9a431-3bec-4439-9df7-a7f12d65dad2-kube-api-access-49xbm\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf\" (UID: \"44e9a431-3bec-4439-9df7-a7f12d65dad2\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf" Oct 09 07:57:20 crc kubenswrapper[4715]: I1009 07:57:20.851087 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf" Oct 09 07:57:21 crc kubenswrapper[4715]: I1009 07:57:21.248091 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf"] Oct 09 07:57:22 crc kubenswrapper[4715]: I1009 07:57:22.115468 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf" event={"ID":"44e9a431-3bec-4439-9df7-a7f12d65dad2","Type":"ContainerStarted","Data":"8c17b0160574c721d79e428391559560bdc7041946e3c46fc37992316a141e28"} Oct 09 07:57:22 crc kubenswrapper[4715]: I1009 07:57:22.115516 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf" event={"ID":"44e9a431-3bec-4439-9df7-a7f12d65dad2","Type":"ContainerStarted","Data":"ba1ba65d14311a929ee999b4efd12784715bdb17ecb197df3e3b6b7f0c87758b"} Oct 09 07:57:23 crc kubenswrapper[4715]: I1009 07:57:23.124975 4715 generic.go:334] "Generic (PLEG): container finished" podID="44e9a431-3bec-4439-9df7-a7f12d65dad2" containerID="8c17b0160574c721d79e428391559560bdc7041946e3c46fc37992316a141e28" exitCode=0 Oct 09 07:57:23 crc kubenswrapper[4715]: I1009 07:57:23.125050 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf" event={"ID":"44e9a431-3bec-4439-9df7-a7f12d65dad2","Type":"ContainerDied","Data":"8c17b0160574c721d79e428391559560bdc7041946e3c46fc37992316a141e28"} Oct 09 07:57:25 crc kubenswrapper[4715]: I1009 07:57:25.138179 4715 generic.go:334] "Generic (PLEG): container finished" podID="44e9a431-3bec-4439-9df7-a7f12d65dad2" containerID="02833f7176e033672e8e169f9b16e3daad66b7c7fa31d6db49c8399afa21db10" exitCode=0 Oct 09 07:57:25 crc kubenswrapper[4715]: I1009 07:57:25.138488 4715 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf" event={"ID":"44e9a431-3bec-4439-9df7-a7f12d65dad2","Type":"ContainerDied","Data":"02833f7176e033672e8e169f9b16e3daad66b7c7fa31d6db49c8399afa21db10"} Oct 09 07:57:26 crc kubenswrapper[4715]: I1009 07:57:26.149060 4715 generic.go:334] "Generic (PLEG): container finished" podID="44e9a431-3bec-4439-9df7-a7f12d65dad2" containerID="7e39eeedf65f7feb963aba84c4df33d1a8fdfd76ed50965d8caf59adc29314c4" exitCode=0 Oct 09 07:57:26 crc kubenswrapper[4715]: I1009 07:57:26.150288 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf" event={"ID":"44e9a431-3bec-4439-9df7-a7f12d65dad2","Type":"ContainerDied","Data":"7e39eeedf65f7feb963aba84c4df33d1a8fdfd76ed50965d8caf59adc29314c4"} Oct 09 07:57:27 crc kubenswrapper[4715]: I1009 07:57:27.443930 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf" Oct 09 07:57:27 crc kubenswrapper[4715]: I1009 07:57:27.465296 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/44e9a431-3bec-4439-9df7-a7f12d65dad2-util\") pod \"44e9a431-3bec-4439-9df7-a7f12d65dad2\" (UID: \"44e9a431-3bec-4439-9df7-a7f12d65dad2\") " Oct 09 07:57:27 crc kubenswrapper[4715]: I1009 07:57:27.465345 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49xbm\" (UniqueName: \"kubernetes.io/projected/44e9a431-3bec-4439-9df7-a7f12d65dad2-kube-api-access-49xbm\") pod \"44e9a431-3bec-4439-9df7-a7f12d65dad2\" (UID: \"44e9a431-3bec-4439-9df7-a7f12d65dad2\") " Oct 09 07:57:27 crc kubenswrapper[4715]: I1009 07:57:27.465397 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/44e9a431-3bec-4439-9df7-a7f12d65dad2-bundle\") pod \"44e9a431-3bec-4439-9df7-a7f12d65dad2\" (UID: \"44e9a431-3bec-4439-9df7-a7f12d65dad2\") " Oct 09 07:57:27 crc kubenswrapper[4715]: I1009 07:57:27.466574 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44e9a431-3bec-4439-9df7-a7f12d65dad2-bundle" (OuterVolumeSpecName: "bundle") pod "44e9a431-3bec-4439-9df7-a7f12d65dad2" (UID: "44e9a431-3bec-4439-9df7-a7f12d65dad2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 07:57:27 crc kubenswrapper[4715]: I1009 07:57:27.473472 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44e9a431-3bec-4439-9df7-a7f12d65dad2-kube-api-access-49xbm" (OuterVolumeSpecName: "kube-api-access-49xbm") pod "44e9a431-3bec-4439-9df7-a7f12d65dad2" (UID: "44e9a431-3bec-4439-9df7-a7f12d65dad2"). InnerVolumeSpecName "kube-api-access-49xbm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:57:27 crc kubenswrapper[4715]: I1009 07:57:27.481984 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44e9a431-3bec-4439-9df7-a7f12d65dad2-util" (OuterVolumeSpecName: "util") pod "44e9a431-3bec-4439-9df7-a7f12d65dad2" (UID: "44e9a431-3bec-4439-9df7-a7f12d65dad2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 07:57:27 crc kubenswrapper[4715]: I1009 07:57:27.566909 4715 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/44e9a431-3bec-4439-9df7-a7f12d65dad2-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 07:57:27 crc kubenswrapper[4715]: I1009 07:57:27.566957 4715 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/44e9a431-3bec-4439-9df7-a7f12d65dad2-util\") on node \"crc\" DevicePath \"\"" Oct 09 07:57:27 crc kubenswrapper[4715]: I1009 07:57:27.566972 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49xbm\" (UniqueName: \"kubernetes.io/projected/44e9a431-3bec-4439-9df7-a7f12d65dad2-kube-api-access-49xbm\") on node \"crc\" DevicePath \"\"" Oct 09 07:57:28 crc kubenswrapper[4715]: I1009 07:57:28.161108 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf" event={"ID":"44e9a431-3bec-4439-9df7-a7f12d65dad2","Type":"ContainerDied","Data":"ba1ba65d14311a929ee999b4efd12784715bdb17ecb197df3e3b6b7f0c87758b"} Oct 09 07:57:28 crc kubenswrapper[4715]: I1009 07:57:28.161555 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba1ba65d14311a929ee999b4efd12784715bdb17ecb197df3e3b6b7f0c87758b" Oct 09 07:57:28 crc kubenswrapper[4715]: I1009 07:57:28.161166 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf" Oct 09 07:57:31 crc kubenswrapper[4715]: I1009 07:57:31.903611 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-g94hr"] Oct 09 07:57:31 crc kubenswrapper[4715]: E1009 07:57:31.904264 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44e9a431-3bec-4439-9df7-a7f12d65dad2" containerName="pull" Oct 09 07:57:31 crc kubenswrapper[4715]: I1009 07:57:31.904281 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="44e9a431-3bec-4439-9df7-a7f12d65dad2" containerName="pull" Oct 09 07:57:31 crc kubenswrapper[4715]: E1009 07:57:31.904298 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44e9a431-3bec-4439-9df7-a7f12d65dad2" containerName="util" Oct 09 07:57:31 crc kubenswrapper[4715]: I1009 07:57:31.904306 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="44e9a431-3bec-4439-9df7-a7f12d65dad2" containerName="util" Oct 09 07:57:31 crc kubenswrapper[4715]: E1009 07:57:31.904318 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44e9a431-3bec-4439-9df7-a7f12d65dad2" containerName="extract" Oct 09 07:57:31 crc kubenswrapper[4715]: I1009 07:57:31.904329 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="44e9a431-3bec-4439-9df7-a7f12d65dad2" containerName="extract" Oct 09 07:57:31 crc kubenswrapper[4715]: I1009 07:57:31.904472 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="44e9a431-3bec-4439-9df7-a7f12d65dad2" containerName="extract" Oct 09 07:57:31 crc kubenswrapper[4715]: I1009 07:57:31.904979 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-g94hr" Oct 09 07:57:31 crc kubenswrapper[4715]: I1009 07:57:31.907279 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 09 07:57:31 crc kubenswrapper[4715]: I1009 07:57:31.907464 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-m9n45" Oct 09 07:57:31 crc kubenswrapper[4715]: I1009 07:57:31.909011 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 09 07:57:31 crc kubenswrapper[4715]: I1009 07:57:31.917247 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-g94hr"] Oct 09 07:57:31 crc kubenswrapper[4715]: I1009 07:57:31.931966 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bmvm\" (UniqueName: \"kubernetes.io/projected/c66a28bc-b6c9-426c-bc4d-b8748836b175-kube-api-access-8bmvm\") pod \"nmstate-operator-858ddd8f98-g94hr\" (UID: \"c66a28bc-b6c9-426c-bc4d-b8748836b175\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-g94hr" Oct 09 07:57:32 crc kubenswrapper[4715]: I1009 07:57:32.032666 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bmvm\" (UniqueName: \"kubernetes.io/projected/c66a28bc-b6c9-426c-bc4d-b8748836b175-kube-api-access-8bmvm\") pod \"nmstate-operator-858ddd8f98-g94hr\" (UID: \"c66a28bc-b6c9-426c-bc4d-b8748836b175\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-g94hr" Oct 09 07:57:32 crc kubenswrapper[4715]: I1009 07:57:32.048257 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bmvm\" (UniqueName: \"kubernetes.io/projected/c66a28bc-b6c9-426c-bc4d-b8748836b175-kube-api-access-8bmvm\") pod \"nmstate-operator-858ddd8f98-g94hr\" (UID: 
\"c66a28bc-b6c9-426c-bc4d-b8748836b175\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-g94hr" Oct 09 07:57:32 crc kubenswrapper[4715]: I1009 07:57:32.220440 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-g94hr" Oct 09 07:57:32 crc kubenswrapper[4715]: I1009 07:57:32.422558 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-g94hr"] Oct 09 07:57:33 crc kubenswrapper[4715]: I1009 07:57:33.191458 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-g94hr" event={"ID":"c66a28bc-b6c9-426c-bc4d-b8748836b175","Type":"ContainerStarted","Data":"07ba032ac273ddb9f6a6dd56e61f3cd870ac65731ec16761bc42b614f99cd6a6"} Oct 09 07:57:35 crc kubenswrapper[4715]: I1009 07:57:35.202665 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-g94hr" event={"ID":"c66a28bc-b6c9-426c-bc4d-b8748836b175","Type":"ContainerStarted","Data":"230ce588b0a304979728d31592b7b100d6d24e2925dc28df293cc0674da2009e"} Oct 09 07:57:35 crc kubenswrapper[4715]: I1009 07:57:35.219668 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-g94hr" podStartSLOduration=2.030245302 podStartE2EDuration="4.219645196s" podCreationTimestamp="2025-10-09 07:57:31 +0000 UTC" firstStartedPulling="2025-10-09 07:57:32.4318661 +0000 UTC m=+683.124670108" lastFinishedPulling="2025-10-09 07:57:34.621265994 +0000 UTC m=+685.314070002" observedRunningTime="2025-10-09 07:57:35.217231875 +0000 UTC m=+685.910035913" watchObservedRunningTime="2025-10-09 07:57:35.219645196 +0000 UTC m=+685.912449204" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.269389 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-cvrwm"] Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.271196 
4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-cvrwm" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.278715 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-hvwgz" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.279911 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-bk7lw"] Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.281031 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bk7lw" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.282787 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.287953 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-cvrwm"] Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.292495 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-x6h8j"] Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.293620 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-x6h8j" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.297841 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-bk7lw"] Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.367118 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6q4n\" (UniqueName: \"kubernetes.io/projected/f8f2e4a6-72f9-4eb0-9654-e07d0e3b3bf6-kube-api-access-q6q4n\") pod \"nmstate-webhook-6cdbc54649-bk7lw\" (UID: \"f8f2e4a6-72f9-4eb0-9654-e07d0e3b3bf6\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bk7lw" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.367202 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8ghx\" (UniqueName: \"kubernetes.io/projected/4d966a8c-9962-4542-9e72-fbd4959508e6-kube-api-access-c8ghx\") pod \"nmstate-handler-x6h8j\" (UID: \"4d966a8c-9962-4542-9e72-fbd4959508e6\") " pod="openshift-nmstate/nmstate-handler-x6h8j" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.367277 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnjlz\" (UniqueName: \"kubernetes.io/projected/264aef9d-e55d-41e9-b2e4-055db900d371-kube-api-access-wnjlz\") pod \"nmstate-metrics-fdff9cb8d-cvrwm\" (UID: \"264aef9d-e55d-41e9-b2e4-055db900d371\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-cvrwm" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.367347 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/4d966a8c-9962-4542-9e72-fbd4959508e6-ovs-socket\") pod \"nmstate-handler-x6h8j\" (UID: \"4d966a8c-9962-4542-9e72-fbd4959508e6\") " pod="openshift-nmstate/nmstate-handler-x6h8j" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.367370 4715 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f8f2e4a6-72f9-4eb0-9654-e07d0e3b3bf6-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-bk7lw\" (UID: \"f8f2e4a6-72f9-4eb0-9654-e07d0e3b3bf6\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bk7lw" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.367404 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/4d966a8c-9962-4542-9e72-fbd4959508e6-nmstate-lock\") pod \"nmstate-handler-x6h8j\" (UID: \"4d966a8c-9962-4542-9e72-fbd4959508e6\") " pod="openshift-nmstate/nmstate-handler-x6h8j" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.367454 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/4d966a8c-9962-4542-9e72-fbd4959508e6-dbus-socket\") pod \"nmstate-handler-x6h8j\" (UID: \"4d966a8c-9962-4542-9e72-fbd4959508e6\") " pod="openshift-nmstate/nmstate-handler-x6h8j" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.408563 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-4pcfg"] Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.409250 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-4pcfg" Oct 09 07:57:41 crc kubenswrapper[4715]: W1009 07:57:41.412060 4715 reflector.go:561] object-"openshift-nmstate"/"default-dockercfg-v2lhq": failed to list *v1.Secret: secrets "default-dockercfg-v2lhq" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-nmstate": no relationship found between node 'crc' and this object Oct 09 07:57:41 crc kubenswrapper[4715]: E1009 07:57:41.412117 4715 reflector.go:158] "Unhandled Error" err="object-\"openshift-nmstate\"/\"default-dockercfg-v2lhq\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"default-dockercfg-v2lhq\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-nmstate\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 09 07:57:41 crc kubenswrapper[4715]: W1009 07:57:41.415268 4715 reflector.go:561] object-"openshift-nmstate"/"plugin-serving-cert": failed to list *v1.Secret: secrets "plugin-serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-nmstate": no relationship found between node 'crc' and this object Oct 09 07:57:41 crc kubenswrapper[4715]: E1009 07:57:41.415324 4715 reflector.go:158] "Unhandled Error" err="object-\"openshift-nmstate\"/\"plugin-serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"plugin-serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-nmstate\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.415413 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 
07:57:41.431110 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-4pcfg"] Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.468737 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/4d966a8c-9962-4542-9e72-fbd4959508e6-ovs-socket\") pod \"nmstate-handler-x6h8j\" (UID: \"4d966a8c-9962-4542-9e72-fbd4959508e6\") " pod="openshift-nmstate/nmstate-handler-x6h8j" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.468784 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f8f2e4a6-72f9-4eb0-9654-e07d0e3b3bf6-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-bk7lw\" (UID: \"f8f2e4a6-72f9-4eb0-9654-e07d0e3b3bf6\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bk7lw" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.468821 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/02f7ed0f-b6c8-412c-b89a-a0a42d82a72d-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-4pcfg\" (UID: \"02f7ed0f-b6c8-412c-b89a-a0a42d82a72d\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-4pcfg" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.468840 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/4d966a8c-9962-4542-9e72-fbd4959508e6-nmstate-lock\") pod \"nmstate-handler-x6h8j\" (UID: \"4d966a8c-9962-4542-9e72-fbd4959508e6\") " pod="openshift-nmstate/nmstate-handler-x6h8j" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.468862 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8kzm\" (UniqueName: 
\"kubernetes.io/projected/02f7ed0f-b6c8-412c-b89a-a0a42d82a72d-kube-api-access-r8kzm\") pod \"nmstate-console-plugin-6b874cbd85-4pcfg\" (UID: \"02f7ed0f-b6c8-412c-b89a-a0a42d82a72d\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-4pcfg" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.468881 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/4d966a8c-9962-4542-9e72-fbd4959508e6-dbus-socket\") pod \"nmstate-handler-x6h8j\" (UID: \"4d966a8c-9962-4542-9e72-fbd4959508e6\") " pod="openshift-nmstate/nmstate-handler-x6h8j" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.468911 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/02f7ed0f-b6c8-412c-b89a-a0a42d82a72d-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-4pcfg\" (UID: \"02f7ed0f-b6c8-412c-b89a-a0a42d82a72d\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-4pcfg" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.468912 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/4d966a8c-9962-4542-9e72-fbd4959508e6-ovs-socket\") pod \"nmstate-handler-x6h8j\" (UID: \"4d966a8c-9962-4542-9e72-fbd4959508e6\") " pod="openshift-nmstate/nmstate-handler-x6h8j" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.468946 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/4d966a8c-9962-4542-9e72-fbd4959508e6-nmstate-lock\") pod \"nmstate-handler-x6h8j\" (UID: \"4d966a8c-9962-4542-9e72-fbd4959508e6\") " pod="openshift-nmstate/nmstate-handler-x6h8j" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.468956 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6q4n\" (UniqueName: 
\"kubernetes.io/projected/f8f2e4a6-72f9-4eb0-9654-e07d0e3b3bf6-kube-api-access-q6q4n\") pod \"nmstate-webhook-6cdbc54649-bk7lw\" (UID: \"f8f2e4a6-72f9-4eb0-9654-e07d0e3b3bf6\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bk7lw" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.469021 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8ghx\" (UniqueName: \"kubernetes.io/projected/4d966a8c-9962-4542-9e72-fbd4959508e6-kube-api-access-c8ghx\") pod \"nmstate-handler-x6h8j\" (UID: \"4d966a8c-9962-4542-9e72-fbd4959508e6\") " pod="openshift-nmstate/nmstate-handler-x6h8j" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.469046 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnjlz\" (UniqueName: \"kubernetes.io/projected/264aef9d-e55d-41e9-b2e4-055db900d371-kube-api-access-wnjlz\") pod \"nmstate-metrics-fdff9cb8d-cvrwm\" (UID: \"264aef9d-e55d-41e9-b2e4-055db900d371\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-cvrwm" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.469152 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/4d966a8c-9962-4542-9e72-fbd4959508e6-dbus-socket\") pod \"nmstate-handler-x6h8j\" (UID: \"4d966a8c-9962-4542-9e72-fbd4959508e6\") " pod="openshift-nmstate/nmstate-handler-x6h8j" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.475012 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f8f2e4a6-72f9-4eb0-9654-e07d0e3b3bf6-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-bk7lw\" (UID: \"f8f2e4a6-72f9-4eb0-9654-e07d0e3b3bf6\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bk7lw" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.485272 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8ghx\" (UniqueName: 
\"kubernetes.io/projected/4d966a8c-9962-4542-9e72-fbd4959508e6-kube-api-access-c8ghx\") pod \"nmstate-handler-x6h8j\" (UID: \"4d966a8c-9962-4542-9e72-fbd4959508e6\") " pod="openshift-nmstate/nmstate-handler-x6h8j" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.485872 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6q4n\" (UniqueName: \"kubernetes.io/projected/f8f2e4a6-72f9-4eb0-9654-e07d0e3b3bf6-kube-api-access-q6q4n\") pod \"nmstate-webhook-6cdbc54649-bk7lw\" (UID: \"f8f2e4a6-72f9-4eb0-9654-e07d0e3b3bf6\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bk7lw" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.486176 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnjlz\" (UniqueName: \"kubernetes.io/projected/264aef9d-e55d-41e9-b2e4-055db900d371-kube-api-access-wnjlz\") pod \"nmstate-metrics-fdff9cb8d-cvrwm\" (UID: \"264aef9d-e55d-41e9-b2e4-055db900d371\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-cvrwm" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.569778 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8kzm\" (UniqueName: \"kubernetes.io/projected/02f7ed0f-b6c8-412c-b89a-a0a42d82a72d-kube-api-access-r8kzm\") pod \"nmstate-console-plugin-6b874cbd85-4pcfg\" (UID: \"02f7ed0f-b6c8-412c-b89a-a0a42d82a72d\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-4pcfg" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.569835 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/02f7ed0f-b6c8-412c-b89a-a0a42d82a72d-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-4pcfg\" (UID: \"02f7ed0f-b6c8-412c-b89a-a0a42d82a72d\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-4pcfg" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.569908 4715 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/02f7ed0f-b6c8-412c-b89a-a0a42d82a72d-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-4pcfg\" (UID: \"02f7ed0f-b6c8-412c-b89a-a0a42d82a72d\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-4pcfg" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.571347 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/02f7ed0f-b6c8-412c-b89a-a0a42d82a72d-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-4pcfg\" (UID: \"02f7ed0f-b6c8-412c-b89a-a0a42d82a72d\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-4pcfg" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.589479 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8kzm\" (UniqueName: \"kubernetes.io/projected/02f7ed0f-b6c8-412c-b89a-a0a42d82a72d-kube-api-access-r8kzm\") pod \"nmstate-console-plugin-6b874cbd85-4pcfg\" (UID: \"02f7ed0f-b6c8-412c-b89a-a0a42d82a72d\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-4pcfg" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.593132 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-cvrwm" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.612549 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6f555b5bf4-kvrzm"] Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.613590 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6f555b5bf4-kvrzm" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.615475 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bk7lw" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.629485 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-x6h8j" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.638354 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f555b5bf4-kvrzm"] Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.670259 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/876cdc2a-16cb-4619-a57a-7b8c1a818e7d-console-oauth-config\") pod \"console-6f555b5bf4-kvrzm\" (UID: \"876cdc2a-16cb-4619-a57a-7b8c1a818e7d\") " pod="openshift-console/console-6f555b5bf4-kvrzm" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.670335 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/876cdc2a-16cb-4619-a57a-7b8c1a818e7d-service-ca\") pod \"console-6f555b5bf4-kvrzm\" (UID: \"876cdc2a-16cb-4619-a57a-7b8c1a818e7d\") " pod="openshift-console/console-6f555b5bf4-kvrzm" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.670365 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/876cdc2a-16cb-4619-a57a-7b8c1a818e7d-console-config\") pod \"console-6f555b5bf4-kvrzm\" (UID: \"876cdc2a-16cb-4619-a57a-7b8c1a818e7d\") " pod="openshift-console/console-6f555b5bf4-kvrzm" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.670762 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t6hm\" (UniqueName: \"kubernetes.io/projected/876cdc2a-16cb-4619-a57a-7b8c1a818e7d-kube-api-access-7t6hm\") pod \"console-6f555b5bf4-kvrzm\" (UID: 
\"876cdc2a-16cb-4619-a57a-7b8c1a818e7d\") " pod="openshift-console/console-6f555b5bf4-kvrzm" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.670820 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/876cdc2a-16cb-4619-a57a-7b8c1a818e7d-console-serving-cert\") pod \"console-6f555b5bf4-kvrzm\" (UID: \"876cdc2a-16cb-4619-a57a-7b8c1a818e7d\") " pod="openshift-console/console-6f555b5bf4-kvrzm" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.670867 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/876cdc2a-16cb-4619-a57a-7b8c1a818e7d-trusted-ca-bundle\") pod \"console-6f555b5bf4-kvrzm\" (UID: \"876cdc2a-16cb-4619-a57a-7b8c1a818e7d\") " pod="openshift-console/console-6f555b5bf4-kvrzm" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.670952 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/876cdc2a-16cb-4619-a57a-7b8c1a818e7d-oauth-serving-cert\") pod \"console-6f555b5bf4-kvrzm\" (UID: \"876cdc2a-16cb-4619-a57a-7b8c1a818e7d\") " pod="openshift-console/console-6f555b5bf4-kvrzm" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.772560 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/876cdc2a-16cb-4619-a57a-7b8c1a818e7d-console-config\") pod \"console-6f555b5bf4-kvrzm\" (UID: \"876cdc2a-16cb-4619-a57a-7b8c1a818e7d\") " pod="openshift-console/console-6f555b5bf4-kvrzm" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.772634 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t6hm\" (UniqueName: \"kubernetes.io/projected/876cdc2a-16cb-4619-a57a-7b8c1a818e7d-kube-api-access-7t6hm\") 
pod \"console-6f555b5bf4-kvrzm\" (UID: \"876cdc2a-16cb-4619-a57a-7b8c1a818e7d\") " pod="openshift-console/console-6f555b5bf4-kvrzm" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.772671 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/876cdc2a-16cb-4619-a57a-7b8c1a818e7d-console-serving-cert\") pod \"console-6f555b5bf4-kvrzm\" (UID: \"876cdc2a-16cb-4619-a57a-7b8c1a818e7d\") " pod="openshift-console/console-6f555b5bf4-kvrzm" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.772686 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/876cdc2a-16cb-4619-a57a-7b8c1a818e7d-trusted-ca-bundle\") pod \"console-6f555b5bf4-kvrzm\" (UID: \"876cdc2a-16cb-4619-a57a-7b8c1a818e7d\") " pod="openshift-console/console-6f555b5bf4-kvrzm" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.772708 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/876cdc2a-16cb-4619-a57a-7b8c1a818e7d-oauth-serving-cert\") pod \"console-6f555b5bf4-kvrzm\" (UID: \"876cdc2a-16cb-4619-a57a-7b8c1a818e7d\") " pod="openshift-console/console-6f555b5bf4-kvrzm" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.772735 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/876cdc2a-16cb-4619-a57a-7b8c1a818e7d-console-oauth-config\") pod \"console-6f555b5bf4-kvrzm\" (UID: \"876cdc2a-16cb-4619-a57a-7b8c1a818e7d\") " pod="openshift-console/console-6f555b5bf4-kvrzm" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.772765 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/876cdc2a-16cb-4619-a57a-7b8c1a818e7d-service-ca\") pod 
\"console-6f555b5bf4-kvrzm\" (UID: \"876cdc2a-16cb-4619-a57a-7b8c1a818e7d\") " pod="openshift-console/console-6f555b5bf4-kvrzm" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.775465 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/876cdc2a-16cb-4619-a57a-7b8c1a818e7d-service-ca\") pod \"console-6f555b5bf4-kvrzm\" (UID: \"876cdc2a-16cb-4619-a57a-7b8c1a818e7d\") " pod="openshift-console/console-6f555b5bf4-kvrzm" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.775565 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/876cdc2a-16cb-4619-a57a-7b8c1a818e7d-console-config\") pod \"console-6f555b5bf4-kvrzm\" (UID: \"876cdc2a-16cb-4619-a57a-7b8c1a818e7d\") " pod="openshift-console/console-6f555b5bf4-kvrzm" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.775691 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/876cdc2a-16cb-4619-a57a-7b8c1a818e7d-oauth-serving-cert\") pod \"console-6f555b5bf4-kvrzm\" (UID: \"876cdc2a-16cb-4619-a57a-7b8c1a818e7d\") " pod="openshift-console/console-6f555b5bf4-kvrzm" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.777251 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/876cdc2a-16cb-4619-a57a-7b8c1a818e7d-console-serving-cert\") pod \"console-6f555b5bf4-kvrzm\" (UID: \"876cdc2a-16cb-4619-a57a-7b8c1a818e7d\") " pod="openshift-console/console-6f555b5bf4-kvrzm" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.778356 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/876cdc2a-16cb-4619-a57a-7b8c1a818e7d-console-oauth-config\") pod \"console-6f555b5bf4-kvrzm\" (UID: \"876cdc2a-16cb-4619-a57a-7b8c1a818e7d\") 
" pod="openshift-console/console-6f555b5bf4-kvrzm" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.781369 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/876cdc2a-16cb-4619-a57a-7b8c1a818e7d-trusted-ca-bundle\") pod \"console-6f555b5bf4-kvrzm\" (UID: \"876cdc2a-16cb-4619-a57a-7b8c1a818e7d\") " pod="openshift-console/console-6f555b5bf4-kvrzm" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.789207 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t6hm\" (UniqueName: \"kubernetes.io/projected/876cdc2a-16cb-4619-a57a-7b8c1a818e7d-kube-api-access-7t6hm\") pod \"console-6f555b5bf4-kvrzm\" (UID: \"876cdc2a-16cb-4619-a57a-7b8c1a818e7d\") " pod="openshift-console/console-6f555b5bf4-kvrzm" Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.874710 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-bk7lw"] Oct 09 07:57:41 crc kubenswrapper[4715]: I1009 07:57:41.971021 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6f555b5bf4-kvrzm" Oct 09 07:57:42 crc kubenswrapper[4715]: I1009 07:57:42.018815 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-cvrwm"] Oct 09 07:57:42 crc kubenswrapper[4715]: W1009 07:57:42.030217 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod264aef9d_e55d_41e9_b2e4_055db900d371.slice/crio-1d21e9aace229ecb4a47e18104277b00e0463422b69c6132268930961e3c6f18 WatchSource:0}: Error finding container 1d21e9aace229ecb4a47e18104277b00e0463422b69c6132268930961e3c6f18: Status 404 returned error can't find the container with id 1d21e9aace229ecb4a47e18104277b00e0463422b69c6132268930961e3c6f18 Oct 09 07:57:42 crc kubenswrapper[4715]: I1009 07:57:42.162273 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f555b5bf4-kvrzm"] Oct 09 07:57:42 crc kubenswrapper[4715]: W1009 07:57:42.168180 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod876cdc2a_16cb_4619_a57a_7b8c1a818e7d.slice/crio-93237a7e3cbb2ee845f0a4ebb7b30310b99f64ab41f974c4a6f14aa516f1d15e WatchSource:0}: Error finding container 93237a7e3cbb2ee845f0a4ebb7b30310b99f64ab41f974c4a6f14aa516f1d15e: Status 404 returned error can't find the container with id 93237a7e3cbb2ee845f0a4ebb7b30310b99f64ab41f974c4a6f14aa516f1d15e Oct 09 07:57:42 crc kubenswrapper[4715]: I1009 07:57:42.245193 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bk7lw" event={"ID":"f8f2e4a6-72f9-4eb0-9654-e07d0e3b3bf6","Type":"ContainerStarted","Data":"e8000344b60ecffe71a532e49872d31ad661347f99b8c73b1a5338d4a83bd883"} Oct 09 07:57:42 crc kubenswrapper[4715]: I1009 07:57:42.246059 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f555b5bf4-kvrzm" 
event={"ID":"876cdc2a-16cb-4619-a57a-7b8c1a818e7d","Type":"ContainerStarted","Data":"93237a7e3cbb2ee845f0a4ebb7b30310b99f64ab41f974c4a6f14aa516f1d15e"} Oct 09 07:57:42 crc kubenswrapper[4715]: I1009 07:57:42.246940 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-x6h8j" event={"ID":"4d966a8c-9962-4542-9e72-fbd4959508e6","Type":"ContainerStarted","Data":"7b77390a18f92d9347ced2b6acd60666238c6edb0e3283c490693fd2fe3cf092"} Oct 09 07:57:42 crc kubenswrapper[4715]: I1009 07:57:42.247785 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-cvrwm" event={"ID":"264aef9d-e55d-41e9-b2e4-055db900d371","Type":"ContainerStarted","Data":"1d21e9aace229ecb4a47e18104277b00e0463422b69c6132268930961e3c6f18"} Oct 09 07:57:42 crc kubenswrapper[4715]: I1009 07:57:42.439489 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 09 07:57:42 crc kubenswrapper[4715]: I1009 07:57:42.445342 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/02f7ed0f-b6c8-412c-b89a-a0a42d82a72d-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-4pcfg\" (UID: \"02f7ed0f-b6c8-412c-b89a-a0a42d82a72d\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-4pcfg" Oct 09 07:57:42 crc kubenswrapper[4715]: I1009 07:57:42.626629 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-v2lhq" Oct 09 07:57:42 crc kubenswrapper[4715]: I1009 07:57:42.631948 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-4pcfg" Oct 09 07:57:42 crc kubenswrapper[4715]: I1009 07:57:42.814988 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-4pcfg"] Oct 09 07:57:42 crc kubenswrapper[4715]: W1009 07:57:42.822578 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02f7ed0f_b6c8_412c_b89a_a0a42d82a72d.slice/crio-143789ab1ccb838181a4b18f62716832a27bdd79cb7dd9c8c2d049b35ea5b9c2 WatchSource:0}: Error finding container 143789ab1ccb838181a4b18f62716832a27bdd79cb7dd9c8c2d049b35ea5b9c2: Status 404 returned error can't find the container with id 143789ab1ccb838181a4b18f62716832a27bdd79cb7dd9c8c2d049b35ea5b9c2 Oct 09 07:57:43 crc kubenswrapper[4715]: I1009 07:57:43.255093 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f555b5bf4-kvrzm" event={"ID":"876cdc2a-16cb-4619-a57a-7b8c1a818e7d","Type":"ContainerStarted","Data":"aa6f8b18697e35dd7155dc972d54bf703990f3955ffcdb8e08abb35d2cc51cea"} Oct 09 07:57:43 crc kubenswrapper[4715]: I1009 07:57:43.256279 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-4pcfg" event={"ID":"02f7ed0f-b6c8-412c-b89a-a0a42d82a72d","Type":"ContainerStarted","Data":"143789ab1ccb838181a4b18f62716832a27bdd79cb7dd9c8c2d049b35ea5b9c2"} Oct 09 07:57:43 crc kubenswrapper[4715]: I1009 07:57:43.274856 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6f555b5bf4-kvrzm" podStartSLOduration=2.274826493 podStartE2EDuration="2.274826493s" podCreationTimestamp="2025-10-09 07:57:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:57:43.271146716 +0000 UTC m=+693.963950734" watchObservedRunningTime="2025-10-09 
07:57:43.274826493 +0000 UTC m=+693.967630501" Oct 09 07:57:45 crc kubenswrapper[4715]: I1009 07:57:45.269201 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-cvrwm" event={"ID":"264aef9d-e55d-41e9-b2e4-055db900d371","Type":"ContainerStarted","Data":"93a6348ad1c9feac52c899b37d1d2d3a5a95b32cde9caa8686ade453b2266369"} Oct 09 07:57:45 crc kubenswrapper[4715]: I1009 07:57:45.270863 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bk7lw" event={"ID":"f8f2e4a6-72f9-4eb0-9654-e07d0e3b3bf6","Type":"ContainerStarted","Data":"2403107c80bc86d0a60aa08bd49bd540cd9e30df673d4ab3562d1c9ece4d7d88"} Oct 09 07:57:45 crc kubenswrapper[4715]: I1009 07:57:45.271021 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bk7lw" Oct 09 07:57:45 crc kubenswrapper[4715]: I1009 07:57:45.272189 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-x6h8j" event={"ID":"4d966a8c-9962-4542-9e72-fbd4959508e6","Type":"ContainerStarted","Data":"dffc3ece61b4e0868b9122f4c7b4032b15efc821f0ccdb7d3eede63869b7e305"} Oct 09 07:57:45 crc kubenswrapper[4715]: I1009 07:57:45.272365 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-x6h8j" Oct 09 07:57:45 crc kubenswrapper[4715]: I1009 07:57:45.291860 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bk7lw" podStartSLOduration=1.377342998 podStartE2EDuration="4.291832958s" podCreationTimestamp="2025-10-09 07:57:41 +0000 UTC" firstStartedPulling="2025-10-09 07:57:41.92927733 +0000 UTC m=+692.622081338" lastFinishedPulling="2025-10-09 07:57:44.84376729 +0000 UTC m=+695.536571298" observedRunningTime="2025-10-09 07:57:45.288224043 +0000 UTC m=+695.981028051" watchObservedRunningTime="2025-10-09 07:57:45.291832958 +0000 UTC 
m=+695.984636966" Oct 09 07:57:45 crc kubenswrapper[4715]: I1009 07:57:45.308910 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-x6h8j" podStartSLOduration=1.224421217 podStartE2EDuration="4.308888746s" podCreationTimestamp="2025-10-09 07:57:41 +0000 UTC" firstStartedPulling="2025-10-09 07:57:41.672077388 +0000 UTC m=+692.364881396" lastFinishedPulling="2025-10-09 07:57:44.756544917 +0000 UTC m=+695.449348925" observedRunningTime="2025-10-09 07:57:45.30561522 +0000 UTC m=+695.998419238" watchObservedRunningTime="2025-10-09 07:57:45.308888746 +0000 UTC m=+696.001692754" Oct 09 07:57:46 crc kubenswrapper[4715]: I1009 07:57:46.279577 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-4pcfg" event={"ID":"02f7ed0f-b6c8-412c-b89a-a0a42d82a72d","Type":"ContainerStarted","Data":"452bc927b5b05584f9fbc5d19fb2e5d41a3b16e47cf6f487bf91871d4aaf4e66"} Oct 09 07:57:46 crc kubenswrapper[4715]: I1009 07:57:46.296378 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-4pcfg" podStartSLOduration=2.420937415 podStartE2EDuration="5.296351736s" podCreationTimestamp="2025-10-09 07:57:41 +0000 UTC" firstStartedPulling="2025-10-09 07:57:42.825412306 +0000 UTC m=+693.518216304" lastFinishedPulling="2025-10-09 07:57:45.700826627 +0000 UTC m=+696.393630625" observedRunningTime="2025-10-09 07:57:46.29443309 +0000 UTC m=+696.987237098" watchObservedRunningTime="2025-10-09 07:57:46.296351736 +0000 UTC m=+696.989155744" Oct 09 07:57:47 crc kubenswrapper[4715]: I1009 07:57:47.284891 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-cvrwm" event={"ID":"264aef9d-e55d-41e9-b2e4-055db900d371","Type":"ContainerStarted","Data":"086612b102454fa6867eb0ac39971be238a4c8b74c64dc27a300506b840ec7dc"} Oct 09 07:57:47 crc kubenswrapper[4715]: I1009 
07:57:47.306663 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-cvrwm" podStartSLOduration=1.5041217759999999 podStartE2EDuration="6.306635471s" podCreationTimestamp="2025-10-09 07:57:41 +0000 UTC" firstStartedPulling="2025-10-09 07:57:42.032199152 +0000 UTC m=+692.725003160" lastFinishedPulling="2025-10-09 07:57:46.834712847 +0000 UTC m=+697.527516855" observedRunningTime="2025-10-09 07:57:47.306361443 +0000 UTC m=+697.999165461" watchObservedRunningTime="2025-10-09 07:57:47.306635471 +0000 UTC m=+697.999439479" Oct 09 07:57:51 crc kubenswrapper[4715]: I1009 07:57:51.653092 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-x6h8j" Oct 09 07:57:51 crc kubenswrapper[4715]: I1009 07:57:51.971860 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6f555b5bf4-kvrzm" Oct 09 07:57:51 crc kubenswrapper[4715]: I1009 07:57:51.972010 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6f555b5bf4-kvrzm" Oct 09 07:57:51 crc kubenswrapper[4715]: I1009 07:57:51.978918 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6f555b5bf4-kvrzm" Oct 09 07:57:52 crc kubenswrapper[4715]: I1009 07:57:52.318595 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6f555b5bf4-kvrzm" Oct 09 07:57:52 crc kubenswrapper[4715]: I1009 07:57:52.386977 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-5fdhg"] Oct 09 07:58:01 crc kubenswrapper[4715]: I1009 07:58:01.626962 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bk7lw" Oct 09 07:58:15 crc kubenswrapper[4715]: I1009 07:58:15.311587 4715 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr"] Oct 09 07:58:15 crc kubenswrapper[4715]: I1009 07:58:15.313276 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr" Oct 09 07:58:15 crc kubenswrapper[4715]: I1009 07:58:15.315039 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 09 07:58:15 crc kubenswrapper[4715]: I1009 07:58:15.322961 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr"] Oct 09 07:58:15 crc kubenswrapper[4715]: I1009 07:58:15.415463 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f0cfb14d-8aa5-4841-9f91-dd632d372e18-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr\" (UID: \"f0cfb14d-8aa5-4841-9f91-dd632d372e18\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr" Oct 09 07:58:15 crc kubenswrapper[4715]: I1009 07:58:15.415569 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f0cfb14d-8aa5-4841-9f91-dd632d372e18-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr\" (UID: \"f0cfb14d-8aa5-4841-9f91-dd632d372e18\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr" Oct 09 07:58:15 crc kubenswrapper[4715]: I1009 07:58:15.415785 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gzm2\" (UniqueName: \"kubernetes.io/projected/f0cfb14d-8aa5-4841-9f91-dd632d372e18-kube-api-access-7gzm2\") pod 
\"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr\" (UID: \"f0cfb14d-8aa5-4841-9f91-dd632d372e18\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr" Oct 09 07:58:15 crc kubenswrapper[4715]: I1009 07:58:15.517322 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f0cfb14d-8aa5-4841-9f91-dd632d372e18-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr\" (UID: \"f0cfb14d-8aa5-4841-9f91-dd632d372e18\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr" Oct 09 07:58:15 crc kubenswrapper[4715]: I1009 07:58:15.517517 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f0cfb14d-8aa5-4841-9f91-dd632d372e18-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr\" (UID: \"f0cfb14d-8aa5-4841-9f91-dd632d372e18\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr" Oct 09 07:58:15 crc kubenswrapper[4715]: I1009 07:58:15.517631 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gzm2\" (UniqueName: \"kubernetes.io/projected/f0cfb14d-8aa5-4841-9f91-dd632d372e18-kube-api-access-7gzm2\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr\" (UID: \"f0cfb14d-8aa5-4841-9f91-dd632d372e18\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr" Oct 09 07:58:15 crc kubenswrapper[4715]: I1009 07:58:15.517842 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f0cfb14d-8aa5-4841-9f91-dd632d372e18-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr\" (UID: \"f0cfb14d-8aa5-4841-9f91-dd632d372e18\") " 
pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr" Oct 09 07:58:15 crc kubenswrapper[4715]: I1009 07:58:15.518025 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f0cfb14d-8aa5-4841-9f91-dd632d372e18-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr\" (UID: \"f0cfb14d-8aa5-4841-9f91-dd632d372e18\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr" Oct 09 07:58:15 crc kubenswrapper[4715]: I1009 07:58:15.543643 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gzm2\" (UniqueName: \"kubernetes.io/projected/f0cfb14d-8aa5-4841-9f91-dd632d372e18-kube-api-access-7gzm2\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr\" (UID: \"f0cfb14d-8aa5-4841-9f91-dd632d372e18\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr" Oct 09 07:58:15 crc kubenswrapper[4715]: I1009 07:58:15.634391 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr" Oct 09 07:58:15 crc kubenswrapper[4715]: I1009 07:58:15.924059 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr"] Oct 09 07:58:15 crc kubenswrapper[4715]: W1009 07:58:15.934671 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0cfb14d_8aa5_4841_9f91_dd632d372e18.slice/crio-66970d1f3350857bfc1c59c8b099376aa3fcd7d5655172a97edfb0ef99723b0e WatchSource:0}: Error finding container 66970d1f3350857bfc1c59c8b099376aa3fcd7d5655172a97edfb0ef99723b0e: Status 404 returned error can't find the container with id 66970d1f3350857bfc1c59c8b099376aa3fcd7d5655172a97edfb0ef99723b0e Oct 09 07:58:16 crc kubenswrapper[4715]: I1009 07:58:16.473126 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr" event={"ID":"f0cfb14d-8aa5-4841-9f91-dd632d372e18","Type":"ContainerStarted","Data":"66970d1f3350857bfc1c59c8b099376aa3fcd7d5655172a97edfb0ef99723b0e"} Oct 09 07:58:17 crc kubenswrapper[4715]: I1009 07:58:17.439581 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-5fdhg" podUID="3c1c9983-60a8-4db2-866c-15deb7220cb9" containerName="console" containerID="cri-o://dd4a1f0625d6ec57d533abbb7d850dfe0a384df33a2835188c4ce243c37068d2" gracePeriod=15 Oct 09 07:58:17 crc kubenswrapper[4715]: I1009 07:58:17.492879 4715 generic.go:334] "Generic (PLEG): container finished" podID="f0cfb14d-8aa5-4841-9f91-dd632d372e18" containerID="1d61c6980d9edcd267c569caa9902a678b85d10708e76d95b6db95180834fba4" exitCode=0 Oct 09 07:58:17 crc kubenswrapper[4715]: I1009 07:58:17.493065 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr" event={"ID":"f0cfb14d-8aa5-4841-9f91-dd632d372e18","Type":"ContainerDied","Data":"1d61c6980d9edcd267c569caa9902a678b85d10708e76d95b6db95180834fba4"} Oct 09 07:58:17 crc kubenswrapper[4715]: I1009 07:58:17.985640 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-5fdhg_3c1c9983-60a8-4db2-866c-15deb7220cb9/console/0.log" Oct 09 07:58:17 crc kubenswrapper[4715]: I1009 07:58:17.985719 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-5fdhg" Oct 09 07:58:18 crc kubenswrapper[4715]: I1009 07:58:18.052825 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3c1c9983-60a8-4db2-866c-15deb7220cb9-oauth-serving-cert\") pod \"3c1c9983-60a8-4db2-866c-15deb7220cb9\" (UID: \"3c1c9983-60a8-4db2-866c-15deb7220cb9\") " Oct 09 07:58:18 crc kubenswrapper[4715]: I1009 07:58:18.052946 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3c1c9983-60a8-4db2-866c-15deb7220cb9-console-serving-cert\") pod \"3c1c9983-60a8-4db2-866c-15deb7220cb9\" (UID: \"3c1c9983-60a8-4db2-866c-15deb7220cb9\") " Oct 09 07:58:18 crc kubenswrapper[4715]: I1009 07:58:18.053008 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3c1c9983-60a8-4db2-866c-15deb7220cb9-service-ca\") pod \"3c1c9983-60a8-4db2-866c-15deb7220cb9\" (UID: \"3c1c9983-60a8-4db2-866c-15deb7220cb9\") " Oct 09 07:58:18 crc kubenswrapper[4715]: I1009 07:58:18.053052 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qm46\" (UniqueName: \"kubernetes.io/projected/3c1c9983-60a8-4db2-866c-15deb7220cb9-kube-api-access-6qm46\") 
pod \"3c1c9983-60a8-4db2-866c-15deb7220cb9\" (UID: \"3c1c9983-60a8-4db2-866c-15deb7220cb9\") " Oct 09 07:58:18 crc kubenswrapper[4715]: I1009 07:58:18.053130 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c1c9983-60a8-4db2-866c-15deb7220cb9-trusted-ca-bundle\") pod \"3c1c9983-60a8-4db2-866c-15deb7220cb9\" (UID: \"3c1c9983-60a8-4db2-866c-15deb7220cb9\") " Oct 09 07:58:18 crc kubenswrapper[4715]: I1009 07:58:18.053173 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3c1c9983-60a8-4db2-866c-15deb7220cb9-console-oauth-config\") pod \"3c1c9983-60a8-4db2-866c-15deb7220cb9\" (UID: \"3c1c9983-60a8-4db2-866c-15deb7220cb9\") " Oct 09 07:58:18 crc kubenswrapper[4715]: I1009 07:58:18.053218 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3c1c9983-60a8-4db2-866c-15deb7220cb9-console-config\") pod \"3c1c9983-60a8-4db2-866c-15deb7220cb9\" (UID: \"3c1c9983-60a8-4db2-866c-15deb7220cb9\") " Oct 09 07:58:18 crc kubenswrapper[4715]: I1009 07:58:18.054068 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c1c9983-60a8-4db2-866c-15deb7220cb9-service-ca" (OuterVolumeSpecName: "service-ca") pod "3c1c9983-60a8-4db2-866c-15deb7220cb9" (UID: "3c1c9983-60a8-4db2-866c-15deb7220cb9"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:58:18 crc kubenswrapper[4715]: I1009 07:58:18.054080 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c1c9983-60a8-4db2-866c-15deb7220cb9-console-config" (OuterVolumeSpecName: "console-config") pod "3c1c9983-60a8-4db2-866c-15deb7220cb9" (UID: "3c1c9983-60a8-4db2-866c-15deb7220cb9"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:58:18 crc kubenswrapper[4715]: I1009 07:58:18.054099 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c1c9983-60a8-4db2-866c-15deb7220cb9-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "3c1c9983-60a8-4db2-866c-15deb7220cb9" (UID: "3c1c9983-60a8-4db2-866c-15deb7220cb9"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:58:18 crc kubenswrapper[4715]: I1009 07:58:18.054808 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c1c9983-60a8-4db2-866c-15deb7220cb9-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "3c1c9983-60a8-4db2-866c-15deb7220cb9" (UID: "3c1c9983-60a8-4db2-866c-15deb7220cb9"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:58:18 crc kubenswrapper[4715]: I1009 07:58:18.059155 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c1c9983-60a8-4db2-866c-15deb7220cb9-kube-api-access-6qm46" (OuterVolumeSpecName: "kube-api-access-6qm46") pod "3c1c9983-60a8-4db2-866c-15deb7220cb9" (UID: "3c1c9983-60a8-4db2-866c-15deb7220cb9"). InnerVolumeSpecName "kube-api-access-6qm46". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:58:18 crc kubenswrapper[4715]: I1009 07:58:18.059386 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c1c9983-60a8-4db2-866c-15deb7220cb9-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "3c1c9983-60a8-4db2-866c-15deb7220cb9" (UID: "3c1c9983-60a8-4db2-866c-15deb7220cb9"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:58:18 crc kubenswrapper[4715]: I1009 07:58:18.059493 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c1c9983-60a8-4db2-866c-15deb7220cb9-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "3c1c9983-60a8-4db2-866c-15deb7220cb9" (UID: "3c1c9983-60a8-4db2-866c-15deb7220cb9"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:58:18 crc kubenswrapper[4715]: I1009 07:58:18.154728 4715 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3c1c9983-60a8-4db2-866c-15deb7220cb9-service-ca\") on node \"crc\" DevicePath \"\"" Oct 09 07:58:18 crc kubenswrapper[4715]: I1009 07:58:18.154780 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qm46\" (UniqueName: \"kubernetes.io/projected/3c1c9983-60a8-4db2-866c-15deb7220cb9-kube-api-access-6qm46\") on node \"crc\" DevicePath \"\"" Oct 09 07:58:18 crc kubenswrapper[4715]: I1009 07:58:18.154801 4715 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c1c9983-60a8-4db2-866c-15deb7220cb9-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 07:58:18 crc kubenswrapper[4715]: I1009 07:58:18.154822 4715 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3c1c9983-60a8-4db2-866c-15deb7220cb9-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 09 07:58:18 crc kubenswrapper[4715]: I1009 07:58:18.154838 4715 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3c1c9983-60a8-4db2-866c-15deb7220cb9-console-config\") on node \"crc\" DevicePath \"\"" Oct 09 07:58:18 crc kubenswrapper[4715]: I1009 07:58:18.154854 4715 reconciler_common.go:293] "Volume detached for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3c1c9983-60a8-4db2-866c-15deb7220cb9-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 07:58:18 crc kubenswrapper[4715]: I1009 07:58:18.154872 4715 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3c1c9983-60a8-4db2-866c-15deb7220cb9-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 07:58:18 crc kubenswrapper[4715]: I1009 07:58:18.501947 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-5fdhg_3c1c9983-60a8-4db2-866c-15deb7220cb9/console/0.log" Oct 09 07:58:18 crc kubenswrapper[4715]: I1009 07:58:18.502009 4715 generic.go:334] "Generic (PLEG): container finished" podID="3c1c9983-60a8-4db2-866c-15deb7220cb9" containerID="dd4a1f0625d6ec57d533abbb7d850dfe0a384df33a2835188c4ce243c37068d2" exitCode=2 Oct 09 07:58:18 crc kubenswrapper[4715]: I1009 07:58:18.502073 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-5fdhg" event={"ID":"3c1c9983-60a8-4db2-866c-15deb7220cb9","Type":"ContainerDied","Data":"dd4a1f0625d6ec57d533abbb7d850dfe0a384df33a2835188c4ce243c37068d2"} Oct 09 07:58:18 crc kubenswrapper[4715]: I1009 07:58:18.502128 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-5fdhg" event={"ID":"3c1c9983-60a8-4db2-866c-15deb7220cb9","Type":"ContainerDied","Data":"7e1045475638b72714fbbd43e44560d678ccb4e0085a62d861bae5749886dc94"} Oct 09 07:58:18 crc kubenswrapper[4715]: I1009 07:58:18.502150 4715 scope.go:117] "RemoveContainer" containerID="dd4a1f0625d6ec57d533abbb7d850dfe0a384df33a2835188c4ce243c37068d2" Oct 09 07:58:18 crc kubenswrapper[4715]: I1009 07:58:18.502152 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-5fdhg" Oct 09 07:58:18 crc kubenswrapper[4715]: I1009 07:58:18.527736 4715 scope.go:117] "RemoveContainer" containerID="dd4a1f0625d6ec57d533abbb7d850dfe0a384df33a2835188c4ce243c37068d2" Oct 09 07:58:18 crc kubenswrapper[4715]: E1009 07:58:18.528957 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd4a1f0625d6ec57d533abbb7d850dfe0a384df33a2835188c4ce243c37068d2\": container with ID starting with dd4a1f0625d6ec57d533abbb7d850dfe0a384df33a2835188c4ce243c37068d2 not found: ID does not exist" containerID="dd4a1f0625d6ec57d533abbb7d850dfe0a384df33a2835188c4ce243c37068d2" Oct 09 07:58:18 crc kubenswrapper[4715]: I1009 07:58:18.529005 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd4a1f0625d6ec57d533abbb7d850dfe0a384df33a2835188c4ce243c37068d2"} err="failed to get container status \"dd4a1f0625d6ec57d533abbb7d850dfe0a384df33a2835188c4ce243c37068d2\": rpc error: code = NotFound desc = could not find container \"dd4a1f0625d6ec57d533abbb7d850dfe0a384df33a2835188c4ce243c37068d2\": container with ID starting with dd4a1f0625d6ec57d533abbb7d850dfe0a384df33a2835188c4ce243c37068d2 not found: ID does not exist" Oct 09 07:58:18 crc kubenswrapper[4715]: I1009 07:58:18.529851 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-5fdhg"] Oct 09 07:58:18 crc kubenswrapper[4715]: I1009 07:58:18.535231 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-5fdhg"] Oct 09 07:58:19 crc kubenswrapper[4715]: I1009 07:58:19.515010 4715 generic.go:334] "Generic (PLEG): container finished" podID="f0cfb14d-8aa5-4841-9f91-dd632d372e18" containerID="635458dac444b63b4ce33fd8ce599065658d878de475ddfacf4d3ab19c8958a6" exitCode=0 Oct 09 07:58:19 crc kubenswrapper[4715]: I1009 07:58:19.515077 4715 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr" event={"ID":"f0cfb14d-8aa5-4841-9f91-dd632d372e18","Type":"ContainerDied","Data":"635458dac444b63b4ce33fd8ce599065658d878de475ddfacf4d3ab19c8958a6"} Oct 09 07:58:20 crc kubenswrapper[4715]: I1009 07:58:20.145297 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c1c9983-60a8-4db2-866c-15deb7220cb9" path="/var/lib/kubelet/pods/3c1c9983-60a8-4db2-866c-15deb7220cb9/volumes" Oct 09 07:58:20 crc kubenswrapper[4715]: I1009 07:58:20.525639 4715 generic.go:334] "Generic (PLEG): container finished" podID="f0cfb14d-8aa5-4841-9f91-dd632d372e18" containerID="3884a7a3f2995eaccb4c75e2668f0c7ad7852980c02c0216fdb0c5cb99471d8a" exitCode=0 Oct 09 07:58:20 crc kubenswrapper[4715]: I1009 07:58:20.525718 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr" event={"ID":"f0cfb14d-8aa5-4841-9f91-dd632d372e18","Type":"ContainerDied","Data":"3884a7a3f2995eaccb4c75e2668f0c7ad7852980c02c0216fdb0c5cb99471d8a"} Oct 09 07:58:21 crc kubenswrapper[4715]: I1009 07:58:21.755170 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr" Oct 09 07:58:21 crc kubenswrapper[4715]: I1009 07:58:21.809164 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gzm2\" (UniqueName: \"kubernetes.io/projected/f0cfb14d-8aa5-4841-9f91-dd632d372e18-kube-api-access-7gzm2\") pod \"f0cfb14d-8aa5-4841-9f91-dd632d372e18\" (UID: \"f0cfb14d-8aa5-4841-9f91-dd632d372e18\") " Oct 09 07:58:21 crc kubenswrapper[4715]: I1009 07:58:21.809283 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f0cfb14d-8aa5-4841-9f91-dd632d372e18-bundle\") pod \"f0cfb14d-8aa5-4841-9f91-dd632d372e18\" (UID: \"f0cfb14d-8aa5-4841-9f91-dd632d372e18\") " Oct 09 07:58:21 crc kubenswrapper[4715]: I1009 07:58:21.809336 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f0cfb14d-8aa5-4841-9f91-dd632d372e18-util\") pod \"f0cfb14d-8aa5-4841-9f91-dd632d372e18\" (UID: \"f0cfb14d-8aa5-4841-9f91-dd632d372e18\") " Oct 09 07:58:21 crc kubenswrapper[4715]: I1009 07:58:21.810986 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0cfb14d-8aa5-4841-9f91-dd632d372e18-bundle" (OuterVolumeSpecName: "bundle") pod "f0cfb14d-8aa5-4841-9f91-dd632d372e18" (UID: "f0cfb14d-8aa5-4841-9f91-dd632d372e18"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 07:58:21 crc kubenswrapper[4715]: I1009 07:58:21.815554 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0cfb14d-8aa5-4841-9f91-dd632d372e18-kube-api-access-7gzm2" (OuterVolumeSpecName: "kube-api-access-7gzm2") pod "f0cfb14d-8aa5-4841-9f91-dd632d372e18" (UID: "f0cfb14d-8aa5-4841-9f91-dd632d372e18"). InnerVolumeSpecName "kube-api-access-7gzm2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:58:21 crc kubenswrapper[4715]: I1009 07:58:21.832362 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0cfb14d-8aa5-4841-9f91-dd632d372e18-util" (OuterVolumeSpecName: "util") pod "f0cfb14d-8aa5-4841-9f91-dd632d372e18" (UID: "f0cfb14d-8aa5-4841-9f91-dd632d372e18"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 07:58:21 crc kubenswrapper[4715]: I1009 07:58:21.911098 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gzm2\" (UniqueName: \"kubernetes.io/projected/f0cfb14d-8aa5-4841-9f91-dd632d372e18-kube-api-access-7gzm2\") on node \"crc\" DevicePath \"\"" Oct 09 07:58:21 crc kubenswrapper[4715]: I1009 07:58:21.911135 4715 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f0cfb14d-8aa5-4841-9f91-dd632d372e18-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 07:58:21 crc kubenswrapper[4715]: I1009 07:58:21.911144 4715 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f0cfb14d-8aa5-4841-9f91-dd632d372e18-util\") on node \"crc\" DevicePath \"\"" Oct 09 07:58:22 crc kubenswrapper[4715]: I1009 07:58:22.543269 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr" event={"ID":"f0cfb14d-8aa5-4841-9f91-dd632d372e18","Type":"ContainerDied","Data":"66970d1f3350857bfc1c59c8b099376aa3fcd7d5655172a97edfb0ef99723b0e"} Oct 09 07:58:22 crc kubenswrapper[4715]: I1009 07:58:22.543354 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66970d1f3350857bfc1c59c8b099376aa3fcd7d5655172a97edfb0ef99723b0e" Oct 09 07:58:22 crc kubenswrapper[4715]: I1009 07:58:22.543552 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr" Oct 09 07:58:31 crc kubenswrapper[4715]: I1009 07:58:31.479189 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-798678874c-cjtmr"] Oct 09 07:58:31 crc kubenswrapper[4715]: E1009 07:58:31.480044 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c1c9983-60a8-4db2-866c-15deb7220cb9" containerName="console" Oct 09 07:58:31 crc kubenswrapper[4715]: I1009 07:58:31.480061 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c1c9983-60a8-4db2-866c-15deb7220cb9" containerName="console" Oct 09 07:58:31 crc kubenswrapper[4715]: E1009 07:58:31.480074 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0cfb14d-8aa5-4841-9f91-dd632d372e18" containerName="util" Oct 09 07:58:31 crc kubenswrapper[4715]: I1009 07:58:31.480081 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0cfb14d-8aa5-4841-9f91-dd632d372e18" containerName="util" Oct 09 07:58:31 crc kubenswrapper[4715]: E1009 07:58:31.480102 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0cfb14d-8aa5-4841-9f91-dd632d372e18" containerName="pull" Oct 09 07:58:31 crc kubenswrapper[4715]: I1009 07:58:31.480110 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0cfb14d-8aa5-4841-9f91-dd632d372e18" containerName="pull" Oct 09 07:58:31 crc kubenswrapper[4715]: E1009 07:58:31.480121 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0cfb14d-8aa5-4841-9f91-dd632d372e18" containerName="extract" Oct 09 07:58:31 crc kubenswrapper[4715]: I1009 07:58:31.480129 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0cfb14d-8aa5-4841-9f91-dd632d372e18" containerName="extract" Oct 09 07:58:31 crc kubenswrapper[4715]: I1009 07:58:31.480251 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0cfb14d-8aa5-4841-9f91-dd632d372e18" 
containerName="extract" Oct 09 07:58:31 crc kubenswrapper[4715]: I1009 07:58:31.480262 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c1c9983-60a8-4db2-866c-15deb7220cb9" containerName="console" Oct 09 07:58:31 crc kubenswrapper[4715]: I1009 07:58:31.480787 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-798678874c-cjtmr" Oct 09 07:58:31 crc kubenswrapper[4715]: I1009 07:58:31.483321 4715 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-xmxzw" Oct 09 07:58:31 crc kubenswrapper[4715]: I1009 07:58:31.483356 4715 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 09 07:58:31 crc kubenswrapper[4715]: I1009 07:58:31.483364 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 09 07:58:31 crc kubenswrapper[4715]: I1009 07:58:31.483321 4715 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 09 07:58:31 crc kubenswrapper[4715]: I1009 07:58:31.484149 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 09 07:58:31 crc kubenswrapper[4715]: I1009 07:58:31.493087 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-798678874c-cjtmr"] Oct 09 07:58:31 crc kubenswrapper[4715]: I1009 07:58:31.545900 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6401cdb7-5b3a-4ae5-8944-fc923060aa09-apiservice-cert\") pod \"metallb-operator-controller-manager-798678874c-cjtmr\" (UID: \"6401cdb7-5b3a-4ae5-8944-fc923060aa09\") " pod="metallb-system/metallb-operator-controller-manager-798678874c-cjtmr" Oct 
09 07:58:31 crc kubenswrapper[4715]: I1009 07:58:31.545969 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdxmf\" (UniqueName: \"kubernetes.io/projected/6401cdb7-5b3a-4ae5-8944-fc923060aa09-kube-api-access-hdxmf\") pod \"metallb-operator-controller-manager-798678874c-cjtmr\" (UID: \"6401cdb7-5b3a-4ae5-8944-fc923060aa09\") " pod="metallb-system/metallb-operator-controller-manager-798678874c-cjtmr" Oct 09 07:58:31 crc kubenswrapper[4715]: I1009 07:58:31.546244 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6401cdb7-5b3a-4ae5-8944-fc923060aa09-webhook-cert\") pod \"metallb-operator-controller-manager-798678874c-cjtmr\" (UID: \"6401cdb7-5b3a-4ae5-8944-fc923060aa09\") " pod="metallb-system/metallb-operator-controller-manager-798678874c-cjtmr" Oct 09 07:58:31 crc kubenswrapper[4715]: I1009 07:58:31.647055 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6401cdb7-5b3a-4ae5-8944-fc923060aa09-webhook-cert\") pod \"metallb-operator-controller-manager-798678874c-cjtmr\" (UID: \"6401cdb7-5b3a-4ae5-8944-fc923060aa09\") " pod="metallb-system/metallb-operator-controller-manager-798678874c-cjtmr" Oct 09 07:58:31 crc kubenswrapper[4715]: I1009 07:58:31.647123 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6401cdb7-5b3a-4ae5-8944-fc923060aa09-apiservice-cert\") pod \"metallb-operator-controller-manager-798678874c-cjtmr\" (UID: \"6401cdb7-5b3a-4ae5-8944-fc923060aa09\") " pod="metallb-system/metallb-operator-controller-manager-798678874c-cjtmr" Oct 09 07:58:31 crc kubenswrapper[4715]: I1009 07:58:31.647158 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdxmf\" (UniqueName: 
\"kubernetes.io/projected/6401cdb7-5b3a-4ae5-8944-fc923060aa09-kube-api-access-hdxmf\") pod \"metallb-operator-controller-manager-798678874c-cjtmr\" (UID: \"6401cdb7-5b3a-4ae5-8944-fc923060aa09\") " pod="metallb-system/metallb-operator-controller-manager-798678874c-cjtmr" Oct 09 07:58:31 crc kubenswrapper[4715]: I1009 07:58:31.654602 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6401cdb7-5b3a-4ae5-8944-fc923060aa09-apiservice-cert\") pod \"metallb-operator-controller-manager-798678874c-cjtmr\" (UID: \"6401cdb7-5b3a-4ae5-8944-fc923060aa09\") " pod="metallb-system/metallb-operator-controller-manager-798678874c-cjtmr" Oct 09 07:58:31 crc kubenswrapper[4715]: I1009 07:58:31.664324 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6401cdb7-5b3a-4ae5-8944-fc923060aa09-webhook-cert\") pod \"metallb-operator-controller-manager-798678874c-cjtmr\" (UID: \"6401cdb7-5b3a-4ae5-8944-fc923060aa09\") " pod="metallb-system/metallb-operator-controller-manager-798678874c-cjtmr" Oct 09 07:58:31 crc kubenswrapper[4715]: I1009 07:58:31.670033 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdxmf\" (UniqueName: \"kubernetes.io/projected/6401cdb7-5b3a-4ae5-8944-fc923060aa09-kube-api-access-hdxmf\") pod \"metallb-operator-controller-manager-798678874c-cjtmr\" (UID: \"6401cdb7-5b3a-4ae5-8944-fc923060aa09\") " pod="metallb-system/metallb-operator-controller-manager-798678874c-cjtmr" Oct 09 07:58:31 crc kubenswrapper[4715]: I1009 07:58:31.798115 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-798678874c-cjtmr" Oct 09 07:58:31 crc kubenswrapper[4715]: I1009 07:58:31.820664 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-865475978c-t8h2f"] Oct 09 07:58:31 crc kubenswrapper[4715]: I1009 07:58:31.821540 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-865475978c-t8h2f" Oct 09 07:58:31 crc kubenswrapper[4715]: I1009 07:58:31.823270 4715 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 09 07:58:31 crc kubenswrapper[4715]: I1009 07:58:31.823535 4715 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-7f69k" Oct 09 07:58:31 crc kubenswrapper[4715]: I1009 07:58:31.823713 4715 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 09 07:58:31 crc kubenswrapper[4715]: I1009 07:58:31.834841 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-865475978c-t8h2f"] Oct 09 07:58:31 crc kubenswrapper[4715]: I1009 07:58:31.950781 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2b80db73-440c-49ef-8b24-187a67aab5db-apiservice-cert\") pod \"metallb-operator-webhook-server-865475978c-t8h2f\" (UID: \"2b80db73-440c-49ef-8b24-187a67aab5db\") " pod="metallb-system/metallb-operator-webhook-server-865475978c-t8h2f" Oct 09 07:58:31 crc kubenswrapper[4715]: I1009 07:58:31.951109 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2b80db73-440c-49ef-8b24-187a67aab5db-webhook-cert\") pod 
\"metallb-operator-webhook-server-865475978c-t8h2f\" (UID: \"2b80db73-440c-49ef-8b24-187a67aab5db\") " pod="metallb-system/metallb-operator-webhook-server-865475978c-t8h2f" Oct 09 07:58:31 crc kubenswrapper[4715]: I1009 07:58:31.951145 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-957h7\" (UniqueName: \"kubernetes.io/projected/2b80db73-440c-49ef-8b24-187a67aab5db-kube-api-access-957h7\") pod \"metallb-operator-webhook-server-865475978c-t8h2f\" (UID: \"2b80db73-440c-49ef-8b24-187a67aab5db\") " pod="metallb-system/metallb-operator-webhook-server-865475978c-t8h2f" Oct 09 07:58:32 crc kubenswrapper[4715]: I1009 07:58:32.042236 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-798678874c-cjtmr"] Oct 09 07:58:32 crc kubenswrapper[4715]: I1009 07:58:32.053962 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2b80db73-440c-49ef-8b24-187a67aab5db-apiservice-cert\") pod \"metallb-operator-webhook-server-865475978c-t8h2f\" (UID: \"2b80db73-440c-49ef-8b24-187a67aab5db\") " pod="metallb-system/metallb-operator-webhook-server-865475978c-t8h2f" Oct 09 07:58:32 crc kubenswrapper[4715]: I1009 07:58:32.054022 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2b80db73-440c-49ef-8b24-187a67aab5db-webhook-cert\") pod \"metallb-operator-webhook-server-865475978c-t8h2f\" (UID: \"2b80db73-440c-49ef-8b24-187a67aab5db\") " pod="metallb-system/metallb-operator-webhook-server-865475978c-t8h2f" Oct 09 07:58:32 crc kubenswrapper[4715]: I1009 07:58:32.054044 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-957h7\" (UniqueName: \"kubernetes.io/projected/2b80db73-440c-49ef-8b24-187a67aab5db-kube-api-access-957h7\") pod 
\"metallb-operator-webhook-server-865475978c-t8h2f\" (UID: \"2b80db73-440c-49ef-8b24-187a67aab5db\") " pod="metallb-system/metallb-operator-webhook-server-865475978c-t8h2f" Oct 09 07:58:32 crc kubenswrapper[4715]: I1009 07:58:32.059759 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2b80db73-440c-49ef-8b24-187a67aab5db-apiservice-cert\") pod \"metallb-operator-webhook-server-865475978c-t8h2f\" (UID: \"2b80db73-440c-49ef-8b24-187a67aab5db\") " pod="metallb-system/metallb-operator-webhook-server-865475978c-t8h2f" Oct 09 07:58:32 crc kubenswrapper[4715]: I1009 07:58:32.062343 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2b80db73-440c-49ef-8b24-187a67aab5db-webhook-cert\") pod \"metallb-operator-webhook-server-865475978c-t8h2f\" (UID: \"2b80db73-440c-49ef-8b24-187a67aab5db\") " pod="metallb-system/metallb-operator-webhook-server-865475978c-t8h2f" Oct 09 07:58:32 crc kubenswrapper[4715]: I1009 07:58:32.071796 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-957h7\" (UniqueName: \"kubernetes.io/projected/2b80db73-440c-49ef-8b24-187a67aab5db-kube-api-access-957h7\") pod \"metallb-operator-webhook-server-865475978c-t8h2f\" (UID: \"2b80db73-440c-49ef-8b24-187a67aab5db\") " pod="metallb-system/metallb-operator-webhook-server-865475978c-t8h2f" Oct 09 07:58:32 crc kubenswrapper[4715]: I1009 07:58:32.203906 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-865475978c-t8h2f" Oct 09 07:58:32 crc kubenswrapper[4715]: I1009 07:58:32.443825 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-865475978c-t8h2f"] Oct 09 07:58:32 crc kubenswrapper[4715]: W1009 07:58:32.457121 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b80db73_440c_49ef_8b24_187a67aab5db.slice/crio-8535c2ce864aa51004d801b7bce5b32835d568b05d494b63ba7a625b0f736f81 WatchSource:0}: Error finding container 8535c2ce864aa51004d801b7bce5b32835d568b05d494b63ba7a625b0f736f81: Status 404 returned error can't find the container with id 8535c2ce864aa51004d801b7bce5b32835d568b05d494b63ba7a625b0f736f81 Oct 09 07:58:32 crc kubenswrapper[4715]: I1009 07:58:32.592923 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-798678874c-cjtmr" event={"ID":"6401cdb7-5b3a-4ae5-8944-fc923060aa09","Type":"ContainerStarted","Data":"0e127eb7cecd3624bb401a6f9176f0c21c03941bb84c2c3a92ac626f3b274b97"} Oct 09 07:58:32 crc kubenswrapper[4715]: I1009 07:58:32.593984 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-865475978c-t8h2f" event={"ID":"2b80db73-440c-49ef-8b24-187a67aab5db","Type":"ContainerStarted","Data":"8535c2ce864aa51004d801b7bce5b32835d568b05d494b63ba7a625b0f736f81"} Oct 09 07:58:37 crc kubenswrapper[4715]: I1009 07:58:37.623745 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-798678874c-cjtmr" event={"ID":"6401cdb7-5b3a-4ae5-8944-fc923060aa09","Type":"ContainerStarted","Data":"121b21bd65a90ee4bc16080f18e0eb53910e37f4e67ba4ace4b5541a44d417fa"} Oct 09 07:58:37 crc kubenswrapper[4715]: I1009 07:58:37.624085 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-controller-manager-798678874c-cjtmr" Oct 09 07:58:37 crc kubenswrapper[4715]: I1009 07:58:37.625483 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-865475978c-t8h2f" event={"ID":"2b80db73-440c-49ef-8b24-187a67aab5db","Type":"ContainerStarted","Data":"764d50706f0b2c41da30fd42d85ff95eed3ac1c3d2ae2f24808b3b495c6d9bfe"} Oct 09 07:58:37 crc kubenswrapper[4715]: I1009 07:58:37.625621 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-865475978c-t8h2f" Oct 09 07:58:37 crc kubenswrapper[4715]: I1009 07:58:37.645189 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-798678874c-cjtmr" podStartSLOduration=1.585539449 podStartE2EDuration="6.645170143s" podCreationTimestamp="2025-10-09 07:58:31 +0000 UTC" firstStartedPulling="2025-10-09 07:58:32.050249468 +0000 UTC m=+742.743053476" lastFinishedPulling="2025-10-09 07:58:37.109880162 +0000 UTC m=+747.802684170" observedRunningTime="2025-10-09 07:58:37.642808684 +0000 UTC m=+748.335612692" watchObservedRunningTime="2025-10-09 07:58:37.645170143 +0000 UTC m=+748.337974151" Oct 09 07:58:37 crc kubenswrapper[4715]: I1009 07:58:37.672370 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-865475978c-t8h2f" podStartSLOduration=2.001019956 podStartE2EDuration="6.672348786s" podCreationTimestamp="2025-10-09 07:58:31 +0000 UTC" firstStartedPulling="2025-10-09 07:58:32.459574095 +0000 UTC m=+743.152378103" lastFinishedPulling="2025-10-09 07:58:37.130902935 +0000 UTC m=+747.823706933" observedRunningTime="2025-10-09 07:58:37.67043977 +0000 UTC m=+748.363243778" watchObservedRunningTime="2025-10-09 07:58:37.672348786 +0000 UTC m=+748.365152794" Oct 09 07:58:37 crc kubenswrapper[4715]: I1009 07:58:37.748510 4715 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xf4mc"] Oct 09 07:58:37 crc kubenswrapper[4715]: I1009 07:58:37.748804 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-xf4mc" podUID="d6e519e5-cb0b-40a4-a419-546ac0a3de69" containerName="controller-manager" containerID="cri-o://bdaf1865f422cea759fd20723d9fb3a4b66fe2fb762ac74bf0bce52e0263c0bd" gracePeriod=30 Oct 09 07:58:37 crc kubenswrapper[4715]: I1009 07:58:37.764521 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mn46"] Oct 09 07:58:37 crc kubenswrapper[4715]: I1009 07:58:37.764840 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mn46" podUID="f34e53e6-d25e-4619-8b73-8b9486c531eb" containerName="route-controller-manager" containerID="cri-o://5f0fb99106596a96946e38fafce84718e93b176979c4afc8587c1f0d95a34a55" gracePeriod=30 Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.306809 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mn46" Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.311100 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xf4mc" Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.443554 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f34e53e6-d25e-4619-8b73-8b9486c531eb-config\") pod \"f34e53e6-d25e-4619-8b73-8b9486c531eb\" (UID: \"f34e53e6-d25e-4619-8b73-8b9486c531eb\") " Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.443602 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6e519e5-cb0b-40a4-a419-546ac0a3de69-config\") pod \"d6e519e5-cb0b-40a4-a419-546ac0a3de69\" (UID: \"d6e519e5-cb0b-40a4-a419-546ac0a3de69\") " Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.443643 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsd9j\" (UniqueName: \"kubernetes.io/projected/d6e519e5-cb0b-40a4-a419-546ac0a3de69-kube-api-access-bsd9j\") pod \"d6e519e5-cb0b-40a4-a419-546ac0a3de69\" (UID: \"d6e519e5-cb0b-40a4-a419-546ac0a3de69\") " Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.443671 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6e519e5-cb0b-40a4-a419-546ac0a3de69-serving-cert\") pod \"d6e519e5-cb0b-40a4-a419-546ac0a3de69\" (UID: \"d6e519e5-cb0b-40a4-a419-546ac0a3de69\") " Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.443703 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6e519e5-cb0b-40a4-a419-546ac0a3de69-client-ca\") pod \"d6e519e5-cb0b-40a4-a419-546ac0a3de69\" (UID: \"d6e519e5-cb0b-40a4-a419-546ac0a3de69\") " Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.443722 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/f34e53e6-d25e-4619-8b73-8b9486c531eb-client-ca\") pod \"f34e53e6-d25e-4619-8b73-8b9486c531eb\" (UID: \"f34e53e6-d25e-4619-8b73-8b9486c531eb\") " Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.443751 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f34e53e6-d25e-4619-8b73-8b9486c531eb-serving-cert\") pod \"f34e53e6-d25e-4619-8b73-8b9486c531eb\" (UID: \"f34e53e6-d25e-4619-8b73-8b9486c531eb\") " Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.443772 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d6e519e5-cb0b-40a4-a419-546ac0a3de69-proxy-ca-bundles\") pod \"d6e519e5-cb0b-40a4-a419-546ac0a3de69\" (UID: \"d6e519e5-cb0b-40a4-a419-546ac0a3de69\") " Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.443793 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4n7q\" (UniqueName: \"kubernetes.io/projected/f34e53e6-d25e-4619-8b73-8b9486c531eb-kube-api-access-z4n7q\") pod \"f34e53e6-d25e-4619-8b73-8b9486c531eb\" (UID: \"f34e53e6-d25e-4619-8b73-8b9486c531eb\") " Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.444560 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f34e53e6-d25e-4619-8b73-8b9486c531eb-client-ca" (OuterVolumeSpecName: "client-ca") pod "f34e53e6-d25e-4619-8b73-8b9486c531eb" (UID: "f34e53e6-d25e-4619-8b73-8b9486c531eb"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.444625 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6e519e5-cb0b-40a4-a419-546ac0a3de69-client-ca" (OuterVolumeSpecName: "client-ca") pod "d6e519e5-cb0b-40a4-a419-546ac0a3de69" (UID: "d6e519e5-cb0b-40a4-a419-546ac0a3de69"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.444669 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6e519e5-cb0b-40a4-a419-546ac0a3de69-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d6e519e5-cb0b-40a4-a419-546ac0a3de69" (UID: "d6e519e5-cb0b-40a4-a419-546ac0a3de69"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.445071 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f34e53e6-d25e-4619-8b73-8b9486c531eb-config" (OuterVolumeSpecName: "config") pod "f34e53e6-d25e-4619-8b73-8b9486c531eb" (UID: "f34e53e6-d25e-4619-8b73-8b9486c531eb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.445378 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6e519e5-cb0b-40a4-a419-546ac0a3de69-config" (OuterVolumeSpecName: "config") pod "d6e519e5-cb0b-40a4-a419-546ac0a3de69" (UID: "d6e519e5-cb0b-40a4-a419-546ac0a3de69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.452644 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6e519e5-cb0b-40a4-a419-546ac0a3de69-kube-api-access-bsd9j" (OuterVolumeSpecName: "kube-api-access-bsd9j") pod "d6e519e5-cb0b-40a4-a419-546ac0a3de69" (UID: "d6e519e5-cb0b-40a4-a419-546ac0a3de69"). InnerVolumeSpecName "kube-api-access-bsd9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.452705 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f34e53e6-d25e-4619-8b73-8b9486c531eb-kube-api-access-z4n7q" (OuterVolumeSpecName: "kube-api-access-z4n7q") pod "f34e53e6-d25e-4619-8b73-8b9486c531eb" (UID: "f34e53e6-d25e-4619-8b73-8b9486c531eb"). InnerVolumeSpecName "kube-api-access-z4n7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.453829 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6e519e5-cb0b-40a4-a419-546ac0a3de69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d6e519e5-cb0b-40a4-a419-546ac0a3de69" (UID: "d6e519e5-cb0b-40a4-a419-546ac0a3de69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.455048 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f34e53e6-d25e-4619-8b73-8b9486c531eb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f34e53e6-d25e-4619-8b73-8b9486c531eb" (UID: "f34e53e6-d25e-4619-8b73-8b9486c531eb"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.545601 4715 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f34e53e6-d25e-4619-8b73-8b9486c531eb-config\") on node \"crc\" DevicePath \"\"" Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.545640 4715 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6e519e5-cb0b-40a4-a419-546ac0a3de69-config\") on node \"crc\" DevicePath \"\"" Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.545649 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsd9j\" (UniqueName: \"kubernetes.io/projected/d6e519e5-cb0b-40a4-a419-546ac0a3de69-kube-api-access-bsd9j\") on node \"crc\" DevicePath \"\"" Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.545662 4715 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6e519e5-cb0b-40a4-a419-546ac0a3de69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.545671 4715 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6e519e5-cb0b-40a4-a419-546ac0a3de69-client-ca\") on node \"crc\" DevicePath \"\"" Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.545679 4715 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f34e53e6-d25e-4619-8b73-8b9486c531eb-client-ca\") on node \"crc\" DevicePath \"\"" Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.545687 4715 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f34e53e6-d25e-4619-8b73-8b9486c531eb-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.545694 4715 reconciler_common.go:293] "Volume detached for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d6e519e5-cb0b-40a4-a419-546ac0a3de69-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.545704 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4n7q\" (UniqueName: \"kubernetes.io/projected/f34e53e6-d25e-4619-8b73-8b9486c531eb-kube-api-access-z4n7q\") on node \"crc\" DevicePath \"\"" Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.631944 4715 generic.go:334] "Generic (PLEG): container finished" podID="d6e519e5-cb0b-40a4-a419-546ac0a3de69" containerID="bdaf1865f422cea759fd20723d9fb3a4b66fe2fb762ac74bf0bce52e0263c0bd" exitCode=0 Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.632016 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xf4mc" Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.632034 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xf4mc" event={"ID":"d6e519e5-cb0b-40a4-a419-546ac0a3de69","Type":"ContainerDied","Data":"bdaf1865f422cea759fd20723d9fb3a4b66fe2fb762ac74bf0bce52e0263c0bd"} Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.632085 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xf4mc" event={"ID":"d6e519e5-cb0b-40a4-a419-546ac0a3de69","Type":"ContainerDied","Data":"0a6adfec2af2a3af63d4ff62fe19b60f2df022734841a166b3c60e5fbb3f2e78"} Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.632104 4715 scope.go:117] "RemoveContainer" containerID="bdaf1865f422cea759fd20723d9fb3a4b66fe2fb762ac74bf0bce52e0263c0bd" Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.634128 4715 generic.go:334] "Generic (PLEG): container finished" podID="f34e53e6-d25e-4619-8b73-8b9486c531eb" 
containerID="5f0fb99106596a96946e38fafce84718e93b176979c4afc8587c1f0d95a34a55" exitCode=0 Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.635083 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mn46" Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.637111 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mn46" event={"ID":"f34e53e6-d25e-4619-8b73-8b9486c531eb","Type":"ContainerDied","Data":"5f0fb99106596a96946e38fafce84718e93b176979c4afc8587c1f0d95a34a55"} Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.637174 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mn46" event={"ID":"f34e53e6-d25e-4619-8b73-8b9486c531eb","Type":"ContainerDied","Data":"aa6f34a7d3fa206154269554e3bc8d8ace43d9e7f459eab8e803ff51ea7defb6"} Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.658696 4715 scope.go:117] "RemoveContainer" containerID="bdaf1865f422cea759fd20723d9fb3a4b66fe2fb762ac74bf0bce52e0263c0bd" Oct 09 07:58:38 crc kubenswrapper[4715]: E1009 07:58:38.659145 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdaf1865f422cea759fd20723d9fb3a4b66fe2fb762ac74bf0bce52e0263c0bd\": container with ID starting with bdaf1865f422cea759fd20723d9fb3a4b66fe2fb762ac74bf0bce52e0263c0bd not found: ID does not exist" containerID="bdaf1865f422cea759fd20723d9fb3a4b66fe2fb762ac74bf0bce52e0263c0bd" Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.659186 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdaf1865f422cea759fd20723d9fb3a4b66fe2fb762ac74bf0bce52e0263c0bd"} err="failed to get container status \"bdaf1865f422cea759fd20723d9fb3a4b66fe2fb762ac74bf0bce52e0263c0bd\": 
rpc error: code = NotFound desc = could not find container \"bdaf1865f422cea759fd20723d9fb3a4b66fe2fb762ac74bf0bce52e0263c0bd\": container with ID starting with bdaf1865f422cea759fd20723d9fb3a4b66fe2fb762ac74bf0bce52e0263c0bd not found: ID does not exist" Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.659245 4715 scope.go:117] "RemoveContainer" containerID="5f0fb99106596a96946e38fafce84718e93b176979c4afc8587c1f0d95a34a55" Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.678174 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xf4mc"] Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.686283 4715 scope.go:117] "RemoveContainer" containerID="5f0fb99106596a96946e38fafce84718e93b176979c4afc8587c1f0d95a34a55" Oct 09 07:58:38 crc kubenswrapper[4715]: E1009 07:58:38.687007 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f0fb99106596a96946e38fafce84718e93b176979c4afc8587c1f0d95a34a55\": container with ID starting with 5f0fb99106596a96946e38fafce84718e93b176979c4afc8587c1f0d95a34a55 not found: ID does not exist" containerID="5f0fb99106596a96946e38fafce84718e93b176979c4afc8587c1f0d95a34a55" Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.687057 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f0fb99106596a96946e38fafce84718e93b176979c4afc8587c1f0d95a34a55"} err="failed to get container status \"5f0fb99106596a96946e38fafce84718e93b176979c4afc8587c1f0d95a34a55\": rpc error: code = NotFound desc = could not find container \"5f0fb99106596a96946e38fafce84718e93b176979c4afc8587c1f0d95a34a55\": container with ID starting with 5f0fb99106596a96946e38fafce84718e93b176979c4afc8587c1f0d95a34a55 not found: ID does not exist" Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.687665 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-controller-manager/controller-manager-879f6c89f-xf4mc"] Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.691294 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mn46"] Oct 09 07:58:38 crc kubenswrapper[4715]: I1009 07:58:38.695359 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8mn46"] Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.762983 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7db49f8b9c-82tvr"] Oct 09 07:58:39 crc kubenswrapper[4715]: E1009 07:58:39.763946 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6e519e5-cb0b-40a4-a419-546ac0a3de69" containerName="controller-manager" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.763971 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e519e5-cb0b-40a4-a419-546ac0a3de69" containerName="controller-manager" Oct 09 07:58:39 crc kubenswrapper[4715]: E1009 07:58:39.763993 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f34e53e6-d25e-4619-8b73-8b9486c531eb" containerName="route-controller-manager" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.764005 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="f34e53e6-d25e-4619-8b73-8b9486c531eb" containerName="route-controller-manager" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.764192 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="f34e53e6-d25e-4619-8b73-8b9486c531eb" containerName="route-controller-manager" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.764215 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6e519e5-cb0b-40a4-a419-546ac0a3de69" containerName="controller-manager" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.764906 4715 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7db49f8b9c-82tvr" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.768633 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.768706 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c9bff5cdf-9jd9b"] Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.769555 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c9bff5cdf-9jd9b" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.773207 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.773664 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.773664 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.773850 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.773873 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.774207 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.774610 4715 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"serving-cert" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.774840 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.775803 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.776045 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.776798 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.781439 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.786849 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7db49f8b9c-82tvr"] Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.790279 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c9bff5cdf-9jd9b"] Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.863189 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db981626-630f-4142-8b55-5c6df534cec6-config\") pod \"controller-manager-7db49f8b9c-82tvr\" (UID: \"db981626-630f-4142-8b55-5c6df534cec6\") " pod="openshift-controller-manager/controller-manager-7db49f8b9c-82tvr" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.863232 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/0150bac6-e57d-44f2-86de-a6a997c6d99f-config\") pod \"route-controller-manager-6c9bff5cdf-9jd9b\" (UID: \"0150bac6-e57d-44f2-86de-a6a997c6d99f\") " pod="openshift-route-controller-manager/route-controller-manager-6c9bff5cdf-9jd9b" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.863254 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/db981626-630f-4142-8b55-5c6df534cec6-client-ca\") pod \"controller-manager-7db49f8b9c-82tvr\" (UID: \"db981626-630f-4142-8b55-5c6df534cec6\") " pod="openshift-controller-manager/controller-manager-7db49f8b9c-82tvr" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.863273 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6kgw\" (UniqueName: \"kubernetes.io/projected/0150bac6-e57d-44f2-86de-a6a997c6d99f-kube-api-access-g6kgw\") pod \"route-controller-manager-6c9bff5cdf-9jd9b\" (UID: \"0150bac6-e57d-44f2-86de-a6a997c6d99f\") " pod="openshift-route-controller-manager/route-controller-manager-6c9bff5cdf-9jd9b" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.863295 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59r2f\" (UniqueName: \"kubernetes.io/projected/db981626-630f-4142-8b55-5c6df534cec6-kube-api-access-59r2f\") pod \"controller-manager-7db49f8b9c-82tvr\" (UID: \"db981626-630f-4142-8b55-5c6df534cec6\") " pod="openshift-controller-manager/controller-manager-7db49f8b9c-82tvr" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.863310 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/db981626-630f-4142-8b55-5c6df534cec6-proxy-ca-bundles\") pod \"controller-manager-7db49f8b9c-82tvr\" (UID: \"db981626-630f-4142-8b55-5c6df534cec6\") 
" pod="openshift-controller-manager/controller-manager-7db49f8b9c-82tvr" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.863328 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0150bac6-e57d-44f2-86de-a6a997c6d99f-serving-cert\") pod \"route-controller-manager-6c9bff5cdf-9jd9b\" (UID: \"0150bac6-e57d-44f2-86de-a6a997c6d99f\") " pod="openshift-route-controller-manager/route-controller-manager-6c9bff5cdf-9jd9b" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.863355 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db981626-630f-4142-8b55-5c6df534cec6-serving-cert\") pod \"controller-manager-7db49f8b9c-82tvr\" (UID: \"db981626-630f-4142-8b55-5c6df534cec6\") " pod="openshift-controller-manager/controller-manager-7db49f8b9c-82tvr" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.863394 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0150bac6-e57d-44f2-86de-a6a997c6d99f-client-ca\") pod \"route-controller-manager-6c9bff5cdf-9jd9b\" (UID: \"0150bac6-e57d-44f2-86de-a6a997c6d99f\") " pod="openshift-route-controller-manager/route-controller-manager-6c9bff5cdf-9jd9b" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.964432 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0150bac6-e57d-44f2-86de-a6a997c6d99f-client-ca\") pod \"route-controller-manager-6c9bff5cdf-9jd9b\" (UID: \"0150bac6-e57d-44f2-86de-a6a997c6d99f\") " pod="openshift-route-controller-manager/route-controller-manager-6c9bff5cdf-9jd9b" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.964497 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/db981626-630f-4142-8b55-5c6df534cec6-config\") pod \"controller-manager-7db49f8b9c-82tvr\" (UID: \"db981626-630f-4142-8b55-5c6df534cec6\") " pod="openshift-controller-manager/controller-manager-7db49f8b9c-82tvr" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.964522 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0150bac6-e57d-44f2-86de-a6a997c6d99f-config\") pod \"route-controller-manager-6c9bff5cdf-9jd9b\" (UID: \"0150bac6-e57d-44f2-86de-a6a997c6d99f\") " pod="openshift-route-controller-manager/route-controller-manager-6c9bff5cdf-9jd9b" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.964540 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/db981626-630f-4142-8b55-5c6df534cec6-client-ca\") pod \"controller-manager-7db49f8b9c-82tvr\" (UID: \"db981626-630f-4142-8b55-5c6df534cec6\") " pod="openshift-controller-manager/controller-manager-7db49f8b9c-82tvr" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.964564 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6kgw\" (UniqueName: \"kubernetes.io/projected/0150bac6-e57d-44f2-86de-a6a997c6d99f-kube-api-access-g6kgw\") pod \"route-controller-manager-6c9bff5cdf-9jd9b\" (UID: \"0150bac6-e57d-44f2-86de-a6a997c6d99f\") " pod="openshift-route-controller-manager/route-controller-manager-6c9bff5cdf-9jd9b" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.964588 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59r2f\" (UniqueName: \"kubernetes.io/projected/db981626-630f-4142-8b55-5c6df534cec6-kube-api-access-59r2f\") pod \"controller-manager-7db49f8b9c-82tvr\" (UID: \"db981626-630f-4142-8b55-5c6df534cec6\") " pod="openshift-controller-manager/controller-manager-7db49f8b9c-82tvr" Oct 09 07:58:39 crc 
kubenswrapper[4715]: I1009 07:58:39.964603 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/db981626-630f-4142-8b55-5c6df534cec6-proxy-ca-bundles\") pod \"controller-manager-7db49f8b9c-82tvr\" (UID: \"db981626-630f-4142-8b55-5c6df534cec6\") " pod="openshift-controller-manager/controller-manager-7db49f8b9c-82tvr" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.964619 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0150bac6-e57d-44f2-86de-a6a997c6d99f-serving-cert\") pod \"route-controller-manager-6c9bff5cdf-9jd9b\" (UID: \"0150bac6-e57d-44f2-86de-a6a997c6d99f\") " pod="openshift-route-controller-manager/route-controller-manager-6c9bff5cdf-9jd9b" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.964645 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db981626-630f-4142-8b55-5c6df534cec6-serving-cert\") pod \"controller-manager-7db49f8b9c-82tvr\" (UID: \"db981626-630f-4142-8b55-5c6df534cec6\") " pod="openshift-controller-manager/controller-manager-7db49f8b9c-82tvr" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.965958 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0150bac6-e57d-44f2-86de-a6a997c6d99f-client-ca\") pod \"route-controller-manager-6c9bff5cdf-9jd9b\" (UID: \"0150bac6-e57d-44f2-86de-a6a997c6d99f\") " pod="openshift-route-controller-manager/route-controller-manager-6c9bff5cdf-9jd9b" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.966114 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db981626-630f-4142-8b55-5c6df534cec6-config\") pod \"controller-manager-7db49f8b9c-82tvr\" (UID: 
\"db981626-630f-4142-8b55-5c6df534cec6\") " pod="openshift-controller-manager/controller-manager-7db49f8b9c-82tvr" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.966210 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/db981626-630f-4142-8b55-5c6df534cec6-proxy-ca-bundles\") pod \"controller-manager-7db49f8b9c-82tvr\" (UID: \"db981626-630f-4142-8b55-5c6df534cec6\") " pod="openshift-controller-manager/controller-manager-7db49f8b9c-82tvr" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.966726 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0150bac6-e57d-44f2-86de-a6a997c6d99f-config\") pod \"route-controller-manager-6c9bff5cdf-9jd9b\" (UID: \"0150bac6-e57d-44f2-86de-a6a997c6d99f\") " pod="openshift-route-controller-manager/route-controller-manager-6c9bff5cdf-9jd9b" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.966910 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/db981626-630f-4142-8b55-5c6df534cec6-client-ca\") pod \"controller-manager-7db49f8b9c-82tvr\" (UID: \"db981626-630f-4142-8b55-5c6df534cec6\") " pod="openshift-controller-manager/controller-manager-7db49f8b9c-82tvr" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.973339 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0150bac6-e57d-44f2-86de-a6a997c6d99f-serving-cert\") pod \"route-controller-manager-6c9bff5cdf-9jd9b\" (UID: \"0150bac6-e57d-44f2-86de-a6a997c6d99f\") " pod="openshift-route-controller-manager/route-controller-manager-6c9bff5cdf-9jd9b" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.987182 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/db981626-630f-4142-8b55-5c6df534cec6-serving-cert\") pod \"controller-manager-7db49f8b9c-82tvr\" (UID: \"db981626-630f-4142-8b55-5c6df534cec6\") " pod="openshift-controller-manager/controller-manager-7db49f8b9c-82tvr" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.992235 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6kgw\" (UniqueName: \"kubernetes.io/projected/0150bac6-e57d-44f2-86de-a6a997c6d99f-kube-api-access-g6kgw\") pod \"route-controller-manager-6c9bff5cdf-9jd9b\" (UID: \"0150bac6-e57d-44f2-86de-a6a997c6d99f\") " pod="openshift-route-controller-manager/route-controller-manager-6c9bff5cdf-9jd9b" Oct 09 07:58:39 crc kubenswrapper[4715]: I1009 07:58:39.993450 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59r2f\" (UniqueName: \"kubernetes.io/projected/db981626-630f-4142-8b55-5c6df534cec6-kube-api-access-59r2f\") pod \"controller-manager-7db49f8b9c-82tvr\" (UID: \"db981626-630f-4142-8b55-5c6df534cec6\") " pod="openshift-controller-manager/controller-manager-7db49f8b9c-82tvr" Oct 09 07:58:40 crc kubenswrapper[4715]: I1009 07:58:40.090153 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7db49f8b9c-82tvr" Oct 09 07:58:40 crc kubenswrapper[4715]: I1009 07:58:40.103874 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c9bff5cdf-9jd9b" Oct 09 07:58:40 crc kubenswrapper[4715]: I1009 07:58:40.147825 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6e519e5-cb0b-40a4-a419-546ac0a3de69" path="/var/lib/kubelet/pods/d6e519e5-cb0b-40a4-a419-546ac0a3de69/volumes" Oct 09 07:58:40 crc kubenswrapper[4715]: I1009 07:58:40.149072 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f34e53e6-d25e-4619-8b73-8b9486c531eb" path="/var/lib/kubelet/pods/f34e53e6-d25e-4619-8b73-8b9486c531eb/volumes" Oct 09 07:58:40 crc kubenswrapper[4715]: I1009 07:58:40.301335 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7db49f8b9c-82tvr"] Oct 09 07:58:40 crc kubenswrapper[4715]: W1009 07:58:40.311061 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb981626_630f_4142_8b55_5c6df534cec6.slice/crio-2100562504d198e539a706949a36e2f3fdc9e15b70c1e87f86ecf9db819190cc WatchSource:0}: Error finding container 2100562504d198e539a706949a36e2f3fdc9e15b70c1e87f86ecf9db819190cc: Status 404 returned error can't find the container with id 2100562504d198e539a706949a36e2f3fdc9e15b70c1e87f86ecf9db819190cc Oct 09 07:58:40 crc kubenswrapper[4715]: I1009 07:58:40.341809 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c9bff5cdf-9jd9b"] Oct 09 07:58:40 crc kubenswrapper[4715]: W1009 07:58:40.354166 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0150bac6_e57d_44f2_86de_a6a997c6d99f.slice/crio-2f62c115b6a48e31531c3136396afd03b949708de2a7edaa10d7698568d90229 WatchSource:0}: Error finding container 2f62c115b6a48e31531c3136396afd03b949708de2a7edaa10d7698568d90229: Status 404 returned error can't find the 
container with id 2f62c115b6a48e31531c3136396afd03b949708de2a7edaa10d7698568d90229 Oct 09 07:58:40 crc kubenswrapper[4715]: I1009 07:58:40.651579 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7db49f8b9c-82tvr" event={"ID":"db981626-630f-4142-8b55-5c6df534cec6","Type":"ContainerStarted","Data":"4ee9eb01f730e7bff1ed48b5516f03ed542f41af88b782d1ec4b180397ce9baa"} Oct 09 07:58:40 crc kubenswrapper[4715]: I1009 07:58:40.651915 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7db49f8b9c-82tvr" Oct 09 07:58:40 crc kubenswrapper[4715]: I1009 07:58:40.651935 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7db49f8b9c-82tvr" event={"ID":"db981626-630f-4142-8b55-5c6df534cec6","Type":"ContainerStarted","Data":"2100562504d198e539a706949a36e2f3fdc9e15b70c1e87f86ecf9db819190cc"} Oct 09 07:58:40 crc kubenswrapper[4715]: I1009 07:58:40.653177 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c9bff5cdf-9jd9b" event={"ID":"0150bac6-e57d-44f2-86de-a6a997c6d99f","Type":"ContainerStarted","Data":"8230b015972292a76718258b3a76276224c4bf816b94f98c516f49b2918a1c97"} Oct 09 07:58:40 crc kubenswrapper[4715]: I1009 07:58:40.653204 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c9bff5cdf-9jd9b" event={"ID":"0150bac6-e57d-44f2-86de-a6a997c6d99f","Type":"ContainerStarted","Data":"2f62c115b6a48e31531c3136396afd03b949708de2a7edaa10d7698568d90229"} Oct 09 07:58:40 crc kubenswrapper[4715]: I1009 07:58:40.653382 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6c9bff5cdf-9jd9b" Oct 09 07:58:40 crc kubenswrapper[4715]: I1009 07:58:40.678024 4715 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7db49f8b9c-82tvr" podStartSLOduration=3.678000949 podStartE2EDuration="3.678000949s" podCreationTimestamp="2025-10-09 07:58:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:58:40.675502376 +0000 UTC m=+751.368306404" watchObservedRunningTime="2025-10-09 07:58:40.678000949 +0000 UTC m=+751.370804957" Oct 09 07:58:40 crc kubenswrapper[4715]: I1009 07:58:40.691838 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7db49f8b9c-82tvr" Oct 09 07:58:40 crc kubenswrapper[4715]: I1009 07:58:40.717793 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6c9bff5cdf-9jd9b" podStartSLOduration=3.717768709 podStartE2EDuration="3.717768709s" podCreationTimestamp="2025-10-09 07:58:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:58:40.714849864 +0000 UTC m=+751.407653892" watchObservedRunningTime="2025-10-09 07:58:40.717768709 +0000 UTC m=+751.410572717" Oct 09 07:58:40 crc kubenswrapper[4715]: I1009 07:58:40.948208 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6c9bff5cdf-9jd9b" Oct 09 07:58:45 crc kubenswrapper[4715]: I1009 07:58:45.030653 4715 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 09 07:58:46 crc kubenswrapper[4715]: I1009 07:58:46.753551 4715 patch_prober.go:28] interesting pod/machine-config-daemon-k7vwx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 07:58:46 crc kubenswrapper[4715]: I1009 07:58:46.753644 4715 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 07:58:52 crc kubenswrapper[4715]: I1009 07:58:52.210083 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-865475978c-t8h2f" Oct 09 07:59:11 crc kubenswrapper[4715]: I1009 07:59:11.801057 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-798678874c-cjtmr" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.477009 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-vznzc"] Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.479716 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-vznzc" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.483215 4715 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.483856 4715 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-hq6n5" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.486220 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.493523 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-fs56x"] Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.494505 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-fs56x" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.498860 4715 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.510337 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-fs56x"] Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.563179 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-js45j"] Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.564354 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-js45j" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.568801 4715 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.568812 4715 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.568853 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.569239 4715 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-vwsg5" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.586441 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-plvwr"] Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.587524 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-plvwr" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.589469 4715 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.603757 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-plvwr"] Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.617019 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7-metrics-certs\") pod \"frr-k8s-vznzc\" (UID: \"4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7\") " pod="metallb-system/frr-k8s-vznzc" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.617082 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kfsl\" (UniqueName: \"kubernetes.io/projected/4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7-kube-api-access-6kfsl\") pod \"frr-k8s-vznzc\" (UID: \"4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7\") " pod="metallb-system/frr-k8s-vznzc" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.617134 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7-frr-sockets\") pod \"frr-k8s-vznzc\" (UID: \"4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7\") " pod="metallb-system/frr-k8s-vznzc" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.617161 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7-frr-startup\") pod \"frr-k8s-vznzc\" (UID: \"4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7\") " pod="metallb-system/frr-k8s-vznzc" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 
07:59:12.617183 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7-frr-conf\") pod \"frr-k8s-vznzc\" (UID: \"4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7\") " pod="metallb-system/frr-k8s-vznzc" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.617200 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7-reloader\") pod \"frr-k8s-vznzc\" (UID: \"4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7\") " pod="metallb-system/frr-k8s-vznzc" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.617231 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s9pv\" (UniqueName: \"kubernetes.io/projected/01a59281-8feb-446a-b861-fba9e4e8df7d-kube-api-access-8s9pv\") pod \"frr-k8s-webhook-server-64bf5d555-fs56x\" (UID: \"01a59281-8feb-446a-b861-fba9e4e8df7d\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-fs56x" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.617262 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7-metrics\") pod \"frr-k8s-vznzc\" (UID: \"4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7\") " pod="metallb-system/frr-k8s-vznzc" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.617289 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/01a59281-8feb-446a-b861-fba9e4e8df7d-cert\") pod \"frr-k8s-webhook-server-64bf5d555-fs56x\" (UID: \"01a59281-8feb-446a-b861-fba9e4e8df7d\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-fs56x" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.719181 4715 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8s9pv\" (UniqueName: \"kubernetes.io/projected/01a59281-8feb-446a-b861-fba9e4e8df7d-kube-api-access-8s9pv\") pod \"frr-k8s-webhook-server-64bf5d555-fs56x\" (UID: \"01a59281-8feb-446a-b861-fba9e4e8df7d\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-fs56x" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.719255 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5b391f0-e6c7-412a-8333-530a9ad5bab3-metrics-certs\") pod \"speaker-js45j\" (UID: \"a5b391f0-e6c7-412a-8333-530a9ad5bab3\") " pod="metallb-system/speaker-js45j" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.719281 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7-metrics\") pod \"frr-k8s-vznzc\" (UID: \"4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7\") " pod="metallb-system/frr-k8s-vznzc" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.719306 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/01a59281-8feb-446a-b861-fba9e4e8df7d-cert\") pod \"frr-k8s-webhook-server-64bf5d555-fs56x\" (UID: \"01a59281-8feb-446a-b861-fba9e4e8df7d\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-fs56x" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.719328 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/a5b391f0-e6c7-412a-8333-530a9ad5bab3-metallb-excludel2\") pod \"speaker-js45j\" (UID: \"a5b391f0-e6c7-412a-8333-530a9ad5bab3\") " pod="metallb-system/speaker-js45j" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.719367 4715 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7-metrics-certs\") pod \"frr-k8s-vznzc\" (UID: \"4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7\") " pod="metallb-system/frr-k8s-vznzc" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.719390 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc22841d-c047-4e47-a235-9025efe5d30e-cert\") pod \"controller-68d546b9d8-plvwr\" (UID: \"dc22841d-c047-4e47-a235-9025efe5d30e\") " pod="metallb-system/controller-68d546b9d8-plvwr" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.719434 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kfsl\" (UniqueName: \"kubernetes.io/projected/4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7-kube-api-access-6kfsl\") pod \"frr-k8s-vznzc\" (UID: \"4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7\") " pod="metallb-system/frr-k8s-vznzc" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.719461 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc22841d-c047-4e47-a235-9025efe5d30e-metrics-certs\") pod \"controller-68d546b9d8-plvwr\" (UID: \"dc22841d-c047-4e47-a235-9025efe5d30e\") " pod="metallb-system/controller-68d546b9d8-plvwr" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.719527 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4b9k\" (UniqueName: \"kubernetes.io/projected/dc22841d-c047-4e47-a235-9025efe5d30e-kube-api-access-x4b9k\") pod \"controller-68d546b9d8-plvwr\" (UID: \"dc22841d-c047-4e47-a235-9025efe5d30e\") " pod="metallb-system/controller-68d546b9d8-plvwr" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.719552 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-gkgq6\" (UniqueName: \"kubernetes.io/projected/a5b391f0-e6c7-412a-8333-530a9ad5bab3-kube-api-access-gkgq6\") pod \"speaker-js45j\" (UID: \"a5b391f0-e6c7-412a-8333-530a9ad5bab3\") " pod="metallb-system/speaker-js45j" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.719577 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7-frr-sockets\") pod \"frr-k8s-vznzc\" (UID: \"4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7\") " pod="metallb-system/frr-k8s-vznzc" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.719603 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7-frr-startup\") pod \"frr-k8s-vznzc\" (UID: \"4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7\") " pod="metallb-system/frr-k8s-vznzc" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.719619 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7-frr-conf\") pod \"frr-k8s-vznzc\" (UID: \"4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7\") " pod="metallb-system/frr-k8s-vznzc" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.719637 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7-reloader\") pod \"frr-k8s-vznzc\" (UID: \"4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7\") " pod="metallb-system/frr-k8s-vznzc" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.719663 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a5b391f0-e6c7-412a-8333-530a9ad5bab3-memberlist\") pod \"speaker-js45j\" (UID: \"a5b391f0-e6c7-412a-8333-530a9ad5bab3\") " 
pod="metallb-system/speaker-js45j" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.720528 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7-metrics\") pod \"frr-k8s-vznzc\" (UID: \"4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7\") " pod="metallb-system/frr-k8s-vznzc" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.721142 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7-frr-conf\") pod \"frr-k8s-vznzc\" (UID: \"4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7\") " pod="metallb-system/frr-k8s-vznzc" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.721179 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7-reloader\") pod \"frr-k8s-vznzc\" (UID: \"4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7\") " pod="metallb-system/frr-k8s-vznzc" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.721222 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7-frr-sockets\") pod \"frr-k8s-vznzc\" (UID: \"4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7\") " pod="metallb-system/frr-k8s-vznzc" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.721762 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7-frr-startup\") pod \"frr-k8s-vznzc\" (UID: \"4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7\") " pod="metallb-system/frr-k8s-vznzc" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.730348 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7-metrics-certs\") pod \"frr-k8s-vznzc\" (UID: \"4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7\") " pod="metallb-system/frr-k8s-vznzc" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.730394 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/01a59281-8feb-446a-b861-fba9e4e8df7d-cert\") pod \"frr-k8s-webhook-server-64bf5d555-fs56x\" (UID: \"01a59281-8feb-446a-b861-fba9e4e8df7d\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-fs56x" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.762226 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kfsl\" (UniqueName: \"kubernetes.io/projected/4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7-kube-api-access-6kfsl\") pod \"frr-k8s-vznzc\" (UID: \"4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7\") " pod="metallb-system/frr-k8s-vznzc" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.762458 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s9pv\" (UniqueName: \"kubernetes.io/projected/01a59281-8feb-446a-b861-fba9e4e8df7d-kube-api-access-8s9pv\") pod \"frr-k8s-webhook-server-64bf5d555-fs56x\" (UID: \"01a59281-8feb-446a-b861-fba9e4e8df7d\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-fs56x" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.797503 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-vznzc" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.819670 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-fs56x" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.820393 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4b9k\" (UniqueName: \"kubernetes.io/projected/dc22841d-c047-4e47-a235-9025efe5d30e-kube-api-access-x4b9k\") pod \"controller-68d546b9d8-plvwr\" (UID: \"dc22841d-c047-4e47-a235-9025efe5d30e\") " pod="metallb-system/controller-68d546b9d8-plvwr" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.820473 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkgq6\" (UniqueName: \"kubernetes.io/projected/a5b391f0-e6c7-412a-8333-530a9ad5bab3-kube-api-access-gkgq6\") pod \"speaker-js45j\" (UID: \"a5b391f0-e6c7-412a-8333-530a9ad5bab3\") " pod="metallb-system/speaker-js45j" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.820509 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a5b391f0-e6c7-412a-8333-530a9ad5bab3-memberlist\") pod \"speaker-js45j\" (UID: \"a5b391f0-e6c7-412a-8333-530a9ad5bab3\") " pod="metallb-system/speaker-js45j" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.820545 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5b391f0-e6c7-412a-8333-530a9ad5bab3-metrics-certs\") pod \"speaker-js45j\" (UID: \"a5b391f0-e6c7-412a-8333-530a9ad5bab3\") " pod="metallb-system/speaker-js45j" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.820575 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/a5b391f0-e6c7-412a-8333-530a9ad5bab3-metallb-excludel2\") pod \"speaker-js45j\" (UID: \"a5b391f0-e6c7-412a-8333-530a9ad5bab3\") " pod="metallb-system/speaker-js45j" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 
07:59:12.820609 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc22841d-c047-4e47-a235-9025efe5d30e-cert\") pod \"controller-68d546b9d8-plvwr\" (UID: \"dc22841d-c047-4e47-a235-9025efe5d30e\") " pod="metallb-system/controller-68d546b9d8-plvwr" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.820634 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc22841d-c047-4e47-a235-9025efe5d30e-metrics-certs\") pod \"controller-68d546b9d8-plvwr\" (UID: \"dc22841d-c047-4e47-a235-9025efe5d30e\") " pod="metallb-system/controller-68d546b9d8-plvwr" Oct 09 07:59:12 crc kubenswrapper[4715]: E1009 07:59:12.820667 4715 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 09 07:59:12 crc kubenswrapper[4715]: E1009 07:59:12.820735 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5b391f0-e6c7-412a-8333-530a9ad5bab3-memberlist podName:a5b391f0-e6c7-412a-8333-530a9ad5bab3 nodeName:}" failed. No retries permitted until 2025-10-09 07:59:13.320711978 +0000 UTC m=+784.013516046 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/a5b391f0-e6c7-412a-8333-530a9ad5bab3-memberlist") pod "speaker-js45j" (UID: "a5b391f0-e6c7-412a-8333-530a9ad5bab3") : secret "metallb-memberlist" not found Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.821405 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/a5b391f0-e6c7-412a-8333-530a9ad5bab3-metallb-excludel2\") pod \"speaker-js45j\" (UID: \"a5b391f0-e6c7-412a-8333-530a9ad5bab3\") " pod="metallb-system/speaker-js45j" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.826741 4715 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.826781 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc22841d-c047-4e47-a235-9025efe5d30e-metrics-certs\") pod \"controller-68d546b9d8-plvwr\" (UID: \"dc22841d-c047-4e47-a235-9025efe5d30e\") " pod="metallb-system/controller-68d546b9d8-plvwr" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.826959 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a5b391f0-e6c7-412a-8333-530a9ad5bab3-metrics-certs\") pod \"speaker-js45j\" (UID: \"a5b391f0-e6c7-412a-8333-530a9ad5bab3\") " pod="metallb-system/speaker-js45j" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.842394 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4b9k\" (UniqueName: \"kubernetes.io/projected/dc22841d-c047-4e47-a235-9025efe5d30e-kube-api-access-x4b9k\") pod \"controller-68d546b9d8-plvwr\" (UID: \"dc22841d-c047-4e47-a235-9025efe5d30e\") " pod="metallb-system/controller-68d546b9d8-plvwr" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.842771 4715 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc22841d-c047-4e47-a235-9025efe5d30e-cert\") pod \"controller-68d546b9d8-plvwr\" (UID: \"dc22841d-c047-4e47-a235-9025efe5d30e\") " pod="metallb-system/controller-68d546b9d8-plvwr" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.851073 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkgq6\" (UniqueName: \"kubernetes.io/projected/a5b391f0-e6c7-412a-8333-530a9ad5bab3-kube-api-access-gkgq6\") pod \"speaker-js45j\" (UID: \"a5b391f0-e6c7-412a-8333-530a9ad5bab3\") " pod="metallb-system/speaker-js45j" Oct 09 07:59:12 crc kubenswrapper[4715]: I1009 07:59:12.907286 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-plvwr" Oct 09 07:59:13 crc kubenswrapper[4715]: I1009 07:59:13.159893 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-fs56x"] Oct 09 07:59:13 crc kubenswrapper[4715]: W1009 07:59:13.166210 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01a59281_8feb_446a_b861_fba9e4e8df7d.slice/crio-38b7a2715c5217104ff28387bcedec058ee49256acf6297fcc6338e24eba46f5 WatchSource:0}: Error finding container 38b7a2715c5217104ff28387bcedec058ee49256acf6297fcc6338e24eba46f5: Status 404 returned error can't find the container with id 38b7a2715c5217104ff28387bcedec058ee49256acf6297fcc6338e24eba46f5 Oct 09 07:59:13 crc kubenswrapper[4715]: I1009 07:59:13.327101 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a5b391f0-e6c7-412a-8333-530a9ad5bab3-memberlist\") pod \"speaker-js45j\" (UID: \"a5b391f0-e6c7-412a-8333-530a9ad5bab3\") " pod="metallb-system/speaker-js45j" Oct 09 07:59:13 crc kubenswrapper[4715]: E1009 07:59:13.327254 4715 
secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 09 07:59:13 crc kubenswrapper[4715]: E1009 07:59:13.327307 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5b391f0-e6c7-412a-8333-530a9ad5bab3-memberlist podName:a5b391f0-e6c7-412a-8333-530a9ad5bab3 nodeName:}" failed. No retries permitted until 2025-10-09 07:59:14.327292473 +0000 UTC m=+785.020096481 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/a5b391f0-e6c7-412a-8333-530a9ad5bab3-memberlist") pod "speaker-js45j" (UID: "a5b391f0-e6c7-412a-8333-530a9ad5bab3") : secret "metallb-memberlist" not found Oct 09 07:59:13 crc kubenswrapper[4715]: I1009 07:59:13.391892 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-plvwr"] Oct 09 07:59:13 crc kubenswrapper[4715]: W1009 07:59:13.393920 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc22841d_c047_4e47_a235_9025efe5d30e.slice/crio-0ffbf23350b22367c87c39bfa103e53ef8a3241456bed0cfb1df19d97bbad769 WatchSource:0}: Error finding container 0ffbf23350b22367c87c39bfa103e53ef8a3241456bed0cfb1df19d97bbad769: Status 404 returned error can't find the container with id 0ffbf23350b22367c87c39bfa103e53ef8a3241456bed0cfb1df19d97bbad769 Oct 09 07:59:13 crc kubenswrapper[4715]: I1009 07:59:13.862288 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vznzc" event={"ID":"4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7","Type":"ContainerStarted","Data":"25fc643c93415c683ac78a63cadf85fe1f891a1d069ec6f3e6b43424041b5fe3"} Oct 09 07:59:13 crc kubenswrapper[4715]: I1009 07:59:13.865275 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-fs56x" 
event={"ID":"01a59281-8feb-446a-b861-fba9e4e8df7d","Type":"ContainerStarted","Data":"38b7a2715c5217104ff28387bcedec058ee49256acf6297fcc6338e24eba46f5"} Oct 09 07:59:13 crc kubenswrapper[4715]: I1009 07:59:13.866770 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-plvwr" event={"ID":"dc22841d-c047-4e47-a235-9025efe5d30e","Type":"ContainerStarted","Data":"badd21dfddcd197def0ba4e1f633f17a51df2195beb3dd10eea7694b9f795ed6"} Oct 09 07:59:13 crc kubenswrapper[4715]: I1009 07:59:13.866792 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-plvwr" event={"ID":"dc22841d-c047-4e47-a235-9025efe5d30e","Type":"ContainerStarted","Data":"71fbfc46d0479170a9d2f17cd7dd53a4ffdaf5720e27ece37d62fe960c260e1a"} Oct 09 07:59:13 crc kubenswrapper[4715]: I1009 07:59:13.866801 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-plvwr" event={"ID":"dc22841d-c047-4e47-a235-9025efe5d30e","Type":"ContainerStarted","Data":"0ffbf23350b22367c87c39bfa103e53ef8a3241456bed0cfb1df19d97bbad769"} Oct 09 07:59:13 crc kubenswrapper[4715]: I1009 07:59:13.867651 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-plvwr" Oct 09 07:59:13 crc kubenswrapper[4715]: I1009 07:59:13.885677 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-plvwr" podStartSLOduration=1.8856542969999999 podStartE2EDuration="1.885654297s" podCreationTimestamp="2025-10-09 07:59:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:59:13.881911348 +0000 UTC m=+784.574715366" watchObservedRunningTime="2025-10-09 07:59:13.885654297 +0000 UTC m=+784.578458305" Oct 09 07:59:14 crc kubenswrapper[4715]: I1009 07:59:14.342250 4715 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a5b391f0-e6c7-412a-8333-530a9ad5bab3-memberlist\") pod \"speaker-js45j\" (UID: \"a5b391f0-e6c7-412a-8333-530a9ad5bab3\") " pod="metallb-system/speaker-js45j" Oct 09 07:59:14 crc kubenswrapper[4715]: I1009 07:59:14.352320 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a5b391f0-e6c7-412a-8333-530a9ad5bab3-memberlist\") pod \"speaker-js45j\" (UID: \"a5b391f0-e6c7-412a-8333-530a9ad5bab3\") " pod="metallb-system/speaker-js45j" Oct 09 07:59:14 crc kubenswrapper[4715]: I1009 07:59:14.380571 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-js45j" Oct 09 07:59:14 crc kubenswrapper[4715]: W1009 07:59:14.419174 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5b391f0_e6c7_412a_8333_530a9ad5bab3.slice/crio-fc1538fb83b5a7fb73e0300f23f63b70231652349f4b95f9962e7aa67bffa663 WatchSource:0}: Error finding container fc1538fb83b5a7fb73e0300f23f63b70231652349f4b95f9962e7aa67bffa663: Status 404 returned error can't find the container with id fc1538fb83b5a7fb73e0300f23f63b70231652349f4b95f9962e7aa67bffa663 Oct 09 07:59:14 crc kubenswrapper[4715]: I1009 07:59:14.873768 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-js45j" event={"ID":"a5b391f0-e6c7-412a-8333-530a9ad5bab3","Type":"ContainerStarted","Data":"5dd6f2b93a7d0b7aa9cd77d6d36209d613b4609670e4523ab521528d6362eca3"} Oct 09 07:59:14 crc kubenswrapper[4715]: I1009 07:59:14.873811 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-js45j" event={"ID":"a5b391f0-e6c7-412a-8333-530a9ad5bab3","Type":"ContainerStarted","Data":"fc1538fb83b5a7fb73e0300f23f63b70231652349f4b95f9962e7aa67bffa663"} Oct 09 07:59:15 crc kubenswrapper[4715]: I1009 07:59:15.881332 4715 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="metallb-system/speaker-js45j" event={"ID":"a5b391f0-e6c7-412a-8333-530a9ad5bab3","Type":"ContainerStarted","Data":"715a5b7dc7e3b11efc72bfd504cea7d0a005fcc2f97118b1d69ae29c887e38b9"} Oct 09 07:59:15 crc kubenswrapper[4715]: I1009 07:59:15.910214 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-js45j" podStartSLOduration=3.910192285 podStartE2EDuration="3.910192285s" podCreationTimestamp="2025-10-09 07:59:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 07:59:15.905512349 +0000 UTC m=+786.598316367" watchObservedRunningTime="2025-10-09 07:59:15.910192285 +0000 UTC m=+786.602996293" Oct 09 07:59:16 crc kubenswrapper[4715]: I1009 07:59:16.753312 4715 patch_prober.go:28] interesting pod/machine-config-daemon-k7vwx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 07:59:16 crc kubenswrapper[4715]: I1009 07:59:16.753386 4715 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 07:59:16 crc kubenswrapper[4715]: I1009 07:59:16.889239 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-js45j" Oct 09 07:59:20 crc kubenswrapper[4715]: I1009 07:59:20.920896 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-fs56x" 
event={"ID":"01a59281-8feb-446a-b861-fba9e4e8df7d","Type":"ContainerStarted","Data":"e28e19b7ea879bd9b359a20a5cc989ed5b895e9f2637277530a9fb5d57514b7b"} Oct 09 07:59:20 crc kubenswrapper[4715]: I1009 07:59:20.921636 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-fs56x" Oct 09 07:59:20 crc kubenswrapper[4715]: I1009 07:59:20.922322 4715 generic.go:334] "Generic (PLEG): container finished" podID="4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7" containerID="6c2a8ad073b78d725773fe6cdf5bd54a3bae1f994fade2f84c4b59b1952db354" exitCode=0 Oct 09 07:59:20 crc kubenswrapper[4715]: I1009 07:59:20.922351 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vznzc" event={"ID":"4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7","Type":"ContainerDied","Data":"6c2a8ad073b78d725773fe6cdf5bd54a3bae1f994fade2f84c4b59b1952db354"} Oct 09 07:59:20 crc kubenswrapper[4715]: I1009 07:59:20.972010 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-fs56x" podStartSLOduration=1.878454638 podStartE2EDuration="8.971985808s" podCreationTimestamp="2025-10-09 07:59:12 +0000 UTC" firstStartedPulling="2025-10-09 07:59:13.168500895 +0000 UTC m=+783.861304903" lastFinishedPulling="2025-10-09 07:59:20.262032045 +0000 UTC m=+790.954836073" observedRunningTime="2025-10-09 07:59:20.942646803 +0000 UTC m=+791.635450831" watchObservedRunningTime="2025-10-09 07:59:20.971985808 +0000 UTC m=+791.664789826" Oct 09 07:59:21 crc kubenswrapper[4715]: I1009 07:59:21.931479 4715 generic.go:334] "Generic (PLEG): container finished" podID="4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7" containerID="f312a2348514fa56172f1b067836e84a0c708e87bcaa62981f7b12b5a932b6a8" exitCode=0 Oct 09 07:59:21 crc kubenswrapper[4715]: I1009 07:59:21.931539 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vznzc" 
event={"ID":"4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7","Type":"ContainerDied","Data":"f312a2348514fa56172f1b067836e84a0c708e87bcaa62981f7b12b5a932b6a8"} Oct 09 07:59:22 crc kubenswrapper[4715]: I1009 07:59:22.939769 4715 generic.go:334] "Generic (PLEG): container finished" podID="4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7" containerID="a3613f17d8c43fcd114ffbf1aee02c3b567cadf3164d44358428ead9859959b6" exitCode=0 Oct 09 07:59:22 crc kubenswrapper[4715]: I1009 07:59:22.939819 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vznzc" event={"ID":"4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7","Type":"ContainerDied","Data":"a3613f17d8c43fcd114ffbf1aee02c3b567cadf3164d44358428ead9859959b6"} Oct 09 07:59:23 crc kubenswrapper[4715]: I1009 07:59:23.954909 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vznzc" event={"ID":"4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7","Type":"ContainerStarted","Data":"ecbd7e62272aedf51c8b7b6a6690312a0abec45d4c686f83972bf629bc1ca638"} Oct 09 07:59:23 crc kubenswrapper[4715]: I1009 07:59:23.955261 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vznzc" event={"ID":"4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7","Type":"ContainerStarted","Data":"0cfea8495b89f81c7b245e5e0fb8c5dee9a4ff0a2820bdda0a36e5d882decbb6"} Oct 09 07:59:23 crc kubenswrapper[4715]: I1009 07:59:23.955275 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vznzc" event={"ID":"4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7","Type":"ContainerStarted","Data":"e34576c4039bad975b43320ec2da4c1af9eb322963de100e1e79811a16dbcf3d"} Oct 09 07:59:23 crc kubenswrapper[4715]: I1009 07:59:23.955288 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vznzc" event={"ID":"4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7","Type":"ContainerStarted","Data":"8fc6147f3c9306c85aea99e800242284f641f07e47777eb2774b5f0ddd8cd4fb"} Oct 09 07:59:23 crc kubenswrapper[4715]: I1009 07:59:23.955299 4715 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vznzc" event={"ID":"4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7","Type":"ContainerStarted","Data":"eb8c2910b1ff5e3b7f5a59c0c35a3a8ec96d9a67e361d9892f631eb7477bd4ed"} Oct 09 07:59:23 crc kubenswrapper[4715]: I1009 07:59:23.955309 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vznzc" event={"ID":"4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7","Type":"ContainerStarted","Data":"d940b73168d508f15fce4ce84ee5a4253f42243795e85c4784a7a977b9c41d4b"} Oct 09 07:59:23 crc kubenswrapper[4715]: I1009 07:59:23.955475 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-vznzc" Oct 09 07:59:23 crc kubenswrapper[4715]: I1009 07:59:23.981598 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-vznzc" podStartSLOduration=4.72763592 podStartE2EDuration="11.981572006s" podCreationTimestamp="2025-10-09 07:59:12 +0000 UTC" firstStartedPulling="2025-10-09 07:59:12.973703257 +0000 UTC m=+783.666507265" lastFinishedPulling="2025-10-09 07:59:20.227639343 +0000 UTC m=+790.920443351" observedRunningTime="2025-10-09 07:59:23.980249847 +0000 UTC m=+794.673053875" watchObservedRunningTime="2025-10-09 07:59:23.981572006 +0000 UTC m=+794.674376014" Oct 09 07:59:24 crc kubenswrapper[4715]: I1009 07:59:24.386512 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-js45j" Oct 09 07:59:27 crc kubenswrapper[4715]: I1009 07:59:27.265043 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-ml9ph"] Oct 09 07:59:27 crc kubenswrapper[4715]: I1009 07:59:27.266467 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-ml9ph" Oct 09 07:59:27 crc kubenswrapper[4715]: I1009 07:59:27.295434 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-5qnb6" Oct 09 07:59:27 crc kubenswrapper[4715]: I1009 07:59:27.295439 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 09 07:59:27 crc kubenswrapper[4715]: I1009 07:59:27.298750 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 09 07:59:27 crc kubenswrapper[4715]: I1009 07:59:27.307536 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-ml9ph"] Oct 09 07:59:27 crc kubenswrapper[4715]: I1009 07:59:27.333893 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2hrb\" (UniqueName: \"kubernetes.io/projected/5b6cf44b-4200-4917-b8d2-c75c30adf80a-kube-api-access-r2hrb\") pod \"openstack-operator-index-ml9ph\" (UID: \"5b6cf44b-4200-4917-b8d2-c75c30adf80a\") " pod="openstack-operators/openstack-operator-index-ml9ph" Oct 09 07:59:27 crc kubenswrapper[4715]: I1009 07:59:27.435403 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2hrb\" (UniqueName: \"kubernetes.io/projected/5b6cf44b-4200-4917-b8d2-c75c30adf80a-kube-api-access-r2hrb\") pod \"openstack-operator-index-ml9ph\" (UID: \"5b6cf44b-4200-4917-b8d2-c75c30adf80a\") " pod="openstack-operators/openstack-operator-index-ml9ph" Oct 09 07:59:27 crc kubenswrapper[4715]: I1009 07:59:27.453987 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2hrb\" (UniqueName: \"kubernetes.io/projected/5b6cf44b-4200-4917-b8d2-c75c30adf80a-kube-api-access-r2hrb\") pod \"openstack-operator-index-ml9ph\" (UID: 
\"5b6cf44b-4200-4917-b8d2-c75c30adf80a\") " pod="openstack-operators/openstack-operator-index-ml9ph" Oct 09 07:59:27 crc kubenswrapper[4715]: I1009 07:59:27.616059 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-ml9ph" Oct 09 07:59:27 crc kubenswrapper[4715]: I1009 07:59:27.799636 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-vznzc" Oct 09 07:59:27 crc kubenswrapper[4715]: I1009 07:59:27.845129 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-vznzc" Oct 09 07:59:28 crc kubenswrapper[4715]: I1009 07:59:28.041855 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-ml9ph"] Oct 09 07:59:28 crc kubenswrapper[4715]: I1009 07:59:28.988296 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ml9ph" event={"ID":"5b6cf44b-4200-4917-b8d2-c75c30adf80a","Type":"ContainerStarted","Data":"69c86fde3aae709dd46bdfa66f2068509ea5ae603f3475a8ef44b47ab060dfe4"} Oct 09 07:59:30 crc kubenswrapper[4715]: I1009 07:59:30.650470 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-ml9ph"] Oct 09 07:59:31 crc kubenswrapper[4715]: I1009 07:59:31.003170 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ml9ph" event={"ID":"5b6cf44b-4200-4917-b8d2-c75c30adf80a","Type":"ContainerStarted","Data":"21119aa4e50913572d9667705c28d3078b5369e5b079e3cbf25caf89dbd75f0f"} Oct 09 07:59:31 crc kubenswrapper[4715]: I1009 07:59:31.030771 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-ml9ph" podStartSLOduration=1.938092809 podStartE2EDuration="4.030743812s" podCreationTimestamp="2025-10-09 07:59:27 +0000 UTC" firstStartedPulling="2025-10-09 
07:59:28.050048486 +0000 UTC m=+798.742852494" lastFinishedPulling="2025-10-09 07:59:30.142699489 +0000 UTC m=+800.835503497" observedRunningTime="2025-10-09 07:59:31.022957425 +0000 UTC m=+801.715761473" watchObservedRunningTime="2025-10-09 07:59:31.030743812 +0000 UTC m=+801.723547830" Oct 09 07:59:31 crc kubenswrapper[4715]: I1009 07:59:31.252174 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-fgdkg"] Oct 09 07:59:31 crc kubenswrapper[4715]: I1009 07:59:31.253065 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-fgdkg" Oct 09 07:59:31 crc kubenswrapper[4715]: I1009 07:59:31.269184 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-fgdkg"] Oct 09 07:59:31 crc kubenswrapper[4715]: I1009 07:59:31.288164 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsmdf\" (UniqueName: \"kubernetes.io/projected/9809a400-b7e4-4700-bfb6-3500d2f61c96-kube-api-access-wsmdf\") pod \"openstack-operator-index-fgdkg\" (UID: \"9809a400-b7e4-4700-bfb6-3500d2f61c96\") " pod="openstack-operators/openstack-operator-index-fgdkg" Oct 09 07:59:31 crc kubenswrapper[4715]: I1009 07:59:31.389032 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsmdf\" (UniqueName: \"kubernetes.io/projected/9809a400-b7e4-4700-bfb6-3500d2f61c96-kube-api-access-wsmdf\") pod \"openstack-operator-index-fgdkg\" (UID: \"9809a400-b7e4-4700-bfb6-3500d2f61c96\") " pod="openstack-operators/openstack-operator-index-fgdkg" Oct 09 07:59:31 crc kubenswrapper[4715]: I1009 07:59:31.415813 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsmdf\" (UniqueName: \"kubernetes.io/projected/9809a400-b7e4-4700-bfb6-3500d2f61c96-kube-api-access-wsmdf\") pod \"openstack-operator-index-fgdkg\" (UID: 
\"9809a400-b7e4-4700-bfb6-3500d2f61c96\") " pod="openstack-operators/openstack-operator-index-fgdkg" Oct 09 07:59:31 crc kubenswrapper[4715]: I1009 07:59:31.575844 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-fgdkg" Oct 09 07:59:31 crc kubenswrapper[4715]: I1009 07:59:31.987491 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-fgdkg"] Oct 09 07:59:32 crc kubenswrapper[4715]: I1009 07:59:32.012046 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-fgdkg" event={"ID":"9809a400-b7e4-4700-bfb6-3500d2f61c96","Type":"ContainerStarted","Data":"a1bdd802b7ab2e6872c0a79f1ea7bd7a91074fd27dad50ae3b5900c97df86ffe"} Oct 09 07:59:32 crc kubenswrapper[4715]: I1009 07:59:32.012172 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-ml9ph" podUID="5b6cf44b-4200-4917-b8d2-c75c30adf80a" containerName="registry-server" containerID="cri-o://21119aa4e50913572d9667705c28d3078b5369e5b079e3cbf25caf89dbd75f0f" gracePeriod=2 Oct 09 07:59:32 crc kubenswrapper[4715]: I1009 07:59:32.373357 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-ml9ph" Oct 09 07:59:32 crc kubenswrapper[4715]: I1009 07:59:32.503710 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2hrb\" (UniqueName: \"kubernetes.io/projected/5b6cf44b-4200-4917-b8d2-c75c30adf80a-kube-api-access-r2hrb\") pod \"5b6cf44b-4200-4917-b8d2-c75c30adf80a\" (UID: \"5b6cf44b-4200-4917-b8d2-c75c30adf80a\") " Oct 09 07:59:32 crc kubenswrapper[4715]: I1009 07:59:32.508285 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b6cf44b-4200-4917-b8d2-c75c30adf80a-kube-api-access-r2hrb" (OuterVolumeSpecName: "kube-api-access-r2hrb") pod "5b6cf44b-4200-4917-b8d2-c75c30adf80a" (UID: "5b6cf44b-4200-4917-b8d2-c75c30adf80a"). InnerVolumeSpecName "kube-api-access-r2hrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:59:32 crc kubenswrapper[4715]: I1009 07:59:32.605681 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2hrb\" (UniqueName: \"kubernetes.io/projected/5b6cf44b-4200-4917-b8d2-c75c30adf80a-kube-api-access-r2hrb\") on node \"crc\" DevicePath \"\"" Oct 09 07:59:32 crc kubenswrapper[4715]: I1009 07:59:32.824013 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-fs56x" Oct 09 07:59:32 crc kubenswrapper[4715]: I1009 07:59:32.911264 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-plvwr" Oct 09 07:59:33 crc kubenswrapper[4715]: I1009 07:59:33.018535 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-fgdkg" event={"ID":"9809a400-b7e4-4700-bfb6-3500d2f61c96","Type":"ContainerStarted","Data":"de5af3a8fc063c36cc6e538d47b6466251404c76750e6b7f1173f8c67adc3483"} Oct 09 07:59:33 crc kubenswrapper[4715]: I1009 07:59:33.020888 4715 generic.go:334] "Generic 
(PLEG): container finished" podID="5b6cf44b-4200-4917-b8d2-c75c30adf80a" containerID="21119aa4e50913572d9667705c28d3078b5369e5b079e3cbf25caf89dbd75f0f" exitCode=0 Oct 09 07:59:33 crc kubenswrapper[4715]: I1009 07:59:33.020920 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ml9ph" event={"ID":"5b6cf44b-4200-4917-b8d2-c75c30adf80a","Type":"ContainerDied","Data":"21119aa4e50913572d9667705c28d3078b5369e5b079e3cbf25caf89dbd75f0f"} Oct 09 07:59:33 crc kubenswrapper[4715]: I1009 07:59:33.020951 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ml9ph" event={"ID":"5b6cf44b-4200-4917-b8d2-c75c30adf80a","Type":"ContainerDied","Data":"69c86fde3aae709dd46bdfa66f2068509ea5ae603f3475a8ef44b47ab060dfe4"} Oct 09 07:59:33 crc kubenswrapper[4715]: I1009 07:59:33.020968 4715 scope.go:117] "RemoveContainer" containerID="21119aa4e50913572d9667705c28d3078b5369e5b079e3cbf25caf89dbd75f0f" Oct 09 07:59:33 crc kubenswrapper[4715]: I1009 07:59:33.020901 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-ml9ph" Oct 09 07:59:33 crc kubenswrapper[4715]: I1009 07:59:33.032466 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-fgdkg" podStartSLOduration=1.979161311 podStartE2EDuration="2.032440594s" podCreationTimestamp="2025-10-09 07:59:31 +0000 UTC" firstStartedPulling="2025-10-09 07:59:32.00267412 +0000 UTC m=+802.695478128" lastFinishedPulling="2025-10-09 07:59:32.055953403 +0000 UTC m=+802.748757411" observedRunningTime="2025-10-09 07:59:33.031795335 +0000 UTC m=+803.724599363" watchObservedRunningTime="2025-10-09 07:59:33.032440594 +0000 UTC m=+803.725244602" Oct 09 07:59:33 crc kubenswrapper[4715]: I1009 07:59:33.042765 4715 scope.go:117] "RemoveContainer" containerID="21119aa4e50913572d9667705c28d3078b5369e5b079e3cbf25caf89dbd75f0f" Oct 09 07:59:33 crc kubenswrapper[4715]: E1009 07:59:33.043248 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21119aa4e50913572d9667705c28d3078b5369e5b079e3cbf25caf89dbd75f0f\": container with ID starting with 21119aa4e50913572d9667705c28d3078b5369e5b079e3cbf25caf89dbd75f0f not found: ID does not exist" containerID="21119aa4e50913572d9667705c28d3078b5369e5b079e3cbf25caf89dbd75f0f" Oct 09 07:59:33 crc kubenswrapper[4715]: I1009 07:59:33.043296 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21119aa4e50913572d9667705c28d3078b5369e5b079e3cbf25caf89dbd75f0f"} err="failed to get container status \"21119aa4e50913572d9667705c28d3078b5369e5b079e3cbf25caf89dbd75f0f\": rpc error: code = NotFound desc = could not find container \"21119aa4e50913572d9667705c28d3078b5369e5b079e3cbf25caf89dbd75f0f\": container with ID starting with 21119aa4e50913572d9667705c28d3078b5369e5b079e3cbf25caf89dbd75f0f not found: ID does not exist" Oct 09 07:59:33 crc kubenswrapper[4715]: I1009 
07:59:33.055017 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-ml9ph"] Oct 09 07:59:33 crc kubenswrapper[4715]: I1009 07:59:33.059265 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-ml9ph"] Oct 09 07:59:34 crc kubenswrapper[4715]: I1009 07:59:34.150555 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b6cf44b-4200-4917-b8d2-c75c30adf80a" path="/var/lib/kubelet/pods/5b6cf44b-4200-4917-b8d2-c75c30adf80a/volumes" Oct 09 07:59:35 crc kubenswrapper[4715]: I1009 07:59:35.654858 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zwpxk"] Oct 09 07:59:35 crc kubenswrapper[4715]: E1009 07:59:35.655121 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b6cf44b-4200-4917-b8d2-c75c30adf80a" containerName="registry-server" Oct 09 07:59:35 crc kubenswrapper[4715]: I1009 07:59:35.655133 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b6cf44b-4200-4917-b8d2-c75c30adf80a" containerName="registry-server" Oct 09 07:59:35 crc kubenswrapper[4715]: I1009 07:59:35.655242 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b6cf44b-4200-4917-b8d2-c75c30adf80a" containerName="registry-server" Oct 09 07:59:35 crc kubenswrapper[4715]: I1009 07:59:35.657002 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zwpxk" Oct 09 07:59:35 crc kubenswrapper[4715]: I1009 07:59:35.677091 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zwpxk"] Oct 09 07:59:35 crc kubenswrapper[4715]: I1009 07:59:35.745371 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9vp5\" (UniqueName: \"kubernetes.io/projected/89e46f04-087a-4d41-bfa2-478dc342b0cf-kube-api-access-z9vp5\") pod \"redhat-operators-zwpxk\" (UID: \"89e46f04-087a-4d41-bfa2-478dc342b0cf\") " pod="openshift-marketplace/redhat-operators-zwpxk" Oct 09 07:59:35 crc kubenswrapper[4715]: I1009 07:59:35.745440 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89e46f04-087a-4d41-bfa2-478dc342b0cf-catalog-content\") pod \"redhat-operators-zwpxk\" (UID: \"89e46f04-087a-4d41-bfa2-478dc342b0cf\") " pod="openshift-marketplace/redhat-operators-zwpxk" Oct 09 07:59:35 crc kubenswrapper[4715]: I1009 07:59:35.745481 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89e46f04-087a-4d41-bfa2-478dc342b0cf-utilities\") pod \"redhat-operators-zwpxk\" (UID: \"89e46f04-087a-4d41-bfa2-478dc342b0cf\") " pod="openshift-marketplace/redhat-operators-zwpxk" Oct 09 07:59:35 crc kubenswrapper[4715]: I1009 07:59:35.846481 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9vp5\" (UniqueName: \"kubernetes.io/projected/89e46f04-087a-4d41-bfa2-478dc342b0cf-kube-api-access-z9vp5\") pod \"redhat-operators-zwpxk\" (UID: \"89e46f04-087a-4d41-bfa2-478dc342b0cf\") " pod="openshift-marketplace/redhat-operators-zwpxk" Oct 09 07:59:35 crc kubenswrapper[4715]: I1009 07:59:35.846521 4715 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89e46f04-087a-4d41-bfa2-478dc342b0cf-catalog-content\") pod \"redhat-operators-zwpxk\" (UID: \"89e46f04-087a-4d41-bfa2-478dc342b0cf\") " pod="openshift-marketplace/redhat-operators-zwpxk" Oct 09 07:59:35 crc kubenswrapper[4715]: I1009 07:59:35.846566 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89e46f04-087a-4d41-bfa2-478dc342b0cf-utilities\") pod \"redhat-operators-zwpxk\" (UID: \"89e46f04-087a-4d41-bfa2-478dc342b0cf\") " pod="openshift-marketplace/redhat-operators-zwpxk" Oct 09 07:59:35 crc kubenswrapper[4715]: I1009 07:59:35.847074 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89e46f04-087a-4d41-bfa2-478dc342b0cf-utilities\") pod \"redhat-operators-zwpxk\" (UID: \"89e46f04-087a-4d41-bfa2-478dc342b0cf\") " pod="openshift-marketplace/redhat-operators-zwpxk" Oct 09 07:59:35 crc kubenswrapper[4715]: I1009 07:59:35.847215 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89e46f04-087a-4d41-bfa2-478dc342b0cf-catalog-content\") pod \"redhat-operators-zwpxk\" (UID: \"89e46f04-087a-4d41-bfa2-478dc342b0cf\") " pod="openshift-marketplace/redhat-operators-zwpxk" Oct 09 07:59:35 crc kubenswrapper[4715]: I1009 07:59:35.867898 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9vp5\" (UniqueName: \"kubernetes.io/projected/89e46f04-087a-4d41-bfa2-478dc342b0cf-kube-api-access-z9vp5\") pod \"redhat-operators-zwpxk\" (UID: \"89e46f04-087a-4d41-bfa2-478dc342b0cf\") " pod="openshift-marketplace/redhat-operators-zwpxk" Oct 09 07:59:35 crc kubenswrapper[4715]: I1009 07:59:35.987044 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zwpxk" Oct 09 07:59:36 crc kubenswrapper[4715]: I1009 07:59:36.472228 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zwpxk"] Oct 09 07:59:36 crc kubenswrapper[4715]: W1009 07:59:36.480673 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89e46f04_087a_4d41_bfa2_478dc342b0cf.slice/crio-6de9abbc7d4cf13241fc4ee955ff8d25165a0e2f251ec9c70a3f6ec3e1e495da WatchSource:0}: Error finding container 6de9abbc7d4cf13241fc4ee955ff8d25165a0e2f251ec9c70a3f6ec3e1e495da: Status 404 returned error can't find the container with id 6de9abbc7d4cf13241fc4ee955ff8d25165a0e2f251ec9c70a3f6ec3e1e495da Oct 09 07:59:37 crc kubenswrapper[4715]: I1009 07:59:37.057021 4715 generic.go:334] "Generic (PLEG): container finished" podID="89e46f04-087a-4d41-bfa2-478dc342b0cf" containerID="f6dd6dba7e3d77672dcac98d7d3ff54cb43c57d7ee2059441a42474e4912cfc1" exitCode=0 Oct 09 07:59:37 crc kubenswrapper[4715]: I1009 07:59:37.057107 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwpxk" event={"ID":"89e46f04-087a-4d41-bfa2-478dc342b0cf","Type":"ContainerDied","Data":"f6dd6dba7e3d77672dcac98d7d3ff54cb43c57d7ee2059441a42474e4912cfc1"} Oct 09 07:59:37 crc kubenswrapper[4715]: I1009 07:59:37.057373 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwpxk" event={"ID":"89e46f04-087a-4d41-bfa2-478dc342b0cf","Type":"ContainerStarted","Data":"6de9abbc7d4cf13241fc4ee955ff8d25165a0e2f251ec9c70a3f6ec3e1e495da"} Oct 09 07:59:38 crc kubenswrapper[4715]: I1009 07:59:38.069058 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwpxk" 
event={"ID":"89e46f04-087a-4d41-bfa2-478dc342b0cf","Type":"ContainerStarted","Data":"77476b33a91389b160966b928d269efd146fd484cfe871167a4f535eac45f3e2"} Oct 09 07:59:39 crc kubenswrapper[4715]: I1009 07:59:39.080953 4715 generic.go:334] "Generic (PLEG): container finished" podID="89e46f04-087a-4d41-bfa2-478dc342b0cf" containerID="77476b33a91389b160966b928d269efd146fd484cfe871167a4f535eac45f3e2" exitCode=0 Oct 09 07:59:39 crc kubenswrapper[4715]: I1009 07:59:39.081031 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwpxk" event={"ID":"89e46f04-087a-4d41-bfa2-478dc342b0cf","Type":"ContainerDied","Data":"77476b33a91389b160966b928d269efd146fd484cfe871167a4f535eac45f3e2"} Oct 09 07:59:40 crc kubenswrapper[4715]: I1009 07:59:40.092083 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwpxk" event={"ID":"89e46f04-087a-4d41-bfa2-478dc342b0cf","Type":"ContainerStarted","Data":"3c40c81f446bd561905e2d85a273be8b4840e645767e534a0d5763fa2ed59f61"} Oct 09 07:59:40 crc kubenswrapper[4715]: I1009 07:59:40.119651 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zwpxk" podStartSLOduration=2.266686356 podStartE2EDuration="5.119625988s" podCreationTimestamp="2025-10-09 07:59:35 +0000 UTC" firstStartedPulling="2025-10-09 07:59:37.059766295 +0000 UTC m=+807.752570313" lastFinishedPulling="2025-10-09 07:59:39.912705937 +0000 UTC m=+810.605509945" observedRunningTime="2025-10-09 07:59:40.115658652 +0000 UTC m=+810.808462660" watchObservedRunningTime="2025-10-09 07:59:40.119625988 +0000 UTC m=+810.812429996" Oct 09 07:59:41 crc kubenswrapper[4715]: I1009 07:59:41.576070 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-fgdkg" Oct 09 07:59:41 crc kubenswrapper[4715]: I1009 07:59:41.576168 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack-operators/openstack-operator-index-fgdkg" Oct 09 07:59:41 crc kubenswrapper[4715]: I1009 07:59:41.610108 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-fgdkg" Oct 09 07:59:42 crc kubenswrapper[4715]: I1009 07:59:42.145376 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-fgdkg" Oct 09 07:59:42 crc kubenswrapper[4715]: I1009 07:59:42.801135 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-vznzc" Oct 09 07:59:45 crc kubenswrapper[4715]: I1009 07:59:45.988192 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zwpxk" Oct 09 07:59:45 crc kubenswrapper[4715]: I1009 07:59:45.988506 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zwpxk" Oct 09 07:59:46 crc kubenswrapper[4715]: I1009 07:59:46.028218 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zwpxk" Oct 09 07:59:46 crc kubenswrapper[4715]: I1009 07:59:46.187744 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zwpxk" Oct 09 07:59:46 crc kubenswrapper[4715]: I1009 07:59:46.754075 4715 patch_prober.go:28] interesting pod/machine-config-daemon-k7vwx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 07:59:46 crc kubenswrapper[4715]: I1009 07:59:46.754174 4715 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 07:59:46 crc kubenswrapper[4715]: I1009 07:59:46.754303 4715 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" Oct 09 07:59:46 crc kubenswrapper[4715]: I1009 07:59:46.755150 4715 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"451f2195b7f62aab0dad39fca7efb143fc4db9dbd4af35f5a099bbb88635e621"} pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 07:59:46 crc kubenswrapper[4715]: I1009 07:59:46.755249 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" containerID="cri-o://451f2195b7f62aab0dad39fca7efb143fc4db9dbd4af35f5a099bbb88635e621" gracePeriod=600 Oct 09 07:59:46 crc kubenswrapper[4715]: I1009 07:59:46.859767 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-x6x5c"] Oct 09 07:59:46 crc kubenswrapper[4715]: I1009 07:59:46.862613 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x6x5c" Oct 09 07:59:46 crc kubenswrapper[4715]: I1009 07:59:46.873032 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x6x5c"] Oct 09 07:59:46 crc kubenswrapper[4715]: I1009 07:59:46.905230 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cj9t\" (UniqueName: \"kubernetes.io/projected/a3220b54-5aa6-4d89-8f75-8248f407be16-kube-api-access-7cj9t\") pod \"certified-operators-x6x5c\" (UID: \"a3220b54-5aa6-4d89-8f75-8248f407be16\") " pod="openshift-marketplace/certified-operators-x6x5c" Oct 09 07:59:46 crc kubenswrapper[4715]: I1009 07:59:46.905303 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3220b54-5aa6-4d89-8f75-8248f407be16-utilities\") pod \"certified-operators-x6x5c\" (UID: \"a3220b54-5aa6-4d89-8f75-8248f407be16\") " pod="openshift-marketplace/certified-operators-x6x5c" Oct 09 07:59:46 crc kubenswrapper[4715]: I1009 07:59:46.905562 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3220b54-5aa6-4d89-8f75-8248f407be16-catalog-content\") pod \"certified-operators-x6x5c\" (UID: \"a3220b54-5aa6-4d89-8f75-8248f407be16\") " pod="openshift-marketplace/certified-operators-x6x5c" Oct 09 07:59:47 crc kubenswrapper[4715]: I1009 07:59:47.007117 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3220b54-5aa6-4d89-8f75-8248f407be16-catalog-content\") pod \"certified-operators-x6x5c\" (UID: \"a3220b54-5aa6-4d89-8f75-8248f407be16\") " pod="openshift-marketplace/certified-operators-x6x5c" Oct 09 07:59:47 crc kubenswrapper[4715]: I1009 07:59:47.007236 4715 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7cj9t\" (UniqueName: \"kubernetes.io/projected/a3220b54-5aa6-4d89-8f75-8248f407be16-kube-api-access-7cj9t\") pod \"certified-operators-x6x5c\" (UID: \"a3220b54-5aa6-4d89-8f75-8248f407be16\") " pod="openshift-marketplace/certified-operators-x6x5c" Oct 09 07:59:47 crc kubenswrapper[4715]: I1009 07:59:47.007266 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3220b54-5aa6-4d89-8f75-8248f407be16-utilities\") pod \"certified-operators-x6x5c\" (UID: \"a3220b54-5aa6-4d89-8f75-8248f407be16\") " pod="openshift-marketplace/certified-operators-x6x5c" Oct 09 07:59:47 crc kubenswrapper[4715]: I1009 07:59:47.007706 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3220b54-5aa6-4d89-8f75-8248f407be16-catalog-content\") pod \"certified-operators-x6x5c\" (UID: \"a3220b54-5aa6-4d89-8f75-8248f407be16\") " pod="openshift-marketplace/certified-operators-x6x5c" Oct 09 07:59:47 crc kubenswrapper[4715]: I1009 07:59:47.007832 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3220b54-5aa6-4d89-8f75-8248f407be16-utilities\") pod \"certified-operators-x6x5c\" (UID: \"a3220b54-5aa6-4d89-8f75-8248f407be16\") " pod="openshift-marketplace/certified-operators-x6x5c" Oct 09 07:59:47 crc kubenswrapper[4715]: I1009 07:59:47.031913 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cj9t\" (UniqueName: \"kubernetes.io/projected/a3220b54-5aa6-4d89-8f75-8248f407be16-kube-api-access-7cj9t\") pod \"certified-operators-x6x5c\" (UID: \"a3220b54-5aa6-4d89-8f75-8248f407be16\") " pod="openshift-marketplace/certified-operators-x6x5c" Oct 09 07:59:47 crc kubenswrapper[4715]: I1009 07:59:47.138215 4715 generic.go:334] "Generic (PLEG): container finished" 
podID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerID="451f2195b7f62aab0dad39fca7efb143fc4db9dbd4af35f5a099bbb88635e621" exitCode=0 Oct 09 07:59:47 crc kubenswrapper[4715]: I1009 07:59:47.138276 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" event={"ID":"acafd807-8875-4b4f-aba9-4f807ca336e7","Type":"ContainerDied","Data":"451f2195b7f62aab0dad39fca7efb143fc4db9dbd4af35f5a099bbb88635e621"} Oct 09 07:59:47 crc kubenswrapper[4715]: I1009 07:59:47.138577 4715 scope.go:117] "RemoveContainer" containerID="a6cfe3d63903269fab164da1154df39ba0aa750858dad3414bb1690252e4ef7d" Oct 09 07:59:47 crc kubenswrapper[4715]: I1009 07:59:47.201300 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x6x5c" Oct 09 07:59:47 crc kubenswrapper[4715]: I1009 07:59:47.674075 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x6x5c"] Oct 09 07:59:47 crc kubenswrapper[4715]: W1009 07:59:47.680644 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3220b54_5aa6_4d89_8f75_8248f407be16.slice/crio-5884cfe3d3d77ea90a61ae8f394a475136cacbddc3ee95d6e53e045fb5b39b9f WatchSource:0}: Error finding container 5884cfe3d3d77ea90a61ae8f394a475136cacbddc3ee95d6e53e045fb5b39b9f: Status 404 returned error can't find the container with id 5884cfe3d3d77ea90a61ae8f394a475136cacbddc3ee95d6e53e045fb5b39b9f Oct 09 07:59:48 crc kubenswrapper[4715]: I1009 07:59:48.147072 4715 generic.go:334] "Generic (PLEG): container finished" podID="a3220b54-5aa6-4d89-8f75-8248f407be16" containerID="9ce4b108d66bf9c28b5d28e257fad40fd49afc5327d5dca508bc4870ae80d311" exitCode=0 Oct 09 07:59:48 crc kubenswrapper[4715]: I1009 07:59:48.147159 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6x5c" 
event={"ID":"a3220b54-5aa6-4d89-8f75-8248f407be16","Type":"ContainerDied","Data":"9ce4b108d66bf9c28b5d28e257fad40fd49afc5327d5dca508bc4870ae80d311"} Oct 09 07:59:48 crc kubenswrapper[4715]: I1009 07:59:48.147545 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6x5c" event={"ID":"a3220b54-5aa6-4d89-8f75-8248f407be16","Type":"ContainerStarted","Data":"5884cfe3d3d77ea90a61ae8f394a475136cacbddc3ee95d6e53e045fb5b39b9f"} Oct 09 07:59:48 crc kubenswrapper[4715]: I1009 07:59:48.151132 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" event={"ID":"acafd807-8875-4b4f-aba9-4f807ca336e7","Type":"ContainerStarted","Data":"d09013bd08005ad32fff769feb782bd4aaed730f81b53c0815c16cfe4fba1a84"} Oct 09 07:59:49 crc kubenswrapper[4715]: I1009 07:59:49.083202 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl"] Oct 09 07:59:49 crc kubenswrapper[4715]: I1009 07:59:49.084362 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl" Oct 09 07:59:49 crc kubenswrapper[4715]: I1009 07:59:49.086275 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-z9v5v" Oct 09 07:59:49 crc kubenswrapper[4715]: I1009 07:59:49.097547 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl"] Oct 09 07:59:49 crc kubenswrapper[4715]: I1009 07:59:49.145630 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ztbh\" (UniqueName: \"kubernetes.io/projected/3f089e5d-d22d-45bd-8525-ff337f7db321-kube-api-access-8ztbh\") pod \"4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl\" (UID: \"3f089e5d-d22d-45bd-8525-ff337f7db321\") " pod="openstack-operators/4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl" Oct 09 07:59:49 crc kubenswrapper[4715]: I1009 07:59:49.145699 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3f089e5d-d22d-45bd-8525-ff337f7db321-util\") pod \"4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl\" (UID: \"3f089e5d-d22d-45bd-8525-ff337f7db321\") " pod="openstack-operators/4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl" Oct 09 07:59:49 crc kubenswrapper[4715]: I1009 07:59:49.145724 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3f089e5d-d22d-45bd-8525-ff337f7db321-bundle\") pod \"4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl\" (UID: \"3f089e5d-d22d-45bd-8525-ff337f7db321\") " pod="openstack-operators/4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl" Oct 09 07:59:49 crc kubenswrapper[4715]: I1009 
07:59:49.247782 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ztbh\" (UniqueName: \"kubernetes.io/projected/3f089e5d-d22d-45bd-8525-ff337f7db321-kube-api-access-8ztbh\") pod \"4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl\" (UID: \"3f089e5d-d22d-45bd-8525-ff337f7db321\") " pod="openstack-operators/4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl" Oct 09 07:59:49 crc kubenswrapper[4715]: I1009 07:59:49.247989 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3f089e5d-d22d-45bd-8525-ff337f7db321-util\") pod \"4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl\" (UID: \"3f089e5d-d22d-45bd-8525-ff337f7db321\") " pod="openstack-operators/4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl" Oct 09 07:59:49 crc kubenswrapper[4715]: I1009 07:59:49.248057 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3f089e5d-d22d-45bd-8525-ff337f7db321-bundle\") pod \"4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl\" (UID: \"3f089e5d-d22d-45bd-8525-ff337f7db321\") " pod="openstack-operators/4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl" Oct 09 07:59:49 crc kubenswrapper[4715]: I1009 07:59:49.248721 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3f089e5d-d22d-45bd-8525-ff337f7db321-bundle\") pod \"4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl\" (UID: \"3f089e5d-d22d-45bd-8525-ff337f7db321\") " pod="openstack-operators/4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl" Oct 09 07:59:49 crc kubenswrapper[4715]: I1009 07:59:49.248834 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/3f089e5d-d22d-45bd-8525-ff337f7db321-util\") pod \"4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl\" (UID: \"3f089e5d-d22d-45bd-8525-ff337f7db321\") " pod="openstack-operators/4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl" Oct 09 07:59:49 crc kubenswrapper[4715]: I1009 07:59:49.267389 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ztbh\" (UniqueName: \"kubernetes.io/projected/3f089e5d-d22d-45bd-8525-ff337f7db321-kube-api-access-8ztbh\") pod \"4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl\" (UID: \"3f089e5d-d22d-45bd-8525-ff337f7db321\") " pod="openstack-operators/4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl" Oct 09 07:59:49 crc kubenswrapper[4715]: I1009 07:59:49.407774 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl" Oct 09 07:59:49 crc kubenswrapper[4715]: I1009 07:59:49.809911 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl"] Oct 09 07:59:49 crc kubenswrapper[4715]: W1009 07:59:49.818327 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f089e5d_d22d_45bd_8525_ff337f7db321.slice/crio-84c3d3cc63af82382d32eafc0df38c80bd5735cf79b92c373ac9af90b838d40c WatchSource:0}: Error finding container 84c3d3cc63af82382d32eafc0df38c80bd5735cf79b92c373ac9af90b838d40c: Status 404 returned error can't find the container with id 84c3d3cc63af82382d32eafc0df38c80bd5735cf79b92c373ac9af90b838d40c Oct 09 07:59:49 crc kubenswrapper[4715]: I1009 07:59:49.843586 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zwpxk"] Oct 09 07:59:49 crc kubenswrapper[4715]: I1009 07:59:49.843852 4715 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zwpxk" podUID="89e46f04-087a-4d41-bfa2-478dc342b0cf" containerName="registry-server" containerID="cri-o://3c40c81f446bd561905e2d85a273be8b4840e645767e534a0d5763fa2ed59f61" gracePeriod=2 Oct 09 07:59:49 crc kubenswrapper[4715]: E1009 07:59:49.979442 4715 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89e46f04_087a_4d41_bfa2_478dc342b0cf.slice/crio-conmon-3c40c81f446bd561905e2d85a273be8b4840e645767e534a0d5763fa2ed59f61.scope\": RecentStats: unable to find data in memory cache]" Oct 09 07:59:50 crc kubenswrapper[4715]: I1009 07:59:50.169143 4715 generic.go:334] "Generic (PLEG): container finished" podID="89e46f04-087a-4d41-bfa2-478dc342b0cf" containerID="3c40c81f446bd561905e2d85a273be8b4840e645767e534a0d5763fa2ed59f61" exitCode=0 Oct 09 07:59:50 crc kubenswrapper[4715]: I1009 07:59:50.169489 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwpxk" event={"ID":"89e46f04-087a-4d41-bfa2-478dc342b0cf","Type":"ContainerDied","Data":"3c40c81f446bd561905e2d85a273be8b4840e645767e534a0d5763fa2ed59f61"} Oct 09 07:59:50 crc kubenswrapper[4715]: I1009 07:59:50.171282 4715 generic.go:334] "Generic (PLEG): container finished" podID="a3220b54-5aa6-4d89-8f75-8248f407be16" containerID="484ce8e0e74f9c8ba90c7fade0def88815dcd5a4e9d68ba002e4ed8a9d1f3aec" exitCode=0 Oct 09 07:59:50 crc kubenswrapper[4715]: I1009 07:59:50.171368 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6x5c" event={"ID":"a3220b54-5aa6-4d89-8f75-8248f407be16","Type":"ContainerDied","Data":"484ce8e0e74f9c8ba90c7fade0def88815dcd5a4e9d68ba002e4ed8a9d1f3aec"} Oct 09 07:59:50 crc kubenswrapper[4715]: I1009 07:59:50.173637 4715 generic.go:334] "Generic (PLEG): container finished" 
podID="3f089e5d-d22d-45bd-8525-ff337f7db321" containerID="b11416d9b21202bb2e18ea0c043a32e15943a6e44a932e479d4e167330b6e307" exitCode=0 Oct 09 07:59:50 crc kubenswrapper[4715]: I1009 07:59:50.173664 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl" event={"ID":"3f089e5d-d22d-45bd-8525-ff337f7db321","Type":"ContainerDied","Data":"b11416d9b21202bb2e18ea0c043a32e15943a6e44a932e479d4e167330b6e307"} Oct 09 07:59:50 crc kubenswrapper[4715]: I1009 07:59:50.173682 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl" event={"ID":"3f089e5d-d22d-45bd-8525-ff337f7db321","Type":"ContainerStarted","Data":"84c3d3cc63af82382d32eafc0df38c80bd5735cf79b92c373ac9af90b838d40c"} Oct 09 07:59:50 crc kubenswrapper[4715]: I1009 07:59:50.371094 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zwpxk" Oct 09 07:59:50 crc kubenswrapper[4715]: I1009 07:59:50.466939 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89e46f04-087a-4d41-bfa2-478dc342b0cf-catalog-content\") pod \"89e46f04-087a-4d41-bfa2-478dc342b0cf\" (UID: \"89e46f04-087a-4d41-bfa2-478dc342b0cf\") " Oct 09 07:59:50 crc kubenswrapper[4715]: I1009 07:59:50.467402 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9vp5\" (UniqueName: \"kubernetes.io/projected/89e46f04-087a-4d41-bfa2-478dc342b0cf-kube-api-access-z9vp5\") pod \"89e46f04-087a-4d41-bfa2-478dc342b0cf\" (UID: \"89e46f04-087a-4d41-bfa2-478dc342b0cf\") " Oct 09 07:59:50 crc kubenswrapper[4715]: I1009 07:59:50.467557 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/89e46f04-087a-4d41-bfa2-478dc342b0cf-utilities\") pod \"89e46f04-087a-4d41-bfa2-478dc342b0cf\" (UID: \"89e46f04-087a-4d41-bfa2-478dc342b0cf\") " Oct 09 07:59:50 crc kubenswrapper[4715]: I1009 07:59:50.468709 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89e46f04-087a-4d41-bfa2-478dc342b0cf-utilities" (OuterVolumeSpecName: "utilities") pod "89e46f04-087a-4d41-bfa2-478dc342b0cf" (UID: "89e46f04-087a-4d41-bfa2-478dc342b0cf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 07:59:50 crc kubenswrapper[4715]: I1009 07:59:50.474725 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89e46f04-087a-4d41-bfa2-478dc342b0cf-kube-api-access-z9vp5" (OuterVolumeSpecName: "kube-api-access-z9vp5") pod "89e46f04-087a-4d41-bfa2-478dc342b0cf" (UID: "89e46f04-087a-4d41-bfa2-478dc342b0cf"). InnerVolumeSpecName "kube-api-access-z9vp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 07:59:50 crc kubenswrapper[4715]: I1009 07:59:50.548695 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89e46f04-087a-4d41-bfa2-478dc342b0cf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "89e46f04-087a-4d41-bfa2-478dc342b0cf" (UID: "89e46f04-087a-4d41-bfa2-478dc342b0cf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 07:59:50 crc kubenswrapper[4715]: I1009 07:59:50.569759 4715 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89e46f04-087a-4d41-bfa2-478dc342b0cf-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 07:59:50 crc kubenswrapper[4715]: I1009 07:59:50.569796 4715 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89e46f04-087a-4d41-bfa2-478dc342b0cf-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 07:59:50 crc kubenswrapper[4715]: I1009 07:59:50.569812 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9vp5\" (UniqueName: \"kubernetes.io/projected/89e46f04-087a-4d41-bfa2-478dc342b0cf-kube-api-access-z9vp5\") on node \"crc\" DevicePath \"\"" Oct 09 07:59:51 crc kubenswrapper[4715]: I1009 07:59:51.186926 4715 generic.go:334] "Generic (PLEG): container finished" podID="3f089e5d-d22d-45bd-8525-ff337f7db321" containerID="849c0ef7bf0399cd8e7ce4e6780c9dc8cbf1193feab98398aa66202841b1be79" exitCode=0 Oct 09 07:59:51 crc kubenswrapper[4715]: I1009 07:59:51.187034 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl" event={"ID":"3f089e5d-d22d-45bd-8525-ff337f7db321","Type":"ContainerDied","Data":"849c0ef7bf0399cd8e7ce4e6780c9dc8cbf1193feab98398aa66202841b1be79"} Oct 09 07:59:51 crc kubenswrapper[4715]: I1009 07:59:51.189946 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwpxk" event={"ID":"89e46f04-087a-4d41-bfa2-478dc342b0cf","Type":"ContainerDied","Data":"6de9abbc7d4cf13241fc4ee955ff8d25165a0e2f251ec9c70a3f6ec3e1e495da"} Oct 09 07:59:51 crc kubenswrapper[4715]: I1009 07:59:51.189996 4715 scope.go:117] "RemoveContainer" containerID="3c40c81f446bd561905e2d85a273be8b4840e645767e534a0d5763fa2ed59f61" Oct 09 07:59:51 crc 
kubenswrapper[4715]: I1009 07:59:51.190012 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zwpxk" Oct 09 07:59:51 crc kubenswrapper[4715]: I1009 07:59:51.194510 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6x5c" event={"ID":"a3220b54-5aa6-4d89-8f75-8248f407be16","Type":"ContainerStarted","Data":"0b0f1a6705778ca5b91ae3d341a19fd76b2eda2c825654ac7ee120d896c4b22b"} Oct 09 07:59:51 crc kubenswrapper[4715]: I1009 07:59:51.226709 4715 scope.go:117] "RemoveContainer" containerID="77476b33a91389b160966b928d269efd146fd484cfe871167a4f535eac45f3e2" Oct 09 07:59:51 crc kubenswrapper[4715]: I1009 07:59:51.228445 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zwpxk"] Oct 09 07:59:51 crc kubenswrapper[4715]: I1009 07:59:51.231809 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zwpxk"] Oct 09 07:59:51 crc kubenswrapper[4715]: I1009 07:59:51.238179 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-x6x5c" podStartSLOduration=2.741578485 podStartE2EDuration="5.238153298s" podCreationTimestamp="2025-10-09 07:59:46 +0000 UTC" firstStartedPulling="2025-10-09 07:59:48.149159459 +0000 UTC m=+818.841963477" lastFinishedPulling="2025-10-09 07:59:50.645734282 +0000 UTC m=+821.338538290" observedRunningTime="2025-10-09 07:59:51.235962654 +0000 UTC m=+821.928766672" watchObservedRunningTime="2025-10-09 07:59:51.238153298 +0000 UTC m=+821.930957316" Oct 09 07:59:51 crc kubenswrapper[4715]: I1009 07:59:51.256752 4715 scope.go:117] "RemoveContainer" containerID="f6dd6dba7e3d77672dcac98d7d3ff54cb43c57d7ee2059441a42474e4912cfc1" Oct 09 07:59:52 crc kubenswrapper[4715]: I1009 07:59:52.142494 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="89e46f04-087a-4d41-bfa2-478dc342b0cf" path="/var/lib/kubelet/pods/89e46f04-087a-4d41-bfa2-478dc342b0cf/volumes"
Oct 09 07:59:52 crc kubenswrapper[4715]: I1009 07:59:52.201167 4715 generic.go:334] "Generic (PLEG): container finished" podID="3f089e5d-d22d-45bd-8525-ff337f7db321" containerID="d105fc851ad1e9926650a45f4bcfa5d173db25d530e22283e672f90b142d2057" exitCode=0
Oct 09 07:59:52 crc kubenswrapper[4715]: I1009 07:59:52.201247 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl" event={"ID":"3f089e5d-d22d-45bd-8525-ff337f7db321","Type":"ContainerDied","Data":"d105fc851ad1e9926650a45f4bcfa5d173db25d530e22283e672f90b142d2057"}
Oct 09 07:59:53 crc kubenswrapper[4715]: I1009 07:59:53.500740 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl"
Oct 09 07:59:53 crc kubenswrapper[4715]: I1009 07:59:53.655497 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3f089e5d-d22d-45bd-8525-ff337f7db321-util\") pod \"3f089e5d-d22d-45bd-8525-ff337f7db321\" (UID: \"3f089e5d-d22d-45bd-8525-ff337f7db321\") "
Oct 09 07:59:53 crc kubenswrapper[4715]: I1009 07:59:53.655554 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3f089e5d-d22d-45bd-8525-ff337f7db321-bundle\") pod \"3f089e5d-d22d-45bd-8525-ff337f7db321\" (UID: \"3f089e5d-d22d-45bd-8525-ff337f7db321\") "
Oct 09 07:59:53 crc kubenswrapper[4715]: I1009 07:59:53.655693 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ztbh\" (UniqueName: \"kubernetes.io/projected/3f089e5d-d22d-45bd-8525-ff337f7db321-kube-api-access-8ztbh\") pod \"3f089e5d-d22d-45bd-8525-ff337f7db321\" (UID: \"3f089e5d-d22d-45bd-8525-ff337f7db321\") "
Oct 09 07:59:53 crc kubenswrapper[4715]: I1009 07:59:53.656573 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f089e5d-d22d-45bd-8525-ff337f7db321-bundle" (OuterVolumeSpecName: "bundle") pod "3f089e5d-d22d-45bd-8525-ff337f7db321" (UID: "3f089e5d-d22d-45bd-8525-ff337f7db321"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 09 07:59:53 crc kubenswrapper[4715]: I1009 07:59:53.664593 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f089e5d-d22d-45bd-8525-ff337f7db321-kube-api-access-8ztbh" (OuterVolumeSpecName: "kube-api-access-8ztbh") pod "3f089e5d-d22d-45bd-8525-ff337f7db321" (UID: "3f089e5d-d22d-45bd-8525-ff337f7db321"). InnerVolumeSpecName "kube-api-access-8ztbh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 07:59:53 crc kubenswrapper[4715]: I1009 07:59:53.669741 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f089e5d-d22d-45bd-8525-ff337f7db321-util" (OuterVolumeSpecName: "util") pod "3f089e5d-d22d-45bd-8525-ff337f7db321" (UID: "3f089e5d-d22d-45bd-8525-ff337f7db321"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 09 07:59:53 crc kubenswrapper[4715]: I1009 07:59:53.757357 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ztbh\" (UniqueName: \"kubernetes.io/projected/3f089e5d-d22d-45bd-8525-ff337f7db321-kube-api-access-8ztbh\") on node \"crc\" DevicePath \"\""
Oct 09 07:59:53 crc kubenswrapper[4715]: I1009 07:59:53.757398 4715 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3f089e5d-d22d-45bd-8525-ff337f7db321-util\") on node \"crc\" DevicePath \"\""
Oct 09 07:59:53 crc kubenswrapper[4715]: I1009 07:59:53.757409 4715 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3f089e5d-d22d-45bd-8525-ff337f7db321-bundle\") on node \"crc\" DevicePath \"\""
Oct 09 07:59:54 crc kubenswrapper[4715]: I1009 07:59:54.219552 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl" event={"ID":"3f089e5d-d22d-45bd-8525-ff337f7db321","Type":"ContainerDied","Data":"84c3d3cc63af82382d32eafc0df38c80bd5735cf79b92c373ac9af90b838d40c"}
Oct 09 07:59:54 crc kubenswrapper[4715]: I1009 07:59:54.219599 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84c3d3cc63af82382d32eafc0df38c80bd5735cf79b92c373ac9af90b838d40c"
Oct 09 07:59:54 crc kubenswrapper[4715]: I1009 07:59:54.219662 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl"
Oct 09 07:59:57 crc kubenswrapper[4715]: I1009 07:59:57.202486 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-x6x5c"
Oct 09 07:59:57 crc kubenswrapper[4715]: I1009 07:59:57.202827 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-x6x5c"
Oct 09 07:59:57 crc kubenswrapper[4715]: I1009 07:59:57.251936 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-x6x5c"
Oct 09 07:59:57 crc kubenswrapper[4715]: I1009 07:59:57.308012 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-x6x5c"
Oct 09 07:59:59 crc kubenswrapper[4715]: I1009 07:59:59.232798 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-75c7986888-vmsr4"]
Oct 09 07:59:59 crc kubenswrapper[4715]: E1009 07:59:59.233504 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89e46f04-087a-4d41-bfa2-478dc342b0cf" containerName="extract-utilities"
Oct 09 07:59:59 crc kubenswrapper[4715]: I1009 07:59:59.233523 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="89e46f04-087a-4d41-bfa2-478dc342b0cf" containerName="extract-utilities"
Oct 09 07:59:59 crc kubenswrapper[4715]: E1009 07:59:59.233538 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89e46f04-087a-4d41-bfa2-478dc342b0cf" containerName="registry-server"
Oct 09 07:59:59 crc kubenswrapper[4715]: I1009 07:59:59.233549 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="89e46f04-087a-4d41-bfa2-478dc342b0cf" containerName="registry-server"
Oct 09 07:59:59 crc kubenswrapper[4715]: E1009 07:59:59.233561 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f089e5d-d22d-45bd-8525-ff337f7db321" containerName="pull"
Oct 09 07:59:59 crc kubenswrapper[4715]: I1009 07:59:59.233572 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f089e5d-d22d-45bd-8525-ff337f7db321" containerName="pull"
Oct 09 07:59:59 crc kubenswrapper[4715]: E1009 07:59:59.233592 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89e46f04-087a-4d41-bfa2-478dc342b0cf" containerName="extract-content"
Oct 09 07:59:59 crc kubenswrapper[4715]: I1009 07:59:59.233599 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="89e46f04-087a-4d41-bfa2-478dc342b0cf" containerName="extract-content"
Oct 09 07:59:59 crc kubenswrapper[4715]: E1009 07:59:59.233617 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f089e5d-d22d-45bd-8525-ff337f7db321" containerName="extract"
Oct 09 07:59:59 crc kubenswrapper[4715]: I1009 07:59:59.233625 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f089e5d-d22d-45bd-8525-ff337f7db321" containerName="extract"
Oct 09 07:59:59 crc kubenswrapper[4715]: E1009 07:59:59.233635 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f089e5d-d22d-45bd-8525-ff337f7db321" containerName="util"
Oct 09 07:59:59 crc kubenswrapper[4715]: I1009 07:59:59.233642 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f089e5d-d22d-45bd-8525-ff337f7db321" containerName="util"
Oct 09 07:59:59 crc kubenswrapper[4715]: I1009 07:59:59.233783 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f089e5d-d22d-45bd-8525-ff337f7db321" containerName="extract"
Oct 09 07:59:59 crc kubenswrapper[4715]: I1009 07:59:59.233806 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="89e46f04-087a-4d41-bfa2-478dc342b0cf" containerName="registry-server"
Oct 09 07:59:59 crc kubenswrapper[4715]: I1009 07:59:59.234667 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-75c7986888-vmsr4"
Oct 09 07:59:59 crc kubenswrapper[4715]: I1009 07:59:59.236345 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-x9g57"
Oct 09 07:59:59 crc kubenswrapper[4715]: I1009 07:59:59.306337 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-75c7986888-vmsr4"]
Oct 09 07:59:59 crc kubenswrapper[4715]: I1009 07:59:59.360031 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngq7h\" (UniqueName: \"kubernetes.io/projected/2f70ba87-a4dd-4a97-a005-f63fec497e9f-kube-api-access-ngq7h\") pod \"openstack-operator-controller-operator-75c7986888-vmsr4\" (UID: \"2f70ba87-a4dd-4a97-a005-f63fec497e9f\") " pod="openstack-operators/openstack-operator-controller-operator-75c7986888-vmsr4"
Oct 09 07:59:59 crc kubenswrapper[4715]: I1009 07:59:59.461726 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngq7h\" (UniqueName: \"kubernetes.io/projected/2f70ba87-a4dd-4a97-a005-f63fec497e9f-kube-api-access-ngq7h\") pod \"openstack-operator-controller-operator-75c7986888-vmsr4\" (UID: \"2f70ba87-a4dd-4a97-a005-f63fec497e9f\") " pod="openstack-operators/openstack-operator-controller-operator-75c7986888-vmsr4"
Oct 09 07:59:59 crc kubenswrapper[4715]: I1009 07:59:59.490108 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngq7h\" (UniqueName: \"kubernetes.io/projected/2f70ba87-a4dd-4a97-a005-f63fec497e9f-kube-api-access-ngq7h\") pod \"openstack-operator-controller-operator-75c7986888-vmsr4\" (UID: \"2f70ba87-a4dd-4a97-a005-f63fec497e9f\") " pod="openstack-operators/openstack-operator-controller-operator-75c7986888-vmsr4"
Oct 09 07:59:59 crc kubenswrapper[4715]: I1009 07:59:59.556918 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-75c7986888-vmsr4"
Oct 09 07:59:59 crc kubenswrapper[4715]: I1009 07:59:59.645321 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x6x5c"]
Oct 09 07:59:59 crc kubenswrapper[4715]: I1009 07:59:59.645810 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-x6x5c" podUID="a3220b54-5aa6-4d89-8f75-8248f407be16" containerName="registry-server" containerID="cri-o://0b0f1a6705778ca5b91ae3d341a19fd76b2eda2c825654ac7ee120d896c4b22b" gracePeriod=2
Oct 09 08:00:00 crc kubenswrapper[4715]: I1009 08:00:00.069739 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-75c7986888-vmsr4"]
Oct 09 08:00:00 crc kubenswrapper[4715]: I1009 08:00:00.163999 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333280-zqp6l"]
Oct 09 08:00:00 crc kubenswrapper[4715]: I1009 08:00:00.164791 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333280-zqp6l"
Oct 09 08:00:00 crc kubenswrapper[4715]: I1009 08:00:00.167088 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 09 08:00:00 crc kubenswrapper[4715]: I1009 08:00:00.177326 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 09 08:00:00 crc kubenswrapper[4715]: I1009 08:00:00.183877 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333280-zqp6l"]
Oct 09 08:00:00 crc kubenswrapper[4715]: I1009 08:00:00.209141 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4d35a0e3-8f99-408e-9f03-eb29d705d730-secret-volume\") pod \"collect-profiles-29333280-zqp6l\" (UID: \"4d35a0e3-8f99-408e-9f03-eb29d705d730\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333280-zqp6l"
Oct 09 08:00:00 crc kubenswrapper[4715]: I1009 08:00:00.209249 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htlb5\" (UniqueName: \"kubernetes.io/projected/4d35a0e3-8f99-408e-9f03-eb29d705d730-kube-api-access-htlb5\") pod \"collect-profiles-29333280-zqp6l\" (UID: \"4d35a0e3-8f99-408e-9f03-eb29d705d730\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333280-zqp6l"
Oct 09 08:00:00 crc kubenswrapper[4715]: I1009 08:00:00.209276 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4d35a0e3-8f99-408e-9f03-eb29d705d730-config-volume\") pod \"collect-profiles-29333280-zqp6l\" (UID: \"4d35a0e3-8f99-408e-9f03-eb29d705d730\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333280-zqp6l"
Oct 09 08:00:00 crc kubenswrapper[4715]: I1009 08:00:00.269190 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-75c7986888-vmsr4" event={"ID":"2f70ba87-a4dd-4a97-a005-f63fec497e9f","Type":"ContainerStarted","Data":"fa217ac30720af23b61246e4ce34f99a29d88acf9395a37dfe8455d66c86dd1e"}
Oct 09 08:00:00 crc kubenswrapper[4715]: I1009 08:00:00.272683 4715 generic.go:334] "Generic (PLEG): container finished" podID="a3220b54-5aa6-4d89-8f75-8248f407be16" containerID="0b0f1a6705778ca5b91ae3d341a19fd76b2eda2c825654ac7ee120d896c4b22b" exitCode=0
Oct 09 08:00:00 crc kubenswrapper[4715]: I1009 08:00:00.272724 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6x5c" event={"ID":"a3220b54-5aa6-4d89-8f75-8248f407be16","Type":"ContainerDied","Data":"0b0f1a6705778ca5b91ae3d341a19fd76b2eda2c825654ac7ee120d896c4b22b"}
Oct 09 08:00:00 crc kubenswrapper[4715]: I1009 08:00:00.310189 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htlb5\" (UniqueName: \"kubernetes.io/projected/4d35a0e3-8f99-408e-9f03-eb29d705d730-kube-api-access-htlb5\") pod \"collect-profiles-29333280-zqp6l\" (UID: \"4d35a0e3-8f99-408e-9f03-eb29d705d730\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333280-zqp6l"
Oct 09 08:00:00 crc kubenswrapper[4715]: I1009 08:00:00.310257 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4d35a0e3-8f99-408e-9f03-eb29d705d730-config-volume\") pod \"collect-profiles-29333280-zqp6l\" (UID: \"4d35a0e3-8f99-408e-9f03-eb29d705d730\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333280-zqp6l"
Oct 09 08:00:00 crc kubenswrapper[4715]: I1009 08:00:00.310363 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4d35a0e3-8f99-408e-9f03-eb29d705d730-secret-volume\") pod \"collect-profiles-29333280-zqp6l\" (UID: \"4d35a0e3-8f99-408e-9f03-eb29d705d730\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333280-zqp6l"
Oct 09 08:00:00 crc kubenswrapper[4715]: I1009 08:00:00.312519 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4d35a0e3-8f99-408e-9f03-eb29d705d730-config-volume\") pod \"collect-profiles-29333280-zqp6l\" (UID: \"4d35a0e3-8f99-408e-9f03-eb29d705d730\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333280-zqp6l"
Oct 09 08:00:00 crc kubenswrapper[4715]: I1009 08:00:00.323053 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4d35a0e3-8f99-408e-9f03-eb29d705d730-secret-volume\") pod \"collect-profiles-29333280-zqp6l\" (UID: \"4d35a0e3-8f99-408e-9f03-eb29d705d730\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333280-zqp6l"
Oct 09 08:00:00 crc kubenswrapper[4715]: I1009 08:00:00.326395 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htlb5\" (UniqueName: \"kubernetes.io/projected/4d35a0e3-8f99-408e-9f03-eb29d705d730-kube-api-access-htlb5\") pod \"collect-profiles-29333280-zqp6l\" (UID: \"4d35a0e3-8f99-408e-9f03-eb29d705d730\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333280-zqp6l"
Oct 09 08:00:00 crc kubenswrapper[4715]: I1009 08:00:00.493707 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333280-zqp6l"
Oct 09 08:00:00 crc kubenswrapper[4715]: I1009 08:00:00.723999 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x6x5c"
Oct 09 08:00:00 crc kubenswrapper[4715]: I1009 08:00:00.817990 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cj9t\" (UniqueName: \"kubernetes.io/projected/a3220b54-5aa6-4d89-8f75-8248f407be16-kube-api-access-7cj9t\") pod \"a3220b54-5aa6-4d89-8f75-8248f407be16\" (UID: \"a3220b54-5aa6-4d89-8f75-8248f407be16\") "
Oct 09 08:00:00 crc kubenswrapper[4715]: I1009 08:00:00.818144 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3220b54-5aa6-4d89-8f75-8248f407be16-catalog-content\") pod \"a3220b54-5aa6-4d89-8f75-8248f407be16\" (UID: \"a3220b54-5aa6-4d89-8f75-8248f407be16\") "
Oct 09 08:00:00 crc kubenswrapper[4715]: I1009 08:00:00.818254 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3220b54-5aa6-4d89-8f75-8248f407be16-utilities\") pod \"a3220b54-5aa6-4d89-8f75-8248f407be16\" (UID: \"a3220b54-5aa6-4d89-8f75-8248f407be16\") "
Oct 09 08:00:00 crc kubenswrapper[4715]: I1009 08:00:00.819115 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3220b54-5aa6-4d89-8f75-8248f407be16-utilities" (OuterVolumeSpecName: "utilities") pod "a3220b54-5aa6-4d89-8f75-8248f407be16" (UID: "a3220b54-5aa6-4d89-8f75-8248f407be16"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 09 08:00:00 crc kubenswrapper[4715]: I1009 08:00:00.823151 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3220b54-5aa6-4d89-8f75-8248f407be16-kube-api-access-7cj9t" (OuterVolumeSpecName: "kube-api-access-7cj9t") pod "a3220b54-5aa6-4d89-8f75-8248f407be16" (UID: "a3220b54-5aa6-4d89-8f75-8248f407be16"). InnerVolumeSpecName "kube-api-access-7cj9t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 08:00:00 crc kubenswrapper[4715]: I1009 08:00:00.872002 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3220b54-5aa6-4d89-8f75-8248f407be16-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a3220b54-5aa6-4d89-8f75-8248f407be16" (UID: "a3220b54-5aa6-4d89-8f75-8248f407be16"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 09 08:00:00 crc kubenswrapper[4715]: I1009 08:00:00.920501 4715 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3220b54-5aa6-4d89-8f75-8248f407be16-utilities\") on node \"crc\" DevicePath \"\""
Oct 09 08:00:00 crc kubenswrapper[4715]: I1009 08:00:00.920539 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cj9t\" (UniqueName: \"kubernetes.io/projected/a3220b54-5aa6-4d89-8f75-8248f407be16-kube-api-access-7cj9t\") on node \"crc\" DevicePath \"\""
Oct 09 08:00:00 crc kubenswrapper[4715]: I1009 08:00:00.920557 4715 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3220b54-5aa6-4d89-8f75-8248f407be16-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 09 08:00:00 crc kubenswrapper[4715]: I1009 08:00:00.955178 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333280-zqp6l"]
Oct 09 08:00:00 crc kubenswrapper[4715]: W1009 08:00:00.962984 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d35a0e3_8f99_408e_9f03_eb29d705d730.slice/crio-d525372851f9f03013497cdcedad69753b2be1db3ca67f737bda88894e5a6361 WatchSource:0}: Error finding container d525372851f9f03013497cdcedad69753b2be1db3ca67f737bda88894e5a6361: Status 404 returned error can't find the container with id d525372851f9f03013497cdcedad69753b2be1db3ca67f737bda88894e5a6361
Oct 09 08:00:01 crc kubenswrapper[4715]: I1009 08:00:01.293221 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333280-zqp6l" event={"ID":"4d35a0e3-8f99-408e-9f03-eb29d705d730","Type":"ContainerStarted","Data":"bd8419cb68b37888d8ef166693a5caa4076baef592a79621b4ffcf8238f4694c"}
Oct 09 08:00:01 crc kubenswrapper[4715]: I1009 08:00:01.293575 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333280-zqp6l" event={"ID":"4d35a0e3-8f99-408e-9f03-eb29d705d730","Type":"ContainerStarted","Data":"d525372851f9f03013497cdcedad69753b2be1db3ca67f737bda88894e5a6361"}
Oct 09 08:00:01 crc kubenswrapper[4715]: I1009 08:00:01.326314 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6x5c" event={"ID":"a3220b54-5aa6-4d89-8f75-8248f407be16","Type":"ContainerDied","Data":"5884cfe3d3d77ea90a61ae8f394a475136cacbddc3ee95d6e53e045fb5b39b9f"}
Oct 09 08:00:01 crc kubenswrapper[4715]: I1009 08:00:01.326396 4715 scope.go:117] "RemoveContainer" containerID="0b0f1a6705778ca5b91ae3d341a19fd76b2eda2c825654ac7ee120d896c4b22b"
Oct 09 08:00:01 crc kubenswrapper[4715]: I1009 08:00:01.326667 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x6x5c"
Oct 09 08:00:01 crc kubenswrapper[4715]: I1009 08:00:01.331336 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29333280-zqp6l" podStartSLOduration=1.331318916 podStartE2EDuration="1.331318916s" podCreationTimestamp="2025-10-09 08:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:00:01.327395462 +0000 UTC m=+832.020199480" watchObservedRunningTime="2025-10-09 08:00:01.331318916 +0000 UTC m=+832.024122924"
Oct 09 08:00:01 crc kubenswrapper[4715]: I1009 08:00:01.361045 4715 scope.go:117] "RemoveContainer" containerID="484ce8e0e74f9c8ba90c7fade0def88815dcd5a4e9d68ba002e4ed8a9d1f3aec"
Oct 09 08:00:01 crc kubenswrapper[4715]: I1009 08:00:01.376310 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x6x5c"]
Oct 09 08:00:01 crc kubenswrapper[4715]: I1009 08:00:01.385167 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-x6x5c"]
Oct 09 08:00:01 crc kubenswrapper[4715]: I1009 08:00:01.396171 4715 scope.go:117] "RemoveContainer" containerID="9ce4b108d66bf9c28b5d28e257fad40fd49afc5327d5dca508bc4870ae80d311"
Oct 09 08:00:02 crc kubenswrapper[4715]: I1009 08:00:02.145878 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3220b54-5aa6-4d89-8f75-8248f407be16" path="/var/lib/kubelet/pods/a3220b54-5aa6-4d89-8f75-8248f407be16/volumes"
Oct 09 08:00:02 crc kubenswrapper[4715]: I1009 08:00:02.344358 4715 generic.go:334] "Generic (PLEG): container finished" podID="4d35a0e3-8f99-408e-9f03-eb29d705d730" containerID="bd8419cb68b37888d8ef166693a5caa4076baef592a79621b4ffcf8238f4694c" exitCode=0
Oct 09 08:00:02 crc kubenswrapper[4715]: I1009 08:00:02.344457 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333280-zqp6l" event={"ID":"4d35a0e3-8f99-408e-9f03-eb29d705d730","Type":"ContainerDied","Data":"bd8419cb68b37888d8ef166693a5caa4076baef592a79621b4ffcf8238f4694c"}
Oct 09 08:00:03 crc kubenswrapper[4715]: I1009 08:00:03.851444 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2mwsn"]
Oct 09 08:00:03 crc kubenswrapper[4715]: E1009 08:00:03.852690 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3220b54-5aa6-4d89-8f75-8248f407be16" containerName="extract-content"
Oct 09 08:00:03 crc kubenswrapper[4715]: I1009 08:00:03.852786 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3220b54-5aa6-4d89-8f75-8248f407be16" containerName="extract-content"
Oct 09 08:00:03 crc kubenswrapper[4715]: E1009 08:00:03.852803 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3220b54-5aa6-4d89-8f75-8248f407be16" containerName="extract-utilities"
Oct 09 08:00:03 crc kubenswrapper[4715]: I1009 08:00:03.852810 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3220b54-5aa6-4d89-8f75-8248f407be16" containerName="extract-utilities"
Oct 09 08:00:03 crc kubenswrapper[4715]: E1009 08:00:03.852858 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3220b54-5aa6-4d89-8f75-8248f407be16" containerName="registry-server"
Oct 09 08:00:03 crc kubenswrapper[4715]: I1009 08:00:03.852866 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3220b54-5aa6-4d89-8f75-8248f407be16" containerName="registry-server"
Oct 09 08:00:03 crc kubenswrapper[4715]: I1009 08:00:03.853005 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3220b54-5aa6-4d89-8f75-8248f407be16" containerName="registry-server"
Oct 09 08:00:03 crc kubenswrapper[4715]: I1009 08:00:03.854892 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2mwsn"
Oct 09 08:00:03 crc kubenswrapper[4715]: I1009 08:00:03.862648 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2mwsn"]
Oct 09 08:00:03 crc kubenswrapper[4715]: I1009 08:00:03.962538 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebf03b75-e185-426f-8c78-177b3514ff44-utilities\") pod \"community-operators-2mwsn\" (UID: \"ebf03b75-e185-426f-8c78-177b3514ff44\") " pod="openshift-marketplace/community-operators-2mwsn"
Oct 09 08:00:03 crc kubenswrapper[4715]: I1009 08:00:03.963046 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t245w\" (UniqueName: \"kubernetes.io/projected/ebf03b75-e185-426f-8c78-177b3514ff44-kube-api-access-t245w\") pod \"community-operators-2mwsn\" (UID: \"ebf03b75-e185-426f-8c78-177b3514ff44\") " pod="openshift-marketplace/community-operators-2mwsn"
Oct 09 08:00:03 crc kubenswrapper[4715]: I1009 08:00:03.963115 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebf03b75-e185-426f-8c78-177b3514ff44-catalog-content\") pod \"community-operators-2mwsn\" (UID: \"ebf03b75-e185-426f-8c78-177b3514ff44\") " pod="openshift-marketplace/community-operators-2mwsn"
Oct 09 08:00:03 crc kubenswrapper[4715]: I1009 08:00:03.988145 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333280-zqp6l"
Oct 09 08:00:04 crc kubenswrapper[4715]: I1009 08:00:04.064475 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htlb5\" (UniqueName: \"kubernetes.io/projected/4d35a0e3-8f99-408e-9f03-eb29d705d730-kube-api-access-htlb5\") pod \"4d35a0e3-8f99-408e-9f03-eb29d705d730\" (UID: \"4d35a0e3-8f99-408e-9f03-eb29d705d730\") "
Oct 09 08:00:04 crc kubenswrapper[4715]: I1009 08:00:04.064547 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4d35a0e3-8f99-408e-9f03-eb29d705d730-secret-volume\") pod \"4d35a0e3-8f99-408e-9f03-eb29d705d730\" (UID: \"4d35a0e3-8f99-408e-9f03-eb29d705d730\") "
Oct 09 08:00:04 crc kubenswrapper[4715]: I1009 08:00:04.064684 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4d35a0e3-8f99-408e-9f03-eb29d705d730-config-volume\") pod \"4d35a0e3-8f99-408e-9f03-eb29d705d730\" (UID: \"4d35a0e3-8f99-408e-9f03-eb29d705d730\") "
Oct 09 08:00:04 crc kubenswrapper[4715]: I1009 08:00:04.064870 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t245w\" (UniqueName: \"kubernetes.io/projected/ebf03b75-e185-426f-8c78-177b3514ff44-kube-api-access-t245w\") pod \"community-operators-2mwsn\" (UID: \"ebf03b75-e185-426f-8c78-177b3514ff44\") " pod="openshift-marketplace/community-operators-2mwsn"
Oct 09 08:00:04 crc kubenswrapper[4715]: I1009 08:00:04.064918 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebf03b75-e185-426f-8c78-177b3514ff44-catalog-content\") pod \"community-operators-2mwsn\" (UID: \"ebf03b75-e185-426f-8c78-177b3514ff44\") " pod="openshift-marketplace/community-operators-2mwsn"
Oct 09 08:00:04 crc kubenswrapper[4715]: I1009 08:00:04.064962 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebf03b75-e185-426f-8c78-177b3514ff44-utilities\") pod \"community-operators-2mwsn\" (UID: \"ebf03b75-e185-426f-8c78-177b3514ff44\") " pod="openshift-marketplace/community-operators-2mwsn"
Oct 09 08:00:04 crc kubenswrapper[4715]: I1009 08:00:04.065456 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebf03b75-e185-426f-8c78-177b3514ff44-utilities\") pod \"community-operators-2mwsn\" (UID: \"ebf03b75-e185-426f-8c78-177b3514ff44\") " pod="openshift-marketplace/community-operators-2mwsn"
Oct 09 08:00:04 crc kubenswrapper[4715]: I1009 08:00:04.065904 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d35a0e3-8f99-408e-9f03-eb29d705d730-config-volume" (OuterVolumeSpecName: "config-volume") pod "4d35a0e3-8f99-408e-9f03-eb29d705d730" (UID: "4d35a0e3-8f99-408e-9f03-eb29d705d730"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 08:00:04 crc kubenswrapper[4715]: I1009 08:00:04.066095 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebf03b75-e185-426f-8c78-177b3514ff44-catalog-content\") pod \"community-operators-2mwsn\" (UID: \"ebf03b75-e185-426f-8c78-177b3514ff44\") " pod="openshift-marketplace/community-operators-2mwsn"
Oct 09 08:00:04 crc kubenswrapper[4715]: I1009 08:00:04.070095 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d35a0e3-8f99-408e-9f03-eb29d705d730-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4d35a0e3-8f99-408e-9f03-eb29d705d730" (UID: "4d35a0e3-8f99-408e-9f03-eb29d705d730"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 08:00:04 crc kubenswrapper[4715]: I1009 08:00:04.076368 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d35a0e3-8f99-408e-9f03-eb29d705d730-kube-api-access-htlb5" (OuterVolumeSpecName: "kube-api-access-htlb5") pod "4d35a0e3-8f99-408e-9f03-eb29d705d730" (UID: "4d35a0e3-8f99-408e-9f03-eb29d705d730"). InnerVolumeSpecName "kube-api-access-htlb5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 08:00:04 crc kubenswrapper[4715]: I1009 08:00:04.085852 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t245w\" (UniqueName: \"kubernetes.io/projected/ebf03b75-e185-426f-8c78-177b3514ff44-kube-api-access-t245w\") pod \"community-operators-2mwsn\" (UID: \"ebf03b75-e185-426f-8c78-177b3514ff44\") " pod="openshift-marketplace/community-operators-2mwsn"
Oct 09 08:00:04 crc kubenswrapper[4715]: I1009 08:00:04.166342 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htlb5\" (UniqueName: \"kubernetes.io/projected/4d35a0e3-8f99-408e-9f03-eb29d705d730-kube-api-access-htlb5\") on node \"crc\" DevicePath \"\""
Oct 09 08:00:04 crc kubenswrapper[4715]: I1009 08:00:04.166388 4715 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4d35a0e3-8f99-408e-9f03-eb29d705d730-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 09 08:00:04 crc kubenswrapper[4715]: I1009 08:00:04.166402 4715 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4d35a0e3-8f99-408e-9f03-eb29d705d730-config-volume\") on node \"crc\" DevicePath \"\""
Oct 09 08:00:04 crc kubenswrapper[4715]: I1009 08:00:04.185805 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2mwsn"
Oct 09 08:00:04 crc kubenswrapper[4715]: I1009 08:00:04.374839 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-75c7986888-vmsr4" event={"ID":"2f70ba87-a4dd-4a97-a005-f63fec497e9f","Type":"ContainerStarted","Data":"8ca1aaee1592b96a3803698ab79e388d272d20a112b7fccf249bbfe2cc2c8f8f"}
Oct 09 08:00:04 crc kubenswrapper[4715]: I1009 08:00:04.389918 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333280-zqp6l" event={"ID":"4d35a0e3-8f99-408e-9f03-eb29d705d730","Type":"ContainerDied","Data":"d525372851f9f03013497cdcedad69753b2be1db3ca67f737bda88894e5a6361"}
Oct 09 08:00:04 crc kubenswrapper[4715]: I1009 08:00:04.389961 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d525372851f9f03013497cdcedad69753b2be1db3ca67f737bda88894e5a6361"
Oct 09 08:00:04 crc kubenswrapper[4715]: I1009 08:00:04.390037 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333280-zqp6l"
Oct 09 08:00:04 crc kubenswrapper[4715]: I1009 08:00:04.691771 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2mwsn"]
Oct 09 08:00:04 crc kubenswrapper[4715]: W1009 08:00:04.697121 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebf03b75_e185_426f_8c78_177b3514ff44.slice/crio-5f40631d84a4af9952e842c4288badc9349e829ac6a7ffc015edb20a473d9f11 WatchSource:0}: Error finding container 5f40631d84a4af9952e842c4288badc9349e829ac6a7ffc015edb20a473d9f11: Status 404 returned error can't find the container with id 5f40631d84a4af9952e842c4288badc9349e829ac6a7ffc015edb20a473d9f11
Oct 09 08:00:05 crc kubenswrapper[4715]: I1009 08:00:05.399261 4715 generic.go:334] "Generic (PLEG): container finished" podID="ebf03b75-e185-426f-8c78-177b3514ff44" containerID="e75cc499439c1c6cce808ddc24fea5208e1845c868bf283cf2f9aa86a31b0417" exitCode=0
Oct 09 08:00:05 crc kubenswrapper[4715]: I1009 08:00:05.399307 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2mwsn" event={"ID":"ebf03b75-e185-426f-8c78-177b3514ff44","Type":"ContainerDied","Data":"e75cc499439c1c6cce808ddc24fea5208e1845c868bf283cf2f9aa86a31b0417"}
Oct 09 08:00:05 crc kubenswrapper[4715]: I1009 08:00:05.399653 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2mwsn" event={"ID":"ebf03b75-e185-426f-8c78-177b3514ff44","Type":"ContainerStarted","Data":"5f40631d84a4af9952e842c4288badc9349e829ac6a7ffc015edb20a473d9f11"}
Oct 09 08:00:07 crc kubenswrapper[4715]: I1009 08:00:07.413486 4715 generic.go:334] "Generic (PLEG): container finished" podID="ebf03b75-e185-426f-8c78-177b3514ff44" containerID="afd473bf10cef9e2aa88ef5bc7288f04b6b63247daef51e2fcd38723745d6f62" exitCode=0
Oct 09 08:00:07 crc kubenswrapper[4715]: I1009 08:00:07.413542 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2mwsn" event={"ID":"ebf03b75-e185-426f-8c78-177b3514ff44","Type":"ContainerDied","Data":"afd473bf10cef9e2aa88ef5bc7288f04b6b63247daef51e2fcd38723745d6f62"}
Oct 09 08:00:07 crc kubenswrapper[4715]: I1009 08:00:07.416541 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-75c7986888-vmsr4" event={"ID":"2f70ba87-a4dd-4a97-a005-f63fec497e9f","Type":"ContainerStarted","Data":"e714500b37cd601e107c02f969d9c1aff04d616b96b8a29324d2d81b7ce37d05"}
Oct 09 08:00:07 crc kubenswrapper[4715]: I1009 08:00:07.417168 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-75c7986888-vmsr4"
Oct 09 08:00:07 crc kubenswrapper[4715]: I1009 08:00:07.484059 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-75c7986888-vmsr4" podStartSLOduration=1.892401413 podStartE2EDuration="8.484030514s" podCreationTimestamp="2025-10-09 07:59:59 +0000 UTC" firstStartedPulling="2025-10-09 08:00:00.079308615 +0000 UTC m=+830.772112623" lastFinishedPulling="2025-10-09 08:00:06.670937716 +0000 UTC m=+837.363741724" observedRunningTime="2025-10-09 08:00:07.481852701 +0000 UTC m=+838.174656719" watchObservedRunningTime="2025-10-09 08:00:07.484030514 +0000 UTC m=+838.176834522"
Oct 09 08:00:08 crc kubenswrapper[4715]: I1009 08:00:08.424571 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2mwsn" event={"ID":"ebf03b75-e185-426f-8c78-177b3514ff44","Type":"ContainerStarted","Data":"2e62733fd526f5307181f8559f3fc219ad02ac0ba96a48a8979a8172e0835609"}
Oct 09 08:00:08 crc kubenswrapper[4715]: I1009 08:00:08.443360 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="openshift-marketplace/community-operators-2mwsn" podStartSLOduration=3.090270951 podStartE2EDuration="5.443341734s" podCreationTimestamp="2025-10-09 08:00:03 +0000 UTC" firstStartedPulling="2025-10-09 08:00:05.476137411 +0000 UTC m=+836.168941419" lastFinishedPulling="2025-10-09 08:00:07.829208194 +0000 UTC m=+838.522012202" observedRunningTime="2025-10-09 08:00:08.442202131 +0000 UTC m=+839.135006159" watchObservedRunningTime="2025-10-09 08:00:08.443341734 +0000 UTC m=+839.136145742" Oct 09 08:00:09 crc kubenswrapper[4715]: I1009 08:00:09.430747 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-75c7986888-vmsr4" Oct 09 08:00:14 crc kubenswrapper[4715]: I1009 08:00:14.187039 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2mwsn" Oct 09 08:00:14 crc kubenswrapper[4715]: I1009 08:00:14.187656 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2mwsn" Oct 09 08:00:14 crc kubenswrapper[4715]: I1009 08:00:14.230927 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2mwsn" Oct 09 08:00:14 crc kubenswrapper[4715]: I1009 08:00:14.529674 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2mwsn" Oct 09 08:00:14 crc kubenswrapper[4715]: I1009 08:00:14.578362 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2mwsn"] Oct 09 08:00:16 crc kubenswrapper[4715]: I1009 08:00:16.469780 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2mwsn" podUID="ebf03b75-e185-426f-8c78-177b3514ff44" containerName="registry-server" 
containerID="cri-o://2e62733fd526f5307181f8559f3fc219ad02ac0ba96a48a8979a8172e0835609" gracePeriod=2 Oct 09 08:00:16 crc kubenswrapper[4715]: I1009 08:00:16.949513 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2mwsn" Oct 09 08:00:17 crc kubenswrapper[4715]: I1009 08:00:17.048473 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t245w\" (UniqueName: \"kubernetes.io/projected/ebf03b75-e185-426f-8c78-177b3514ff44-kube-api-access-t245w\") pod \"ebf03b75-e185-426f-8c78-177b3514ff44\" (UID: \"ebf03b75-e185-426f-8c78-177b3514ff44\") " Oct 09 08:00:17 crc kubenswrapper[4715]: I1009 08:00:17.048627 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebf03b75-e185-426f-8c78-177b3514ff44-catalog-content\") pod \"ebf03b75-e185-426f-8c78-177b3514ff44\" (UID: \"ebf03b75-e185-426f-8c78-177b3514ff44\") " Oct 09 08:00:17 crc kubenswrapper[4715]: I1009 08:00:17.048695 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebf03b75-e185-426f-8c78-177b3514ff44-utilities\") pod \"ebf03b75-e185-426f-8c78-177b3514ff44\" (UID: \"ebf03b75-e185-426f-8c78-177b3514ff44\") " Oct 09 08:00:17 crc kubenswrapper[4715]: I1009 08:00:17.049920 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebf03b75-e185-426f-8c78-177b3514ff44-utilities" (OuterVolumeSpecName: "utilities") pod "ebf03b75-e185-426f-8c78-177b3514ff44" (UID: "ebf03b75-e185-426f-8c78-177b3514ff44"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:00:17 crc kubenswrapper[4715]: I1009 08:00:17.050440 4715 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebf03b75-e185-426f-8c78-177b3514ff44-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 08:00:17 crc kubenswrapper[4715]: I1009 08:00:17.061591 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebf03b75-e185-426f-8c78-177b3514ff44-kube-api-access-t245w" (OuterVolumeSpecName: "kube-api-access-t245w") pod "ebf03b75-e185-426f-8c78-177b3514ff44" (UID: "ebf03b75-e185-426f-8c78-177b3514ff44"). InnerVolumeSpecName "kube-api-access-t245w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:00:17 crc kubenswrapper[4715]: I1009 08:00:17.106843 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebf03b75-e185-426f-8c78-177b3514ff44-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ebf03b75-e185-426f-8c78-177b3514ff44" (UID: "ebf03b75-e185-426f-8c78-177b3514ff44"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:00:17 crc kubenswrapper[4715]: I1009 08:00:17.152218 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t245w\" (UniqueName: \"kubernetes.io/projected/ebf03b75-e185-426f-8c78-177b3514ff44-kube-api-access-t245w\") on node \"crc\" DevicePath \"\"" Oct 09 08:00:17 crc kubenswrapper[4715]: I1009 08:00:17.152257 4715 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebf03b75-e185-426f-8c78-177b3514ff44-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 08:00:17 crc kubenswrapper[4715]: I1009 08:00:17.498528 4715 generic.go:334] "Generic (PLEG): container finished" podID="ebf03b75-e185-426f-8c78-177b3514ff44" containerID="2e62733fd526f5307181f8559f3fc219ad02ac0ba96a48a8979a8172e0835609" exitCode=0 Oct 09 08:00:17 crc kubenswrapper[4715]: I1009 08:00:17.498573 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2mwsn" event={"ID":"ebf03b75-e185-426f-8c78-177b3514ff44","Type":"ContainerDied","Data":"2e62733fd526f5307181f8559f3fc219ad02ac0ba96a48a8979a8172e0835609"} Oct 09 08:00:17 crc kubenswrapper[4715]: I1009 08:00:17.498605 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2mwsn" event={"ID":"ebf03b75-e185-426f-8c78-177b3514ff44","Type":"ContainerDied","Data":"5f40631d84a4af9952e842c4288badc9349e829ac6a7ffc015edb20a473d9f11"} Oct 09 08:00:17 crc kubenswrapper[4715]: I1009 08:00:17.498611 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2mwsn" Oct 09 08:00:17 crc kubenswrapper[4715]: I1009 08:00:17.498626 4715 scope.go:117] "RemoveContainer" containerID="2e62733fd526f5307181f8559f3fc219ad02ac0ba96a48a8979a8172e0835609" Oct 09 08:00:17 crc kubenswrapper[4715]: I1009 08:00:17.525145 4715 scope.go:117] "RemoveContainer" containerID="afd473bf10cef9e2aa88ef5bc7288f04b6b63247daef51e2fcd38723745d6f62" Oct 09 08:00:17 crc kubenswrapper[4715]: I1009 08:00:17.542438 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2mwsn"] Oct 09 08:00:17 crc kubenswrapper[4715]: I1009 08:00:17.550662 4715 scope.go:117] "RemoveContainer" containerID="e75cc499439c1c6cce808ddc24fea5208e1845c868bf283cf2f9aa86a31b0417" Oct 09 08:00:17 crc kubenswrapper[4715]: I1009 08:00:17.566671 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2mwsn"] Oct 09 08:00:17 crc kubenswrapper[4715]: I1009 08:00:17.622612 4715 scope.go:117] "RemoveContainer" containerID="2e62733fd526f5307181f8559f3fc219ad02ac0ba96a48a8979a8172e0835609" Oct 09 08:00:17 crc kubenswrapper[4715]: E1009 08:00:17.626578 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e62733fd526f5307181f8559f3fc219ad02ac0ba96a48a8979a8172e0835609\": container with ID starting with 2e62733fd526f5307181f8559f3fc219ad02ac0ba96a48a8979a8172e0835609 not found: ID does not exist" containerID="2e62733fd526f5307181f8559f3fc219ad02ac0ba96a48a8979a8172e0835609" Oct 09 08:00:17 crc kubenswrapper[4715]: I1009 08:00:17.626630 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e62733fd526f5307181f8559f3fc219ad02ac0ba96a48a8979a8172e0835609"} err="failed to get container status \"2e62733fd526f5307181f8559f3fc219ad02ac0ba96a48a8979a8172e0835609\": rpc error: code = NotFound desc = could not find 
container \"2e62733fd526f5307181f8559f3fc219ad02ac0ba96a48a8979a8172e0835609\": container with ID starting with 2e62733fd526f5307181f8559f3fc219ad02ac0ba96a48a8979a8172e0835609 not found: ID does not exist" Oct 09 08:00:17 crc kubenswrapper[4715]: I1009 08:00:17.626660 4715 scope.go:117] "RemoveContainer" containerID="afd473bf10cef9e2aa88ef5bc7288f04b6b63247daef51e2fcd38723745d6f62" Oct 09 08:00:17 crc kubenswrapper[4715]: E1009 08:00:17.633564 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afd473bf10cef9e2aa88ef5bc7288f04b6b63247daef51e2fcd38723745d6f62\": container with ID starting with afd473bf10cef9e2aa88ef5bc7288f04b6b63247daef51e2fcd38723745d6f62 not found: ID does not exist" containerID="afd473bf10cef9e2aa88ef5bc7288f04b6b63247daef51e2fcd38723745d6f62" Oct 09 08:00:17 crc kubenswrapper[4715]: I1009 08:00:17.633641 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afd473bf10cef9e2aa88ef5bc7288f04b6b63247daef51e2fcd38723745d6f62"} err="failed to get container status \"afd473bf10cef9e2aa88ef5bc7288f04b6b63247daef51e2fcd38723745d6f62\": rpc error: code = NotFound desc = could not find container \"afd473bf10cef9e2aa88ef5bc7288f04b6b63247daef51e2fcd38723745d6f62\": container with ID starting with afd473bf10cef9e2aa88ef5bc7288f04b6b63247daef51e2fcd38723745d6f62 not found: ID does not exist" Oct 09 08:00:17 crc kubenswrapper[4715]: I1009 08:00:17.633670 4715 scope.go:117] "RemoveContainer" containerID="e75cc499439c1c6cce808ddc24fea5208e1845c868bf283cf2f9aa86a31b0417" Oct 09 08:00:17 crc kubenswrapper[4715]: E1009 08:00:17.639566 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e75cc499439c1c6cce808ddc24fea5208e1845c868bf283cf2f9aa86a31b0417\": container with ID starting with e75cc499439c1c6cce808ddc24fea5208e1845c868bf283cf2f9aa86a31b0417 not found: ID does 
not exist" containerID="e75cc499439c1c6cce808ddc24fea5208e1845c868bf283cf2f9aa86a31b0417" Oct 09 08:00:17 crc kubenswrapper[4715]: I1009 08:00:17.639613 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e75cc499439c1c6cce808ddc24fea5208e1845c868bf283cf2f9aa86a31b0417"} err="failed to get container status \"e75cc499439c1c6cce808ddc24fea5208e1845c868bf283cf2f9aa86a31b0417\": rpc error: code = NotFound desc = could not find container \"e75cc499439c1c6cce808ddc24fea5208e1845c868bf283cf2f9aa86a31b0417\": container with ID starting with e75cc499439c1c6cce808ddc24fea5208e1845c868bf283cf2f9aa86a31b0417 not found: ID does not exist" Oct 09 08:00:18 crc kubenswrapper[4715]: I1009 08:00:18.144876 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebf03b75-e185-426f-8c78-177b3514ff44" path="/var/lib/kubelet/pods/ebf03b75-e185-426f-8c78-177b3514ff44/volumes" Oct 09 08:00:22 crc kubenswrapper[4715]: I1009 08:00:22.520100 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sxpbc"] Oct 09 08:00:22 crc kubenswrapper[4715]: E1009 08:00:22.520712 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebf03b75-e185-426f-8c78-177b3514ff44" containerName="registry-server" Oct 09 08:00:22 crc kubenswrapper[4715]: I1009 08:00:22.520726 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebf03b75-e185-426f-8c78-177b3514ff44" containerName="registry-server" Oct 09 08:00:22 crc kubenswrapper[4715]: E1009 08:00:22.520738 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d35a0e3-8f99-408e-9f03-eb29d705d730" containerName="collect-profiles" Oct 09 08:00:22 crc kubenswrapper[4715]: I1009 08:00:22.520743 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d35a0e3-8f99-408e-9f03-eb29d705d730" containerName="collect-profiles" Oct 09 08:00:22 crc kubenswrapper[4715]: E1009 08:00:22.520753 4715 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="ebf03b75-e185-426f-8c78-177b3514ff44" containerName="extract-utilities" Oct 09 08:00:22 crc kubenswrapper[4715]: I1009 08:00:22.520760 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebf03b75-e185-426f-8c78-177b3514ff44" containerName="extract-utilities" Oct 09 08:00:22 crc kubenswrapper[4715]: E1009 08:00:22.520767 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebf03b75-e185-426f-8c78-177b3514ff44" containerName="extract-content" Oct 09 08:00:22 crc kubenswrapper[4715]: I1009 08:00:22.520774 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebf03b75-e185-426f-8c78-177b3514ff44" containerName="extract-content" Oct 09 08:00:22 crc kubenswrapper[4715]: I1009 08:00:22.520908 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d35a0e3-8f99-408e-9f03-eb29d705d730" containerName="collect-profiles" Oct 09 08:00:22 crc kubenswrapper[4715]: I1009 08:00:22.520924 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebf03b75-e185-426f-8c78-177b3514ff44" containerName="registry-server" Oct 09 08:00:22 crc kubenswrapper[4715]: I1009 08:00:22.521982 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sxpbc" Oct 09 08:00:22 crc kubenswrapper[4715]: I1009 08:00:22.554109 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sxpbc"] Oct 09 08:00:22 crc kubenswrapper[4715]: I1009 08:00:22.649093 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fefbe0c-de73-415c-a42f-77742a8afab2-utilities\") pod \"redhat-marketplace-sxpbc\" (UID: \"8fefbe0c-de73-415c-a42f-77742a8afab2\") " pod="openshift-marketplace/redhat-marketplace-sxpbc" Oct 09 08:00:22 crc kubenswrapper[4715]: I1009 08:00:22.649162 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fefbe0c-de73-415c-a42f-77742a8afab2-catalog-content\") pod \"redhat-marketplace-sxpbc\" (UID: \"8fefbe0c-de73-415c-a42f-77742a8afab2\") " pod="openshift-marketplace/redhat-marketplace-sxpbc" Oct 09 08:00:22 crc kubenswrapper[4715]: I1009 08:00:22.649183 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp2gm\" (UniqueName: \"kubernetes.io/projected/8fefbe0c-de73-415c-a42f-77742a8afab2-kube-api-access-jp2gm\") pod \"redhat-marketplace-sxpbc\" (UID: \"8fefbe0c-de73-415c-a42f-77742a8afab2\") " pod="openshift-marketplace/redhat-marketplace-sxpbc" Oct 09 08:00:22 crc kubenswrapper[4715]: I1009 08:00:22.750046 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fefbe0c-de73-415c-a42f-77742a8afab2-utilities\") pod \"redhat-marketplace-sxpbc\" (UID: \"8fefbe0c-de73-415c-a42f-77742a8afab2\") " pod="openshift-marketplace/redhat-marketplace-sxpbc" Oct 09 08:00:22 crc kubenswrapper[4715]: I1009 08:00:22.750132 4715 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fefbe0c-de73-415c-a42f-77742a8afab2-catalog-content\") pod \"redhat-marketplace-sxpbc\" (UID: \"8fefbe0c-de73-415c-a42f-77742a8afab2\") " pod="openshift-marketplace/redhat-marketplace-sxpbc" Oct 09 08:00:22 crc kubenswrapper[4715]: I1009 08:00:22.750158 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jp2gm\" (UniqueName: \"kubernetes.io/projected/8fefbe0c-de73-415c-a42f-77742a8afab2-kube-api-access-jp2gm\") pod \"redhat-marketplace-sxpbc\" (UID: \"8fefbe0c-de73-415c-a42f-77742a8afab2\") " pod="openshift-marketplace/redhat-marketplace-sxpbc" Oct 09 08:00:22 crc kubenswrapper[4715]: I1009 08:00:22.750694 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fefbe0c-de73-415c-a42f-77742a8afab2-utilities\") pod \"redhat-marketplace-sxpbc\" (UID: \"8fefbe0c-de73-415c-a42f-77742a8afab2\") " pod="openshift-marketplace/redhat-marketplace-sxpbc" Oct 09 08:00:22 crc kubenswrapper[4715]: I1009 08:00:22.750762 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fefbe0c-de73-415c-a42f-77742a8afab2-catalog-content\") pod \"redhat-marketplace-sxpbc\" (UID: \"8fefbe0c-de73-415c-a42f-77742a8afab2\") " pod="openshift-marketplace/redhat-marketplace-sxpbc" Oct 09 08:00:22 crc kubenswrapper[4715]: I1009 08:00:22.776982 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp2gm\" (UniqueName: \"kubernetes.io/projected/8fefbe0c-de73-415c-a42f-77742a8afab2-kube-api-access-jp2gm\") pod \"redhat-marketplace-sxpbc\" (UID: \"8fefbe0c-de73-415c-a42f-77742a8afab2\") " pod="openshift-marketplace/redhat-marketplace-sxpbc" Oct 09 08:00:22 crc kubenswrapper[4715]: I1009 08:00:22.842223 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sxpbc" Oct 09 08:00:23 crc kubenswrapper[4715]: I1009 08:00:23.083780 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sxpbc"] Oct 09 08:00:23 crc kubenswrapper[4715]: I1009 08:00:23.537888 4715 generic.go:334] "Generic (PLEG): container finished" podID="8fefbe0c-de73-415c-a42f-77742a8afab2" containerID="0a9457d18a0f2331b6fd8264530ec35a667ad4f04cc4e7aea2cf59657d325a8f" exitCode=0 Oct 09 08:00:23 crc kubenswrapper[4715]: I1009 08:00:23.537944 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxpbc" event={"ID":"8fefbe0c-de73-415c-a42f-77742a8afab2","Type":"ContainerDied","Data":"0a9457d18a0f2331b6fd8264530ec35a667ad4f04cc4e7aea2cf59657d325a8f"} Oct 09 08:00:23 crc kubenswrapper[4715]: I1009 08:00:23.537976 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxpbc" event={"ID":"8fefbe0c-de73-415c-a42f-77742a8afab2","Type":"ContainerStarted","Data":"95b460695b69283a152ffd32cc9c5cd6c07eb50dd396d9b5b4d94dfd33bd5357"} Oct 09 08:00:25 crc kubenswrapper[4715]: I1009 08:00:25.551007 4715 generic.go:334] "Generic (PLEG): container finished" podID="8fefbe0c-de73-415c-a42f-77742a8afab2" containerID="bd3a9032a83368791b9f9dba116e2a08acd8f39a40aa9224e051e7d070e79e3e" exitCode=0 Oct 09 08:00:25 crc kubenswrapper[4715]: I1009 08:00:25.551746 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxpbc" event={"ID":"8fefbe0c-de73-415c-a42f-77742a8afab2","Type":"ContainerDied","Data":"bd3a9032a83368791b9f9dba116e2a08acd8f39a40aa9224e051e7d070e79e3e"} Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.371482 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-64f84fcdbb-rsthg"] Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.373405 4715 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-rsthg" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.377245 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-5k6n2" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.389003 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-64f84fcdbb-rsthg"] Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.400795 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-59cdc64769-gqdw4"] Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.402291 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-gqdw4" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.405399 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-5m2ww" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.406735 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2zfv\" (UniqueName: \"kubernetes.io/projected/e4603d13-cf9d-4d8d-82db-3b182aa42e74-kube-api-access-q2zfv\") pod \"barbican-operator-controller-manager-64f84fcdbb-rsthg\" (UID: \"e4603d13-cf9d-4d8d-82db-3b182aa42e74\") " pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-rsthg" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.411058 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-59cdc64769-gqdw4"] Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.419385 4715 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/designate-operator-controller-manager-687df44cdb-v8zt5"] Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.420705 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-v8zt5" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.423273 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-qpwvk" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.441657 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-687df44cdb-v8zt5"] Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.468485 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-7bb46cd7d-cfkg2"] Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.469937 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-cfkg2" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.472134 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-b9v2x" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.490127 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7bb46cd7d-cfkg2"] Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.499153 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-6d9967f8dd-qjwrk"] Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.512277 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6m2x\" (UniqueName: \"kubernetes.io/projected/e11fc796-233e-4c17-b953-1c6211f0c679-kube-api-access-z6m2x\") pod \"glance-operator-controller-manager-7bb46cd7d-cfkg2\" (UID: \"e11fc796-233e-4c17-b953-1c6211f0c679\") " pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-cfkg2" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.512368 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgms5\" (UniqueName: \"kubernetes.io/projected/675a8b37-dcfc-414e-9218-7741ce9ec2d5-kube-api-access-kgms5\") pod \"designate-operator-controller-manager-687df44cdb-v8zt5\" (UID: \"675a8b37-dcfc-414e-9218-7741ce9ec2d5\") " pod="openstack-operators/designate-operator-controller-manager-687df44cdb-v8zt5" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.512435 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2zfv\" (UniqueName: \"kubernetes.io/projected/e4603d13-cf9d-4d8d-82db-3b182aa42e74-kube-api-access-q2zfv\") pod \"barbican-operator-controller-manager-64f84fcdbb-rsthg\" (UID: 
\"e4603d13-cf9d-4d8d-82db-3b182aa42e74\") " pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-rsthg" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.512479 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbnkx\" (UniqueName: \"kubernetes.io/projected/68110204-494d-4a10-b25d-0996c9dd1c6f-kube-api-access-nbnkx\") pod \"cinder-operator-controller-manager-59cdc64769-gqdw4\" (UID: \"68110204-494d-4a10-b25d-0996c9dd1c6f\") " pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-gqdw4" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.522711 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d74794d9b-jps2w"] Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.524211 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-jps2w" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.524525 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-6d9967f8dd-qjwrk"] Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.524828 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-qjwrk" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.533637 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-585fc5b659-ckx8h"] Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.534163 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-q9z47" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.534880 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-zfg9x" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.548240 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d74794d9b-jps2w"] Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.548656 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-ckx8h" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.556151 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.556569 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-j6drt" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.556786 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-74cb5cbc49-pxmc4"] Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.557726 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2zfv\" (UniqueName: \"kubernetes.io/projected/e4603d13-cf9d-4d8d-82db-3b182aa42e74-kube-api-access-q2zfv\") pod \"barbican-operator-controller-manager-64f84fcdbb-rsthg\" (UID: \"e4603d13-cf9d-4d8d-82db-3b182aa42e74\") " pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-rsthg" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.558074 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-pxmc4" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.561787 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-fmplt" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.567745 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-ddb98f99b-7zmwl"] Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.569078 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-7zmwl" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.587941 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-jgbmq" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.600803 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-585fc5b659-ckx8h"] Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.607735 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-74cb5cbc49-pxmc4"] Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.614739 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b39f3e52-f97a-4bf4-934d-88267bddae91-cert\") pod \"infra-operator-controller-manager-585fc5b659-ckx8h\" (UID: \"b39f3e52-f97a-4bf4-934d-88267bddae91\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-ckx8h" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.614861 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbnkx\" (UniqueName: \"kubernetes.io/projected/68110204-494d-4a10-b25d-0996c9dd1c6f-kube-api-access-nbnkx\") pod \"cinder-operator-controller-manager-59cdc64769-gqdw4\" (UID: \"68110204-494d-4a10-b25d-0996c9dd1c6f\") " pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-gqdw4" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.614958 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgsqr\" (UniqueName: \"kubernetes.io/projected/2730bf5c-42b9-4739-a2bc-6250bfcb997a-kube-api-access-dgsqr\") pod \"horizon-operator-controller-manager-6d74794d9b-jps2w\" (UID: 
\"2730bf5c-42b9-4739-a2bc-6250bfcb997a\") " pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-jps2w" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.615026 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxs72\" (UniqueName: \"kubernetes.io/projected/32b6325f-e041-492d-a113-638dcef15310-kube-api-access-kxs72\") pod \"heat-operator-controller-manager-6d9967f8dd-qjwrk\" (UID: \"32b6325f-e041-492d-a113-638dcef15310\") " pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-qjwrk" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.615161 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nz2r\" (UniqueName: \"kubernetes.io/projected/c990a4aa-4a8e-499b-bf58-99c469af523e-kube-api-access-9nz2r\") pod \"ironic-operator-controller-manager-74cb5cbc49-pxmc4\" (UID: \"c990a4aa-4a8e-499b-bf58-99c469af523e\") " pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-pxmc4" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.615296 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6m2x\" (UniqueName: \"kubernetes.io/projected/e11fc796-233e-4c17-b953-1c6211f0c679-kube-api-access-z6m2x\") pod \"glance-operator-controller-manager-7bb46cd7d-cfkg2\" (UID: \"e11fc796-233e-4c17-b953-1c6211f0c679\") " pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-cfkg2" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.615488 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgms5\" (UniqueName: \"kubernetes.io/projected/675a8b37-dcfc-414e-9218-7741ce9ec2d5-kube-api-access-kgms5\") pod \"designate-operator-controller-manager-687df44cdb-v8zt5\" (UID: \"675a8b37-dcfc-414e-9218-7741ce9ec2d5\") " pod="openstack-operators/designate-operator-controller-manager-687df44cdb-v8zt5" 
Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.615558 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4mw7\" (UniqueName: \"kubernetes.io/projected/9657b932-fe63-4417-8463-8af21e9c9790-kube-api-access-z4mw7\") pod \"keystone-operator-controller-manager-ddb98f99b-7zmwl\" (UID: \"9657b932-fe63-4417-8463-8af21e9c9790\") " pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-7zmwl" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.615589 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8x4k\" (UniqueName: \"kubernetes.io/projected/b39f3e52-f97a-4bf4-934d-88267bddae91-kube-api-access-w8x4k\") pod \"infra-operator-controller-manager-585fc5b659-ckx8h\" (UID: \"b39f3e52-f97a-4bf4-934d-88267bddae91\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-ckx8h" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.621342 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-ddb98f99b-7zmwl"] Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.628947 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-59578bc799-6zczp"] Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.629952 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-59578bc799-6zczp" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.634007 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-gxzvb" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.643041 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgms5\" (UniqueName: \"kubernetes.io/projected/675a8b37-dcfc-414e-9218-7741ce9ec2d5-kube-api-access-kgms5\") pod \"designate-operator-controller-manager-687df44cdb-v8zt5\" (UID: \"675a8b37-dcfc-414e-9218-7741ce9ec2d5\") " pod="openstack-operators/designate-operator-controller-manager-687df44cdb-v8zt5" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.654906 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6m2x\" (UniqueName: \"kubernetes.io/projected/e11fc796-233e-4c17-b953-1c6211f0c679-kube-api-access-z6m2x\") pod \"glance-operator-controller-manager-7bb46cd7d-cfkg2\" (UID: \"e11fc796-233e-4c17-b953-1c6211f0c679\") " pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-cfkg2" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.661140 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbnkx\" (UniqueName: \"kubernetes.io/projected/68110204-494d-4a10-b25d-0996c9dd1c6f-kube-api-access-nbnkx\") pod \"cinder-operator-controller-manager-59cdc64769-gqdw4\" (UID: \"68110204-494d-4a10-b25d-0996c9dd1c6f\") " pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-gqdw4" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.666302 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-59578bc799-6zczp"] Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.690524 4715 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/neutron-operator-controller-manager-797d478b46-kqhg2"] Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.691969 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-kqhg2" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.697967 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-6nhvz" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.703857 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5777b4f897-ggwkb"] Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.705052 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-ggwkb" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.707576 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-ggjr7" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.708106 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-rsthg" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.731732 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcmz4\" (UniqueName: \"kubernetes.io/projected/9cae911a-5b69-4cf4-aa26-4adb4457eec4-kube-api-access-jcmz4\") pod \"manila-operator-controller-manager-59578bc799-6zczp\" (UID: \"9cae911a-5b69-4cf4-aa26-4adb4457eec4\") " pod="openstack-operators/manila-operator-controller-manager-59578bc799-6zczp" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.731816 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgsqr\" (UniqueName: \"kubernetes.io/projected/2730bf5c-42b9-4739-a2bc-6250bfcb997a-kube-api-access-dgsqr\") pod \"horizon-operator-controller-manager-6d74794d9b-jps2w\" (UID: \"2730bf5c-42b9-4739-a2bc-6250bfcb997a\") " pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-jps2w" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.731864 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxs72\" (UniqueName: \"kubernetes.io/projected/32b6325f-e041-492d-a113-638dcef15310-kube-api-access-kxs72\") pod \"heat-operator-controller-manager-6d9967f8dd-qjwrk\" (UID: \"32b6325f-e041-492d-a113-638dcef15310\") " pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-qjwrk" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.731935 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nz2r\" (UniqueName: \"kubernetes.io/projected/c990a4aa-4a8e-499b-bf58-99c469af523e-kube-api-access-9nz2r\") pod \"ironic-operator-controller-manager-74cb5cbc49-pxmc4\" (UID: \"c990a4aa-4a8e-499b-bf58-99c469af523e\") " pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-pxmc4" Oct 09 08:00:26 crc kubenswrapper[4715]: 
I1009 08:00:26.732028 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4mw7\" (UniqueName: \"kubernetes.io/projected/9657b932-fe63-4417-8463-8af21e9c9790-kube-api-access-z4mw7\") pod \"keystone-operator-controller-manager-ddb98f99b-7zmwl\" (UID: \"9657b932-fe63-4417-8463-8af21e9c9790\") " pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-7zmwl" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.732066 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8x4k\" (UniqueName: \"kubernetes.io/projected/b39f3e52-f97a-4bf4-934d-88267bddae91-kube-api-access-w8x4k\") pod \"infra-operator-controller-manager-585fc5b659-ckx8h\" (UID: \"b39f3e52-f97a-4bf4-934d-88267bddae91\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-ckx8h" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.732114 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b39f3e52-f97a-4bf4-934d-88267bddae91-cert\") pod \"infra-operator-controller-manager-585fc5b659-ckx8h\" (UID: \"b39f3e52-f97a-4bf4-934d-88267bddae91\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-ckx8h" Oct 09 08:00:26 crc kubenswrapper[4715]: E1009 08:00:26.732305 4715 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 09 08:00:26 crc kubenswrapper[4715]: E1009 08:00:26.732382 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b39f3e52-f97a-4bf4-934d-88267bddae91-cert podName:b39f3e52-f97a-4bf4-934d-88267bddae91 nodeName:}" failed. No retries permitted until 2025-10-09 08:00:27.232359257 +0000 UTC m=+857.925163265 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b39f3e52-f97a-4bf4-934d-88267bddae91-cert") pod "infra-operator-controller-manager-585fc5b659-ckx8h" (UID: "b39f3e52-f97a-4bf4-934d-88267bddae91") : secret "infra-operator-webhook-server-cert" not found Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.751898 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-gqdw4" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.772694 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-v8zt5" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.796693 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxs72\" (UniqueName: \"kubernetes.io/projected/32b6325f-e041-492d-a113-638dcef15310-kube-api-access-kxs72\") pod \"heat-operator-controller-manager-6d9967f8dd-qjwrk\" (UID: \"32b6325f-e041-492d-a113-638dcef15310\") " pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-qjwrk" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.799579 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgsqr\" (UniqueName: \"kubernetes.io/projected/2730bf5c-42b9-4739-a2bc-6250bfcb997a-kube-api-access-dgsqr\") pod \"horizon-operator-controller-manager-6d74794d9b-jps2w\" (UID: \"2730bf5c-42b9-4739-a2bc-6250bfcb997a\") " pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-jps2w" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.800656 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8x4k\" (UniqueName: \"kubernetes.io/projected/b39f3e52-f97a-4bf4-934d-88267bddae91-kube-api-access-w8x4k\") pod \"infra-operator-controller-manager-585fc5b659-ckx8h\" (UID: \"b39f3e52-f97a-4bf4-934d-88267bddae91\") " 
pod="openstack-operators/infra-operator-controller-manager-585fc5b659-ckx8h" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.801043 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-cfkg2" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.802207 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nz2r\" (UniqueName: \"kubernetes.io/projected/c990a4aa-4a8e-499b-bf58-99c469af523e-kube-api-access-9nz2r\") pod \"ironic-operator-controller-manager-74cb5cbc49-pxmc4\" (UID: \"c990a4aa-4a8e-499b-bf58-99c469af523e\") " pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-pxmc4" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.803132 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4mw7\" (UniqueName: \"kubernetes.io/projected/9657b932-fe63-4417-8463-8af21e9c9790-kube-api-access-z4mw7\") pod \"keystone-operator-controller-manager-ddb98f99b-7zmwl\" (UID: \"9657b932-fe63-4417-8463-8af21e9c9790\") " pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-7zmwl" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.803406 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-797d478b46-kqhg2"] Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.820579 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-57bb74c7bf-6xd27"] Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.821777 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5777b4f897-ggwkb"] Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.821922 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-6xd27" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.827220 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-nncls" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.827504 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-57bb74c7bf-6xd27"] Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.833736 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc667\" (UniqueName: \"kubernetes.io/projected/ecf88dec-957f-4221-8ded-d779392c2793-kube-api-access-dc667\") pod \"neutron-operator-controller-manager-797d478b46-kqhg2\" (UID: \"ecf88dec-957f-4221-8ded-d779392c2793\") " pod="openstack-operators/neutron-operator-controller-manager-797d478b46-kqhg2" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.833904 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcmz4\" (UniqueName: \"kubernetes.io/projected/9cae911a-5b69-4cf4-aa26-4adb4457eec4-kube-api-access-jcmz4\") pod \"manila-operator-controller-manager-59578bc799-6zczp\" (UID: \"9cae911a-5b69-4cf4-aa26-4adb4457eec4\") " pod="openstack-operators/manila-operator-controller-manager-59578bc799-6zczp" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.833940 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4pqs\" (UniqueName: \"kubernetes.io/projected/ad178d55-a5d5-40b5-9364-0a9af0718f46-kube-api-access-d4pqs\") pod \"mariadb-operator-controller-manager-5777b4f897-ggwkb\" (UID: \"ad178d55-a5d5-40b5-9364-0a9af0718f46\") " pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-ggwkb" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.851534 4715 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-xmq4r"] Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.853006 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-xmq4r" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.856654 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-ghbqf" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.865039 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcmz4\" (UniqueName: \"kubernetes.io/projected/9cae911a-5b69-4cf4-aa26-4adb4457eec4-kube-api-access-jcmz4\") pod \"manila-operator-controller-manager-59578bc799-6zczp\" (UID: \"9cae911a-5b69-4cf4-aa26-4adb4457eec4\") " pod="openstack-operators/manila-operator-controller-manager-59578bc799-6zczp" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.866770 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-jps2w" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.908056 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-qjwrk" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.920119 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dl2tll"] Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.921532 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dl2tll" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.940055 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4pqs\" (UniqueName: \"kubernetes.io/projected/ad178d55-a5d5-40b5-9364-0a9af0718f46-kube-api-access-d4pqs\") pod \"mariadb-operator-controller-manager-5777b4f897-ggwkb\" (UID: \"ad178d55-a5d5-40b5-9364-0a9af0718f46\") " pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-ggwkb" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.940096 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dqfl\" (UniqueName: \"kubernetes.io/projected/6d11d372-6981-432f-a2b0-364cb9b24f63-kube-api-access-8dqfl\") pod \"nova-operator-controller-manager-57bb74c7bf-6xd27\" (UID: \"6d11d372-6981-432f-a2b0-364cb9b24f63\") " pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-6xd27" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.940126 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc667\" (UniqueName: \"kubernetes.io/projected/ecf88dec-957f-4221-8ded-d779392c2793-kube-api-access-dc667\") pod \"neutron-operator-controller-manager-797d478b46-kqhg2\" (UID: \"ecf88dec-957f-4221-8ded-d779392c2793\") " pod="openstack-operators/neutron-operator-controller-manager-797d478b46-kqhg2" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.940155 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz52q\" (UniqueName: \"kubernetes.io/projected/619ad411-d5d7-431b-9bb6-6cf084134aaf-kube-api-access-nz52q\") pod \"octavia-operator-controller-manager-6d7c7ddf95-xmq4r\" (UID: \"619ad411-d5d7-431b-9bb6-6cf084134aaf\") " pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-xmq4r" Oct 09 
08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.942146 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.942301 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-h6ql2" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.979829 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc667\" (UniqueName: \"kubernetes.io/projected/ecf88dec-957f-4221-8ded-d779392c2793-kube-api-access-dc667\") pod \"neutron-operator-controller-manager-797d478b46-kqhg2\" (UID: \"ecf88dec-957f-4221-8ded-d779392c2793\") " pod="openstack-operators/neutron-operator-controller-manager-797d478b46-kqhg2" Oct 09 08:00:26 crc kubenswrapper[4715]: I1009 08:00:26.980471 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-xmq4r"] Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.004558 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-pxmc4" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.009538 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4pqs\" (UniqueName: \"kubernetes.io/projected/ad178d55-a5d5-40b5-9364-0a9af0718f46-kube-api-access-d4pqs\") pod \"mariadb-operator-controller-manager-5777b4f897-ggwkb\" (UID: \"ad178d55-a5d5-40b5-9364-0a9af0718f46\") " pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-ggwkb" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.018065 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dl2tll"] Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.023210 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-7zmwl" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.023847 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6f96f8c84-5plxn"] Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.025234 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-5plxn" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.037402 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6f96f8c84-5plxn"] Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.039362 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-6x99w" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.045819 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nz52q\" (UniqueName: \"kubernetes.io/projected/619ad411-d5d7-431b-9bb6-6cf084134aaf-kube-api-access-nz52q\") pod \"octavia-operator-controller-manager-6d7c7ddf95-xmq4r\" (UID: \"619ad411-d5d7-431b-9bb6-6cf084134aaf\") " pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-xmq4r" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.045865 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9335457-1cad-453a-9539-d73dc2c77021-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dl2tll\" (UID: \"e9335457-1cad-453a-9539-d73dc2c77021\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dl2tll" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.045922 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxqv6\" (UniqueName: \"kubernetes.io/projected/e9335457-1cad-453a-9539-d73dc2c77021-kube-api-access-sxqv6\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dl2tll\" (UID: \"e9335457-1cad-453a-9539-d73dc2c77021\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dl2tll" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.046021 4715 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dqfl\" (UniqueName: \"kubernetes.io/projected/6d11d372-6981-432f-a2b0-364cb9b24f63-kube-api-access-8dqfl\") pod \"nova-operator-controller-manager-57bb74c7bf-6xd27\" (UID: \"6d11d372-6981-432f-a2b0-364cb9b24f63\") " pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-6xd27" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.058478 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-664664cb68-xwt7q"] Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.059671 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-664664cb68-xwt7q" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.068355 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-rcm5h"] Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.078125 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-mf8ts" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.078772 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6648b66598-cvsqm"] Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.080722 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-rcm5h" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.081094 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6648b66598-cvsqm" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.089516 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dqfl\" (UniqueName: \"kubernetes.io/projected/6d11d372-6981-432f-a2b0-364cb9b24f63-kube-api-access-8dqfl\") pod \"nova-operator-controller-manager-57bb74c7bf-6xd27\" (UID: \"6d11d372-6981-432f-a2b0-364cb9b24f63\") " pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-6xd27" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.089952 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-fq5mq" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.090793 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-2dw2j" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.100937 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-664664cb68-xwt7q"] Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.113113 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6648b66598-cvsqm"] Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.113180 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-rcm5h"] Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.119361 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz52q\" (UniqueName: \"kubernetes.io/projected/619ad411-d5d7-431b-9bb6-6cf084134aaf-kube-api-access-nz52q\") pod \"octavia-operator-controller-manager-6d7c7ddf95-xmq4r\" (UID: \"619ad411-d5d7-431b-9bb6-6cf084134aaf\") " 
pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-xmq4r" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.136288 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-ffcdd6c94-rt6nt"] Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.138249 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-rt6nt" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.141039 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-f8w5n" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.141869 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-59578bc799-6zczp" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.149860 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh2qt\" (UniqueName: \"kubernetes.io/projected/6d1ea812-36f3-4478-9b78-aed194390313-kube-api-access-sh2qt\") pod \"ovn-operator-controller-manager-6f96f8c84-5plxn\" (UID: \"6d1ea812-36f3-4478-9b78-aed194390313\") " pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-5plxn" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.149917 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slkh4\" (UniqueName: \"kubernetes.io/projected/9b402da9-cbb2-473b-beee-7064e06acb73-kube-api-access-slkh4\") pod \"telemetry-operator-controller-manager-6648b66598-cvsqm\" (UID: \"9b402da9-cbb2-473b-beee-7064e06acb73\") " pod="openstack-operators/telemetry-operator-controller-manager-6648b66598-cvsqm" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.149974 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9335457-1cad-453a-9539-d73dc2c77021-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dl2tll\" (UID: \"e9335457-1cad-453a-9539-d73dc2c77021\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dl2tll" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.150022 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxqv6\" (UniqueName: \"kubernetes.io/projected/e9335457-1cad-453a-9539-d73dc2c77021-kube-api-access-sxqv6\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dl2tll\" (UID: \"e9335457-1cad-453a-9539-d73dc2c77021\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dl2tll" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.150055 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd2w5\" (UniqueName: \"kubernetes.io/projected/fc37f3a9-94a5-4957-939a-a0b0a7a567bb-kube-api-access-nd2w5\") pod \"placement-operator-controller-manager-664664cb68-xwt7q\" (UID: \"fc37f3a9-94a5-4957-939a-a0b0a7a567bb\") " pod="openstack-operators/placement-operator-controller-manager-664664cb68-xwt7q" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.150095 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jfz4\" (UniqueName: \"kubernetes.io/projected/4b8010cb-d8af-4b7c-9530-fe143bbf1ddb-kube-api-access-5jfz4\") pod \"swift-operator-controller-manager-5f4d5dfdc6-rcm5h\" (UID: \"4b8010cb-d8af-4b7c-9530-fe143bbf1ddb\") " pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-rcm5h" Oct 09 08:00:27 crc kubenswrapper[4715]: E1009 08:00:27.150248 4715 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 09 
08:00:27 crc kubenswrapper[4715]: E1009 08:00:27.150299 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9335457-1cad-453a-9539-d73dc2c77021-cert podName:e9335457-1cad-453a-9539-d73dc2c77021 nodeName:}" failed. No retries permitted until 2025-10-09 08:00:27.650279806 +0000 UTC m=+858.343083814 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e9335457-1cad-453a-9539-d73dc2c77021-cert") pod "openstack-baremetal-operator-controller-manager-6cc7fb757dl2tll" (UID: "e9335457-1cad-453a-9539-d73dc2c77021") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.158816 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-ffcdd6c94-rt6nt"] Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.175845 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxqv6\" (UniqueName: \"kubernetes.io/projected/e9335457-1cad-453a-9539-d73dc2c77021-kube-api-access-sxqv6\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dl2tll\" (UID: \"e9335457-1cad-453a-9539-d73dc2c77021\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dl2tll" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.184443 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-646675d848-bgwc8"] Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.185322 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-kqhg2" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.185718 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-646675d848-bgwc8" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.199379 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-646675d848-bgwc8"] Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.200298 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-szzpk" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.213269 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-ggwkb" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.253008 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nd2w5\" (UniqueName: \"kubernetes.io/projected/fc37f3a9-94a5-4957-939a-a0b0a7a567bb-kube-api-access-nd2w5\") pod \"placement-operator-controller-manager-664664cb68-xwt7q\" (UID: \"fc37f3a9-94a5-4957-939a-a0b0a7a567bb\") " pod="openstack-operators/placement-operator-controller-manager-664664cb68-xwt7q" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.253063 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmg56\" (UniqueName: \"kubernetes.io/projected/ba259ec1-9157-4cd9-8c21-11915efe5dde-kube-api-access-wmg56\") pod \"test-operator-controller-manager-ffcdd6c94-rt6nt\" (UID: \"ba259ec1-9157-4cd9-8c21-11915efe5dde\") " pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-rt6nt" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.253114 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mlnp\" (UniqueName: \"kubernetes.io/projected/666e7073-bf77-46f3-99da-5ad2013835a9-kube-api-access-9mlnp\") pod 
\"watcher-operator-controller-manager-646675d848-bgwc8\" (UID: \"666e7073-bf77-46f3-99da-5ad2013835a9\") " pod="openstack-operators/watcher-operator-controller-manager-646675d848-bgwc8" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.253173 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jfz4\" (UniqueName: \"kubernetes.io/projected/4b8010cb-d8af-4b7c-9530-fe143bbf1ddb-kube-api-access-5jfz4\") pod \"swift-operator-controller-manager-5f4d5dfdc6-rcm5h\" (UID: \"4b8010cb-d8af-4b7c-9530-fe143bbf1ddb\") " pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-rcm5h" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.253234 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b39f3e52-f97a-4bf4-934d-88267bddae91-cert\") pod \"infra-operator-controller-manager-585fc5b659-ckx8h\" (UID: \"b39f3e52-f97a-4bf4-934d-88267bddae91\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-ckx8h" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.253257 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh2qt\" (UniqueName: \"kubernetes.io/projected/6d1ea812-36f3-4478-9b78-aed194390313-kube-api-access-sh2qt\") pod \"ovn-operator-controller-manager-6f96f8c84-5plxn\" (UID: \"6d1ea812-36f3-4478-9b78-aed194390313\") " pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-5plxn" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.253281 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slkh4\" (UniqueName: \"kubernetes.io/projected/9b402da9-cbb2-473b-beee-7064e06acb73-kube-api-access-slkh4\") pod \"telemetry-operator-controller-manager-6648b66598-cvsqm\" (UID: \"9b402da9-cbb2-473b-beee-7064e06acb73\") " pod="openstack-operators/telemetry-operator-controller-manager-6648b66598-cvsqm" Oct 09 08:00:27 crc 
kubenswrapper[4715]: I1009 08:00:27.254031 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-6xd27" Oct 09 08:00:27 crc kubenswrapper[4715]: E1009 08:00:27.254943 4715 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 09 08:00:27 crc kubenswrapper[4715]: E1009 08:00:27.254994 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b39f3e52-f97a-4bf4-934d-88267bddae91-cert podName:b39f3e52-f97a-4bf4-934d-88267bddae91 nodeName:}" failed. No retries permitted until 2025-10-09 08:00:28.254979649 +0000 UTC m=+858.947783657 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b39f3e52-f97a-4bf4-934d-88267bddae91-cert") pod "infra-operator-controller-manager-585fc5b659-ckx8h" (UID: "b39f3e52-f97a-4bf4-934d-88267bddae91") : secret "infra-operator-webhook-server-cert" not found Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.283819 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slkh4\" (UniqueName: \"kubernetes.io/projected/9b402da9-cbb2-473b-beee-7064e06acb73-kube-api-access-slkh4\") pod \"telemetry-operator-controller-manager-6648b66598-cvsqm\" (UID: \"9b402da9-cbb2-473b-beee-7064e06acb73\") " pod="openstack-operators/telemetry-operator-controller-manager-6648b66598-cvsqm" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.288897 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jfz4\" (UniqueName: \"kubernetes.io/projected/4b8010cb-d8af-4b7c-9530-fe143bbf1ddb-kube-api-access-5jfz4\") pod \"swift-operator-controller-manager-5f4d5dfdc6-rcm5h\" (UID: \"4b8010cb-d8af-4b7c-9530-fe143bbf1ddb\") " pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-rcm5h" Oct 09 08:00:27 crc 
kubenswrapper[4715]: I1009 08:00:27.288952 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5b6844c9b7-z7vpp"] Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.300265 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd2w5\" (UniqueName: \"kubernetes.io/projected/fc37f3a9-94a5-4957-939a-a0b0a7a567bb-kube-api-access-nd2w5\") pod \"placement-operator-controller-manager-664664cb68-xwt7q\" (UID: \"fc37f3a9-94a5-4957-939a-a0b0a7a567bb\") " pod="openstack-operators/placement-operator-controller-manager-664664cb68-xwt7q" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.305957 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-xmq4r" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.306611 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5b6844c9b7-z7vpp" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.321189 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-v9kb2" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.322654 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.341020 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh2qt\" (UniqueName: \"kubernetes.io/projected/6d1ea812-36f3-4478-9b78-aed194390313-kube-api-access-sh2qt\") pod \"ovn-operator-controller-manager-6f96f8c84-5plxn\" (UID: \"6d1ea812-36f3-4478-9b78-aed194390313\") " pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-5plxn" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.358481 4715 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd59cd6f-8b57-4377-80ae-a1873494f103-cert\") pod \"openstack-operator-controller-manager-5b6844c9b7-z7vpp\" (UID: \"fd59cd6f-8b57-4377-80ae-a1873494f103\") " pod="openstack-operators/openstack-operator-controller-manager-5b6844c9b7-z7vpp" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.358532 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkdnw\" (UniqueName: \"kubernetes.io/projected/fd59cd6f-8b57-4377-80ae-a1873494f103-kube-api-access-fkdnw\") pod \"openstack-operator-controller-manager-5b6844c9b7-z7vpp\" (UID: \"fd59cd6f-8b57-4377-80ae-a1873494f103\") " pod="openstack-operators/openstack-operator-controller-manager-5b6844c9b7-z7vpp" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.358568 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmg56\" (UniqueName: \"kubernetes.io/projected/ba259ec1-9157-4cd9-8c21-11915efe5dde-kube-api-access-wmg56\") pod \"test-operator-controller-manager-ffcdd6c94-rt6nt\" (UID: \"ba259ec1-9157-4cd9-8c21-11915efe5dde\") " pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-rt6nt" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.358612 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mlnp\" (UniqueName: \"kubernetes.io/projected/666e7073-bf77-46f3-99da-5ad2013835a9-kube-api-access-9mlnp\") pod \"watcher-operator-controller-manager-646675d848-bgwc8\" (UID: \"666e7073-bf77-46f3-99da-5ad2013835a9\") " pod="openstack-operators/watcher-operator-controller-manager-646675d848-bgwc8" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.366095 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-5plxn" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.402730 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-664664cb68-xwt7q" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.404720 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmg56\" (UniqueName: \"kubernetes.io/projected/ba259ec1-9157-4cd9-8c21-11915efe5dde-kube-api-access-wmg56\") pod \"test-operator-controller-manager-ffcdd6c94-rt6nt\" (UID: \"ba259ec1-9157-4cd9-8c21-11915efe5dde\") " pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-rt6nt" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.408042 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mlnp\" (UniqueName: \"kubernetes.io/projected/666e7073-bf77-46f3-99da-5ad2013835a9-kube-api-access-9mlnp\") pod \"watcher-operator-controller-manager-646675d848-bgwc8\" (UID: \"666e7073-bf77-46f3-99da-5ad2013835a9\") " pod="openstack-operators/watcher-operator-controller-manager-646675d848-bgwc8" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.431776 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-64f84fcdbb-rsthg"] Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.432576 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-rcm5h" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.434126 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6648b66598-cvsqm" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.446527 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5b6844c9b7-z7vpp"] Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.452571 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-vqb9s"] Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.455899 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-vqb9s" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.458454 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-7nh9s" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.459490 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd59cd6f-8b57-4377-80ae-a1873494f103-cert\") pod \"openstack-operator-controller-manager-5b6844c9b7-z7vpp\" (UID: \"fd59cd6f-8b57-4377-80ae-a1873494f103\") " pod="openstack-operators/openstack-operator-controller-manager-5b6844c9b7-z7vpp" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.459599 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkdnw\" (UniqueName: \"kubernetes.io/projected/fd59cd6f-8b57-4377-80ae-a1873494f103-kube-api-access-fkdnw\") pod \"openstack-operator-controller-manager-5b6844c9b7-z7vpp\" (UID: \"fd59cd6f-8b57-4377-80ae-a1873494f103\") " pod="openstack-operators/openstack-operator-controller-manager-5b6844c9b7-z7vpp" Oct 09 08:00:27 crc kubenswrapper[4715]: E1009 08:00:27.460036 4715 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not 
found Oct 09 08:00:27 crc kubenswrapper[4715]: E1009 08:00:27.460176 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd59cd6f-8b57-4377-80ae-a1873494f103-cert podName:fd59cd6f-8b57-4377-80ae-a1873494f103 nodeName:}" failed. No retries permitted until 2025-10-09 08:00:27.960159483 +0000 UTC m=+858.652963491 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fd59cd6f-8b57-4377-80ae-a1873494f103-cert") pod "openstack-operator-controller-manager-5b6844c9b7-z7vpp" (UID: "fd59cd6f-8b57-4377-80ae-a1873494f103") : secret "webhook-server-cert" not found Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.463742 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-vqb9s"] Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.481505 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-rt6nt" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.491456 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkdnw\" (UniqueName: \"kubernetes.io/projected/fd59cd6f-8b57-4377-80ae-a1873494f103-kube-api-access-fkdnw\") pod \"openstack-operator-controller-manager-5b6844c9b7-z7vpp\" (UID: \"fd59cd6f-8b57-4377-80ae-a1873494f103\") " pod="openstack-operators/openstack-operator-controller-manager-5b6844c9b7-z7vpp" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.531857 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-646675d848-bgwc8" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.550563 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-59cdc64769-gqdw4"] Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.562117 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84l75\" (UniqueName: \"kubernetes.io/projected/48619024-da5f-4b28-8724-3707961de8ce-kube-api-access-84l75\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-vqb9s\" (UID: \"48619024-da5f-4b28-8724-3707961de8ce\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-vqb9s" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.583641 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-rsthg" event={"ID":"e4603d13-cf9d-4d8d-82db-3b182aa42e74","Type":"ContainerStarted","Data":"abeebfaa8c228437907a903b18abc49b8ef4feeddf0c725f564f7c2b82478389"} Oct 09 08:00:27 crc kubenswrapper[4715]: W1009 08:00:27.623707 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68110204_494d_4a10_b25d_0996c9dd1c6f.slice/crio-14d08517fd3b63b3c919517bb7a57c0cba606c4675ba2d47e3d897cb26065474 WatchSource:0}: Error finding container 14d08517fd3b63b3c919517bb7a57c0cba606c4675ba2d47e3d897cb26065474: Status 404 returned error can't find the container with id 14d08517fd3b63b3c919517bb7a57c0cba606c4675ba2d47e3d897cb26065474 Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.666828 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9335457-1cad-453a-9539-d73dc2c77021-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dl2tll\" (UID: 
\"e9335457-1cad-453a-9539-d73dc2c77021\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dl2tll" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.666897 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84l75\" (UniqueName: \"kubernetes.io/projected/48619024-da5f-4b28-8724-3707961de8ce-kube-api-access-84l75\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-vqb9s\" (UID: \"48619024-da5f-4b28-8724-3707961de8ce\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-vqb9s" Oct 09 08:00:27 crc kubenswrapper[4715]: E1009 08:00:27.667046 4715 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 09 08:00:27 crc kubenswrapper[4715]: E1009 08:00:27.667146 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9335457-1cad-453a-9539-d73dc2c77021-cert podName:e9335457-1cad-453a-9539-d73dc2c77021 nodeName:}" failed. No retries permitted until 2025-10-09 08:00:28.667119489 +0000 UTC m=+859.359923497 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e9335457-1cad-453a-9539-d73dc2c77021-cert") pod "openstack-baremetal-operator-controller-manager-6cc7fb757dl2tll" (UID: "e9335457-1cad-453a-9539-d73dc2c77021") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.686154 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84l75\" (UniqueName: \"kubernetes.io/projected/48619024-da5f-4b28-8724-3707961de8ce-kube-api-access-84l75\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-vqb9s\" (UID: \"48619024-da5f-4b28-8724-3707961de8ce\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-vqb9s" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.860818 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-vqb9s" Oct 09 08:00:27 crc kubenswrapper[4715]: I1009 08:00:27.978302 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd59cd6f-8b57-4377-80ae-a1873494f103-cert\") pod \"openstack-operator-controller-manager-5b6844c9b7-z7vpp\" (UID: \"fd59cd6f-8b57-4377-80ae-a1873494f103\") " pod="openstack-operators/openstack-operator-controller-manager-5b6844c9b7-z7vpp" Oct 09 08:00:27 crc kubenswrapper[4715]: E1009 08:00:27.978505 4715 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 09 08:00:27 crc kubenswrapper[4715]: E1009 08:00:27.978574 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd59cd6f-8b57-4377-80ae-a1873494f103-cert podName:fd59cd6f-8b57-4377-80ae-a1873494f103 nodeName:}" failed. No retries permitted until 2025-10-09 08:00:28.978555822 +0000 UTC m=+859.671359830 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fd59cd6f-8b57-4377-80ae-a1873494f103-cert") pod "openstack-operator-controller-manager-5b6844c9b7-z7vpp" (UID: "fd59cd6f-8b57-4377-80ae-a1873494f103") : secret "webhook-server-cert" not found Oct 09 08:00:28 crc kubenswrapper[4715]: I1009 08:00:28.119778 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7bb46cd7d-cfkg2"] Oct 09 08:00:28 crc kubenswrapper[4715]: W1009 08:00:28.154592 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9657b932_fe63_4417_8463_8af21e9c9790.slice/crio-8b3391af7559f5797b232899f093ca21e98a962f4693f3af05602295ad3a38c2 WatchSource:0}: Error finding container 8b3391af7559f5797b232899f093ca21e98a962f4693f3af05602295ad3a38c2: Status 404 returned error can't find the container with id 8b3391af7559f5797b232899f093ca21e98a962f4693f3af05602295ad3a38c2 Oct 09 08:00:28 crc kubenswrapper[4715]: W1009 08:00:28.157831 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode11fc796_233e_4c17_b953_1c6211f0c679.slice/crio-0118c7f7d643d227af2e00484d3e3ea61fd4a25e884564e21f35fed81df9da58 WatchSource:0}: Error finding container 0118c7f7d643d227af2e00484d3e3ea61fd4a25e884564e21f35fed81df9da58: Status 404 returned error can't find the container with id 0118c7f7d643d227af2e00484d3e3ea61fd4a25e884564e21f35fed81df9da58 Oct 09 08:00:28 crc kubenswrapper[4715]: I1009 08:00:28.161316 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-ddb98f99b-7zmwl"] Oct 09 08:00:28 crc kubenswrapper[4715]: I1009 08:00:28.161352 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d74794d9b-jps2w"] Oct 09 08:00:28 crc kubenswrapper[4715]: W1009 
08:00:28.178054 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2730bf5c_42b9_4739_a2bc_6250bfcb997a.slice/crio-7d046555ae071fa93f85cf0f903135ee3c9bad8fc62841a7870123b8efcc6bba WatchSource:0}: Error finding container 7d046555ae071fa93f85cf0f903135ee3c9bad8fc62841a7870123b8efcc6bba: Status 404 returned error can't find the container with id 7d046555ae071fa93f85cf0f903135ee3c9bad8fc62841a7870123b8efcc6bba Oct 09 08:00:28 crc kubenswrapper[4715]: I1009 08:00:28.262571 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-74cb5cbc49-pxmc4"] Oct 09 08:00:28 crc kubenswrapper[4715]: I1009 08:00:28.269869 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-6d9967f8dd-qjwrk"] Oct 09 08:00:28 crc kubenswrapper[4715]: W1009 08:00:28.273542 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc990a4aa_4a8e_499b_bf58_99c469af523e.slice/crio-a720f44816810c20775c48a45ae7f1b791720dc255d7fa2a9eb516f4f81d8cb1 WatchSource:0}: Error finding container a720f44816810c20775c48a45ae7f1b791720dc255d7fa2a9eb516f4f81d8cb1: Status 404 returned error can't find the container with id a720f44816810c20775c48a45ae7f1b791720dc255d7fa2a9eb516f4f81d8cb1 Oct 09 08:00:28 crc kubenswrapper[4715]: I1009 08:00:28.286318 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b39f3e52-f97a-4bf4-934d-88267bddae91-cert\") pod \"infra-operator-controller-manager-585fc5b659-ckx8h\" (UID: \"b39f3e52-f97a-4bf4-934d-88267bddae91\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-ckx8h" Oct 09 08:00:28 crc kubenswrapper[4715]: I1009 08:00:28.293671 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/designate-operator-controller-manager-687df44cdb-v8zt5"] Oct 09 08:00:28 crc kubenswrapper[4715]: I1009 08:00:28.305322 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b39f3e52-f97a-4bf4-934d-88267bddae91-cert\") pod \"infra-operator-controller-manager-585fc5b659-ckx8h\" (UID: \"b39f3e52-f97a-4bf4-934d-88267bddae91\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-ckx8h" Oct 09 08:00:28 crc kubenswrapper[4715]: I1009 08:00:28.430002 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-ckx8h" Oct 09 08:00:28 crc kubenswrapper[4715]: I1009 08:00:28.450813 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-59578bc799-6zczp"] Oct 09 08:00:28 crc kubenswrapper[4715]: I1009 08:00:28.456766 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-797d478b46-kqhg2"] Oct 09 08:00:28 crc kubenswrapper[4715]: I1009 08:00:28.475716 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5777b4f897-ggwkb"] Oct 09 08:00:28 crc kubenswrapper[4715]: W1009 08:00:28.483760 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad178d55_a5d5_40b5_9364_0a9af0718f46.slice/crio-2aea627dc8ff610b9b44db0808247ea766428870b3f54cecbf050590f22ad805 WatchSource:0}: Error finding container 2aea627dc8ff610b9b44db0808247ea766428870b3f54cecbf050590f22ad805: Status 404 returned error can't find the container with id 2aea627dc8ff610b9b44db0808247ea766428870b3f54cecbf050590f22ad805 Oct 09 08:00:28 crc kubenswrapper[4715]: I1009 08:00:28.595527 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-jps2w" event={"ID":"2730bf5c-42b9-4739-a2bc-6250bfcb997a","Type":"ContainerStarted","Data":"7d046555ae071fa93f85cf0f903135ee3c9bad8fc62841a7870123b8efcc6bba"} Oct 09 08:00:28 crc kubenswrapper[4715]: I1009 08:00:28.596655 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-kqhg2" event={"ID":"ecf88dec-957f-4221-8ded-d779392c2793","Type":"ContainerStarted","Data":"1f05e688136d9f17cecc7243e359730a016f2ca9f0cebcb057f8dd59202dd041"} Oct 09 08:00:28 crc kubenswrapper[4715]: I1009 08:00:28.597765 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-pxmc4" event={"ID":"c990a4aa-4a8e-499b-bf58-99c469af523e","Type":"ContainerStarted","Data":"a720f44816810c20775c48a45ae7f1b791720dc255d7fa2a9eb516f4f81d8cb1"} Oct 09 08:00:28 crc kubenswrapper[4715]: I1009 08:00:28.598563 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-7zmwl" event={"ID":"9657b932-fe63-4417-8463-8af21e9c9790","Type":"ContainerStarted","Data":"8b3391af7559f5797b232899f093ca21e98a962f4693f3af05602295ad3a38c2"} Oct 09 08:00:28 crc kubenswrapper[4715]: I1009 08:00:28.599272 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-gqdw4" event={"ID":"68110204-494d-4a10-b25d-0996c9dd1c6f","Type":"ContainerStarted","Data":"14d08517fd3b63b3c919517bb7a57c0cba606c4675ba2d47e3d897cb26065474"} Oct 09 08:00:28 crc kubenswrapper[4715]: I1009 08:00:28.600120 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-qjwrk" event={"ID":"32b6325f-e041-492d-a113-638dcef15310","Type":"ContainerStarted","Data":"ce8cbd2c05929871185455e44dc9e4254490b6c4c929c19f7ce68d376050fead"} Oct 09 08:00:28 crc 
kubenswrapper[4715]: I1009 08:00:28.603558 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-v8zt5" event={"ID":"675a8b37-dcfc-414e-9218-7741ce9ec2d5","Type":"ContainerStarted","Data":"464b801bfd598d70fa5e9d11a5b3fedb7120f1790fbb766ddc4ca9a212fe5298"} Oct 09 08:00:28 crc kubenswrapper[4715]: I1009 08:00:28.604516 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-ggwkb" event={"ID":"ad178d55-a5d5-40b5-9364-0a9af0718f46","Type":"ContainerStarted","Data":"2aea627dc8ff610b9b44db0808247ea766428870b3f54cecbf050590f22ad805"} Oct 09 08:00:28 crc kubenswrapper[4715]: I1009 08:00:28.605731 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-59578bc799-6zczp" event={"ID":"9cae911a-5b69-4cf4-aa26-4adb4457eec4","Type":"ContainerStarted","Data":"655f847d9093ed4e8a5c84072f93e93f9be4a1d6273ceacfcb6d5e18ebc3b197"} Oct 09 08:00:28 crc kubenswrapper[4715]: I1009 08:00:28.607028 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-cfkg2" event={"ID":"e11fc796-233e-4c17-b953-1c6211f0c679","Type":"ContainerStarted","Data":"0118c7f7d643d227af2e00484d3e3ea61fd4a25e884564e21f35fed81df9da58"} Oct 09 08:00:28 crc kubenswrapper[4715]: I1009 08:00:28.693562 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9335457-1cad-453a-9539-d73dc2c77021-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dl2tll\" (UID: \"e9335457-1cad-453a-9539-d73dc2c77021\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dl2tll" Oct 09 08:00:28 crc kubenswrapper[4715]: I1009 08:00:28.699322 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/e9335457-1cad-453a-9539-d73dc2c77021-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757dl2tll\" (UID: \"e9335457-1cad-453a-9539-d73dc2c77021\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dl2tll" Oct 09 08:00:28 crc kubenswrapper[4715]: I1009 08:00:28.837933 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dl2tll" Oct 09 08:00:28 crc kubenswrapper[4715]: I1009 08:00:28.851344 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-646675d848-bgwc8"] Oct 09 08:00:28 crc kubenswrapper[4715]: I1009 08:00:28.857340 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-57bb74c7bf-6xd27"] Oct 09 08:00:28 crc kubenswrapper[4715]: I1009 08:00:28.863580 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6f96f8c84-5plxn"] Oct 09 08:00:28 crc kubenswrapper[4715]: W1009 08:00:28.867120 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d11d372_6981_432f_a2b0_364cb9b24f63.slice/crio-c0285d887c4ff428d95628d6763e5971628f24311db8252366867a4eb9caf84d WatchSource:0}: Error finding container c0285d887c4ff428d95628d6763e5971628f24311db8252366867a4eb9caf84d: Status 404 returned error can't find the container with id c0285d887c4ff428d95628d6763e5971628f24311db8252366867a4eb9caf84d Oct 09 08:00:28 crc kubenswrapper[4715]: I1009 08:00:28.869143 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6648b66598-cvsqm"] Oct 09 08:00:28 crc kubenswrapper[4715]: W1009 08:00:28.872215 4715 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod666e7073_bf77_46f3_99da_5ad2013835a9.slice/crio-49bf1c127f98ef076ca551254bcaf69e19d51f072e96a55c10f09aeb795eff86 WatchSource:0}: Error finding container 49bf1c127f98ef076ca551254bcaf69e19d51f072e96a55c10f09aeb795eff86: Status 404 returned error can't find the container with id 49bf1c127f98ef076ca551254bcaf69e19d51f072e96a55c10f09aeb795eff86 Oct 09 08:00:28 crc kubenswrapper[4715]: W1009 08:00:28.872745 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d1ea812_36f3_4478_9b78_aed194390313.slice/crio-541c2893e1337c002b47d69cb7c85e7d60d4f01f9e9d03cfe527d1e55fb131a8 WatchSource:0}: Error finding container 541c2893e1337c002b47d69cb7c85e7d60d4f01f9e9d03cfe527d1e55fb131a8: Status 404 returned error can't find the container with id 541c2893e1337c002b47d69cb7c85e7d60d4f01f9e9d03cfe527d1e55fb131a8 Oct 09 08:00:28 crc kubenswrapper[4715]: I1009 08:00:28.875316 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-ffcdd6c94-rt6nt"] Oct 09 08:00:28 crc kubenswrapper[4715]: I1009 08:00:28.880412 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-664664cb68-xwt7q"] Oct 09 08:00:28 crc kubenswrapper[4715]: I1009 08:00:28.885519 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-rcm5h"] Oct 09 08:00:28 crc kubenswrapper[4715]: I1009 08:00:28.890448 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-xmq4r"] Oct 09 08:00:28 crc kubenswrapper[4715]: E1009 08:00:28.911999 4715 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d33c1f507e1f5b9a4bf226ad98917e92101ac66b36e19d35cbe04ae7014f6bff,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nd2w5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-664664cb68-xwt7q_openstack-operators(fc37f3a9-94a5-4957-939a-a0b0a7a567bb): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 09 08:00:28 crc kubenswrapper[4715]: E1009 08:00:28.916456 4715 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:551b59e107c9812f7ad7aa06577376b0dcb58ff9498a41d5d5273e60e20ba7e4,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sh2qt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-6f96f8c84-5plxn_openstack-operators(6d1ea812-36f3-4478-9b78-aed194390313): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 09 08:00:28 crc kubenswrapper[4715]: W1009 08:00:28.937110 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b8010cb_d8af_4b7c_9530_fe143bbf1ddb.slice/crio-f0cb6e3c28ae5d6fc457a151963ffba436a2359a54abe8cd0fa0edbf2206ea07 WatchSource:0}: Error finding container 
f0cb6e3c28ae5d6fc457a151963ffba436a2359a54abe8cd0fa0edbf2206ea07: Status 404 returned error can't find the container with id f0cb6e3c28ae5d6fc457a151963ffba436a2359a54abe8cd0fa0edbf2206ea07 Oct 09 08:00:28 crc kubenswrapper[4715]: E1009 08:00:28.941552 4715 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:4b4a17fe08ce00e375afaaec6a28835f5c1784f03d11c4558376ac04130f3a9e,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5jfz4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f4d5dfdc6-rcm5h_openstack-operators(4b8010cb-d8af-4b7c-9530-fe143bbf1ddb): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 09 08:00:28 crc kubenswrapper[4715]: E1009 08:00:28.943406 4715 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:09deecf840d38ff6af3c924729cf0a9444bc985848bfbe7c918019b88a6bc4d7,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nz52q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-6d7c7ddf95-xmq4r_openstack-operators(619ad411-d5d7-431b-9bb6-6cf084134aaf): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 09 08:00:28 crc kubenswrapper[4715]: I1009 08:00:28.968544 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-vqb9s"] Oct 09 08:00:28 crc kubenswrapper[4715]: I1009 08:00:28.973803 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/infra-operator-controller-manager-585fc5b659-ckx8h"] Oct 09 08:00:28 crc kubenswrapper[4715]: W1009 08:00:28.982633 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb39f3e52_f97a_4bf4_934d_88267bddae91.slice/crio-e33454e5bb3608d543862a7fb5e4e3fe803cfc604c0244f3b47e67ce3d72eb0b WatchSource:0}: Error finding container e33454e5bb3608d543862a7fb5e4e3fe803cfc604c0244f3b47e67ce3d72eb0b: Status 404 returned error can't find the container with id e33454e5bb3608d543862a7fb5e4e3fe803cfc604c0244f3b47e67ce3d72eb0b Oct 09 08:00:28 crc kubenswrapper[4715]: W1009 08:00:28.988622 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48619024_da5f_4b28_8724_3707961de8ce.slice/crio-9f6629a51261b25d8ed9b5cf91264e729be336474bafe3b4f43bf8d490744757 WatchSource:0}: Error finding container 9f6629a51261b25d8ed9b5cf91264e729be336474bafe3b4f43bf8d490744757: Status 404 returned error can't find the container with id 9f6629a51261b25d8ed9b5cf91264e729be336474bafe3b4f43bf8d490744757 Oct 09 08:00:28 crc kubenswrapper[4715]: I1009 08:00:28.998513 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd59cd6f-8b57-4377-80ae-a1873494f103-cert\") pod \"openstack-operator-controller-manager-5b6844c9b7-z7vpp\" (UID: \"fd59cd6f-8b57-4377-80ae-a1873494f103\") " pod="openstack-operators/openstack-operator-controller-manager-5b6844c9b7-z7vpp" Oct 09 08:00:29 crc kubenswrapper[4715]: E1009 08:00:29.007207 4715 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-84l75,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-vqb9s_openstack-operators(48619024-da5f-4b28-8724-3707961de8ce): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 09 
08:00:29 crc kubenswrapper[4715]: I1009 08:00:29.007750 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd59cd6f-8b57-4377-80ae-a1873494f103-cert\") pod \"openstack-operator-controller-manager-5b6844c9b7-z7vpp\" (UID: \"fd59cd6f-8b57-4377-80ae-a1873494f103\") " pod="openstack-operators/openstack-operator-controller-manager-5b6844c9b7-z7vpp" Oct 09 08:00:29 crc kubenswrapper[4715]: E1009 08:00:29.008391 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-vqb9s" podUID="48619024-da5f-4b28-8724-3707961de8ce" Oct 09 08:00:29 crc kubenswrapper[4715]: I1009 08:00:29.205950 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5b6844c9b7-z7vpp" Oct 09 08:00:29 crc kubenswrapper[4715]: I1009 08:00:29.494355 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dl2tll"] Oct 09 08:00:29 crc kubenswrapper[4715]: W1009 08:00:29.500523 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9335457_1cad_453a_9539_d73dc2c77021.slice/crio-81e890ea8c6dd5666413a0cb762078b12dc64ee86d09defa47f53a412ac71ce4 WatchSource:0}: Error finding container 81e890ea8c6dd5666413a0cb762078b12dc64ee86d09defa47f53a412ac71ce4: Status 404 returned error can't find the container with id 81e890ea8c6dd5666413a0cb762078b12dc64ee86d09defa47f53a412ac71ce4 Oct 09 08:00:29 crc kubenswrapper[4715]: I1009 08:00:29.626574 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-664664cb68-xwt7q" 
event={"ID":"fc37f3a9-94a5-4957-939a-a0b0a7a567bb","Type":"ContainerStarted","Data":"b1647ff8bb1a836a296a355fcd737c66fda7841d726a7d5f58fe37d9bc0dbf8c"} Oct 09 08:00:29 crc kubenswrapper[4715]: I1009 08:00:29.631967 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-6xd27" event={"ID":"6d11d372-6981-432f-a2b0-364cb9b24f63","Type":"ContainerStarted","Data":"c0285d887c4ff428d95628d6763e5971628f24311db8252366867a4eb9caf84d"} Oct 09 08:00:29 crc kubenswrapper[4715]: I1009 08:00:29.633141 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-rt6nt" event={"ID":"ba259ec1-9157-4cd9-8c21-11915efe5dde","Type":"ContainerStarted","Data":"ddfa522fcc49cca30afc2bcf4548e03b02bcd0dbae2d97eeb815404771d0b363"} Oct 09 08:00:29 crc kubenswrapper[4715]: I1009 08:00:29.644334 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-xmq4r" event={"ID":"619ad411-d5d7-431b-9bb6-6cf084134aaf","Type":"ContainerStarted","Data":"cef966687c02d749cca75ebdabb0db842b77209b807a9a98fedc366f3e505d91"} Oct 09 08:00:29 crc kubenswrapper[4715]: I1009 08:00:29.650666 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-rcm5h" event={"ID":"4b8010cb-d8af-4b7c-9530-fe143bbf1ddb","Type":"ContainerStarted","Data":"f0cb6e3c28ae5d6fc457a151963ffba436a2359a54abe8cd0fa0edbf2206ea07"} Oct 09 08:00:29 crc kubenswrapper[4715]: I1009 08:00:29.654973 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dl2tll" event={"ID":"e9335457-1cad-453a-9539-d73dc2c77021","Type":"ContainerStarted","Data":"81e890ea8c6dd5666413a0cb762078b12dc64ee86d09defa47f53a412ac71ce4"} Oct 09 08:00:29 crc kubenswrapper[4715]: I1009 08:00:29.665678 4715 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-646675d848-bgwc8" event={"ID":"666e7073-bf77-46f3-99da-5ad2013835a9","Type":"ContainerStarted","Data":"49bf1c127f98ef076ca551254bcaf69e19d51f072e96a55c10f09aeb795eff86"} Oct 09 08:00:29 crc kubenswrapper[4715]: I1009 08:00:29.681671 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6648b66598-cvsqm" event={"ID":"9b402da9-cbb2-473b-beee-7064e06acb73","Type":"ContainerStarted","Data":"5b7bed35bb4b05012f192763c924e812157d7466a88ee9784def1a55a74df85b"} Oct 09 08:00:29 crc kubenswrapper[4715]: I1009 08:00:29.694817 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-vqb9s" event={"ID":"48619024-da5f-4b28-8724-3707961de8ce","Type":"ContainerStarted","Data":"9f6629a51261b25d8ed9b5cf91264e729be336474bafe3b4f43bf8d490744757"} Oct 09 08:00:29 crc kubenswrapper[4715]: I1009 08:00:29.697961 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5b6844c9b7-z7vpp"] Oct 09 08:00:29 crc kubenswrapper[4715]: E1009 08:00:29.699804 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-vqb9s" podUID="48619024-da5f-4b28-8724-3707961de8ce" Oct 09 08:00:29 crc kubenswrapper[4715]: I1009 08:00:29.702236 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-ckx8h" event={"ID":"b39f3e52-f97a-4bf4-934d-88267bddae91","Type":"ContainerStarted","Data":"e33454e5bb3608d543862a7fb5e4e3fe803cfc604c0244f3b47e67ce3d72eb0b"} Oct 09 08:00:29 
crc kubenswrapper[4715]: I1009 08:00:29.708154 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-5plxn" event={"ID":"6d1ea812-36f3-4478-9b78-aed194390313","Type":"ContainerStarted","Data":"541c2893e1337c002b47d69cb7c85e7d60d4f01f9e9d03cfe527d1e55fb131a8"} Oct 09 08:00:29 crc kubenswrapper[4715]: E1009 08:00:29.932146 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-664664cb68-xwt7q" podUID="fc37f3a9-94a5-4957-939a-a0b0a7a567bb" Oct 09 08:00:29 crc kubenswrapper[4715]: E1009 08:00:29.934389 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-xmq4r" podUID="619ad411-d5d7-431b-9bb6-6cf084134aaf" Oct 09 08:00:29 crc kubenswrapper[4715]: E1009 08:00:29.946155 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-5plxn" podUID="6d1ea812-36f3-4478-9b78-aed194390313" Oct 09 08:00:29 crc kubenswrapper[4715]: E1009 08:00:29.955990 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-rcm5h" podUID="4b8010cb-d8af-4b7c-9530-fe143bbf1ddb" Oct 09 08:00:30 crc kubenswrapper[4715]: I1009 08:00:30.798634 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxpbc" event={"ID":"8fefbe0c-de73-415c-a42f-77742a8afab2","Type":"ContainerStarted","Data":"58966667b463a08bf7ffe50acfb91f4787de5090fb3db0649d953600f415cf65"} Oct 09 
08:00:30 crc kubenswrapper[4715]: I1009 08:00:30.833135 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5b6844c9b7-z7vpp" event={"ID":"fd59cd6f-8b57-4377-80ae-a1873494f103","Type":"ContainerStarted","Data":"07196e47f26dd591435ab730e11809ce77e9738a546dce92da97c2204a7a9f31"} Oct 09 08:00:30 crc kubenswrapper[4715]: I1009 08:00:30.833598 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5b6844c9b7-z7vpp" event={"ID":"fd59cd6f-8b57-4377-80ae-a1873494f103","Type":"ContainerStarted","Data":"84e849b196ffeb7ff333b5d15cf9ff1c55e68491a3bcd34ac6a396826a7d20af"} Oct 09 08:00:30 crc kubenswrapper[4715]: I1009 08:00:30.833646 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5b6844c9b7-z7vpp" event={"ID":"fd59cd6f-8b57-4377-80ae-a1873494f103","Type":"ContainerStarted","Data":"84ea2e692e9bfe8f5fc21cf030c84dfe6382b316e0ed341803951d0b3dba7950"} Oct 09 08:00:30 crc kubenswrapper[4715]: I1009 08:00:30.833674 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5b6844c9b7-z7vpp" Oct 09 08:00:30 crc kubenswrapper[4715]: I1009 08:00:30.868735 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-xmq4r" event={"ID":"619ad411-d5d7-431b-9bb6-6cf084134aaf","Type":"ContainerStarted","Data":"cd88bb6f3df8f8c6dbe785863df51d908ed82dec519611a98bbf259e3dd635bc"} Oct 09 08:00:30 crc kubenswrapper[4715]: E1009 08:00:30.878629 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:09deecf840d38ff6af3c924729cf0a9444bc985848bfbe7c918019b88a6bc4d7\\\"\"" 
pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-xmq4r" podUID="619ad411-d5d7-431b-9bb6-6cf084134aaf" Oct 09 08:00:30 crc kubenswrapper[4715]: I1009 08:00:30.902355 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-5plxn" event={"ID":"6d1ea812-36f3-4478-9b78-aed194390313","Type":"ContainerStarted","Data":"31c2cc929921293e3b9a7ba151c16cf09e2f7b99ff7952266ad6d75336f69601"} Oct 09 08:00:30 crc kubenswrapper[4715]: E1009 08:00:30.909320 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:551b59e107c9812f7ad7aa06577376b0dcb58ff9498a41d5d5273e60e20ba7e4\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-5plxn" podUID="6d1ea812-36f3-4478-9b78-aed194390313" Oct 09 08:00:30 crc kubenswrapper[4715]: I1009 08:00:30.924631 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-664664cb68-xwt7q" event={"ID":"fc37f3a9-94a5-4957-939a-a0b0a7a567bb","Type":"ContainerStarted","Data":"ed85ac1104fa0178242a5980a7e27cca0d69f1815d53c33b73ca4bcb772dec41"} Oct 09 08:00:30 crc kubenswrapper[4715]: E1009 08:00:30.936759 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d33c1f507e1f5b9a4bf226ad98917e92101ac66b36e19d35cbe04ae7014f6bff\\\"\"" pod="openstack-operators/placement-operator-controller-manager-664664cb68-xwt7q" podUID="fc37f3a9-94a5-4957-939a-a0b0a7a567bb" Oct 09 08:00:30 crc kubenswrapper[4715]: I1009 08:00:30.950874 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-rcm5h" 
event={"ID":"4b8010cb-d8af-4b7c-9530-fe143bbf1ddb","Type":"ContainerStarted","Data":"6cc289019c68ef34b5fa1f3c94bd60365a26bf3f313d653ab3c33af4cbe1060d"} Oct 09 08:00:30 crc kubenswrapper[4715]: E1009 08:00:30.955989 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-vqb9s" podUID="48619024-da5f-4b28-8724-3707961de8ce" Oct 09 08:00:30 crc kubenswrapper[4715]: E1009 08:00:30.958476 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:4b4a17fe08ce00e375afaaec6a28835f5c1784f03d11c4558376ac04130f3a9e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-rcm5h" podUID="4b8010cb-d8af-4b7c-9530-fe143bbf1ddb" Oct 09 08:00:31 crc kubenswrapper[4715]: I1009 08:00:31.004906 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sxpbc" podStartSLOduration=2.4337305799999998 podStartE2EDuration="9.004883472s" podCreationTimestamp="2025-10-09 08:00:22 +0000 UTC" firstStartedPulling="2025-10-09 08:00:23.539922913 +0000 UTC m=+854.232726921" lastFinishedPulling="2025-10-09 08:00:30.111075805 +0000 UTC m=+860.803879813" observedRunningTime="2025-10-09 08:00:30.994488909 +0000 UTC m=+861.687292917" watchObservedRunningTime="2025-10-09 08:00:31.004883472 +0000 UTC m=+861.697687480" Oct 09 08:00:31 crc kubenswrapper[4715]: I1009 08:00:31.152256 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5b6844c9b7-z7vpp" podStartSLOduration=4.15223058 
podStartE2EDuration="4.15223058s" podCreationTimestamp="2025-10-09 08:00:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:00:31.125634254 +0000 UTC m=+861.818438262" watchObservedRunningTime="2025-10-09 08:00:31.15223058 +0000 UTC m=+861.845034588" Oct 09 08:00:31 crc kubenswrapper[4715]: E1009 08:00:31.979397 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d33c1f507e1f5b9a4bf226ad98917e92101ac66b36e19d35cbe04ae7014f6bff\\\"\"" pod="openstack-operators/placement-operator-controller-manager-664664cb68-xwt7q" podUID="fc37f3a9-94a5-4957-939a-a0b0a7a567bb" Oct 09 08:00:31 crc kubenswrapper[4715]: E1009 08:00:31.980172 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:551b59e107c9812f7ad7aa06577376b0dcb58ff9498a41d5d5273e60e20ba7e4\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-5plxn" podUID="6d1ea812-36f3-4478-9b78-aed194390313" Oct 09 08:00:31 crc kubenswrapper[4715]: E1009 08:00:31.980256 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:4b4a17fe08ce00e375afaaec6a28835f5c1784f03d11c4558376ac04130f3a9e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-rcm5h" podUID="4b8010cb-d8af-4b7c-9530-fe143bbf1ddb" Oct 09 08:00:31 crc kubenswrapper[4715]: E1009 08:00:31.980334 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:09deecf840d38ff6af3c924729cf0a9444bc985848bfbe7c918019b88a6bc4d7\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-xmq4r" podUID="619ad411-d5d7-431b-9bb6-6cf084134aaf" Oct 09 08:00:32 crc kubenswrapper[4715]: I1009 08:00:32.845808 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sxpbc" Oct 09 08:00:32 crc kubenswrapper[4715]: I1009 08:00:32.846122 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sxpbc" Oct 09 08:00:32 crc kubenswrapper[4715]: I1009 08:00:32.940577 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sxpbc" Oct 09 08:00:39 crc kubenswrapper[4715]: I1009 08:00:39.217303 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5b6844c9b7-z7vpp" Oct 09 08:00:42 crc kubenswrapper[4715]: E1009 08:00:42.557217 4715 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:a17fc270857869fd1efe5020b2a1cb8c2abbd838f08de88f3a6a59e8754ec351" Oct 09 08:00:42 crc kubenswrapper[4715]: E1009 08:00:42.558197 4715 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:a17fc270857869fd1efe5020b2a1cb8c2abbd838f08de88f3a6a59e8754ec351,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_D
EFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_I
MAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,}
,EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMA
GE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podifi
ed-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DB
CLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},},Resour
ces:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sxqv6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-6cc7fb757dl2tll_openstack-operators(e9335457-1cad-453a-9539-d73dc2c77021): ErrImagePull: rpc error: code = Canceled desc = 
copying config: context canceled" logger="UnhandledError" Oct 09 08:00:42 crc kubenswrapper[4715]: I1009 08:00:42.886273 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sxpbc" Oct 09 08:00:42 crc kubenswrapper[4715]: I1009 08:00:42.931491 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sxpbc"] Oct 09 08:00:43 crc kubenswrapper[4715]: I1009 08:00:43.060605 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sxpbc" podUID="8fefbe0c-de73-415c-a42f-77742a8afab2" containerName="registry-server" containerID="cri-o://58966667b463a08bf7ffe50acfb91f4787de5090fb3db0649d953600f415cf65" gracePeriod=2 Oct 09 08:00:43 crc kubenswrapper[4715]: E1009 08:00:43.110016 4715 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:b2e9acf568a48c28cf2aed6012e432eeeb7d5f0eb11878fc91b62bc34cba10cd" Oct 09 08:00:43 crc kubenswrapper[4715]: E1009 08:00:43.110230 4715 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:b2e9acf568a48c28cf2aed6012e432eeeb7d5f0eb11878fc91b62bc34cba10cd,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8dqfl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-57bb74c7bf-6xd27_openstack-operators(6d11d372-6981-432f-a2b0-364cb9b24f63): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 09 08:00:43 crc kubenswrapper[4715]: E1009 08:00:43.741719 4715 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:7e584b1c430441c8b6591dadeff32e065de8a185ad37ef90d2e08d37e59aab4a" 
Oct 09 08:00:43 crc kubenswrapper[4715]: E1009 08:00:43.742286 4715 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:7e584b1c430441c8b6591dadeff32e065de8a185ad37ef90d2e08d37e59aab4a,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wmg56,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-ffcdd6c94-rt6nt_openstack-operators(ba259ec1-9157-4cd9-8c21-11915efe5dde): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 09 08:00:44 crc kubenswrapper[4715]: I1009 08:00:44.069246 4715 generic.go:334] "Generic (PLEG): container finished" podID="8fefbe0c-de73-415c-a42f-77742a8afab2" containerID="58966667b463a08bf7ffe50acfb91f4787de5090fb3db0649d953600f415cf65" exitCode=0 Oct 09 08:00:44 crc kubenswrapper[4715]: I1009 08:00:44.069290 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxpbc" event={"ID":"8fefbe0c-de73-415c-a42f-77742a8afab2","Type":"ContainerDied","Data":"58966667b463a08bf7ffe50acfb91f4787de5090fb3db0649d953600f415cf65"} Oct 09 08:00:44 crc kubenswrapper[4715]: I1009 08:00:44.376478 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sxpbc" Oct 09 08:00:44 crc kubenswrapper[4715]: E1009 08:00:44.399504 4715 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:582f7b1e411961b69f2e3c6b346aa25759b89f7720ed3fade1d363bf5d2dffc8" Oct 09 08:00:44 crc kubenswrapper[4715]: E1009 08:00:44.399754 4715 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:582f7b1e411961b69f2e3c6b346aa25759b89f7720ed3fade1d363bf5d2dffc8,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jcmz4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-59578bc799-6zczp_openstack-operators(9cae911a-5b69-4cf4-aa26-4adb4457eec4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 09 08:00:44 crc kubenswrapper[4715]: I1009 08:00:44.495139 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fefbe0c-de73-415c-a42f-77742a8afab2-catalog-content\") pod \"8fefbe0c-de73-415c-a42f-77742a8afab2\" (UID: \"8fefbe0c-de73-415c-a42f-77742a8afab2\") " Oct 09 08:00:44 crc kubenswrapper[4715]: I1009 08:00:44.495271 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fefbe0c-de73-415c-a42f-77742a8afab2-utilities\") pod \"8fefbe0c-de73-415c-a42f-77742a8afab2\" (UID: \"8fefbe0c-de73-415c-a42f-77742a8afab2\") " Oct 09 08:00:44 crc kubenswrapper[4715]: I1009 08:00:44.495312 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jp2gm\" (UniqueName: \"kubernetes.io/projected/8fefbe0c-de73-415c-a42f-77742a8afab2-kube-api-access-jp2gm\") 
pod \"8fefbe0c-de73-415c-a42f-77742a8afab2\" (UID: \"8fefbe0c-de73-415c-a42f-77742a8afab2\") " Oct 09 08:00:44 crc kubenswrapper[4715]: I1009 08:00:44.496284 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fefbe0c-de73-415c-a42f-77742a8afab2-utilities" (OuterVolumeSpecName: "utilities") pod "8fefbe0c-de73-415c-a42f-77742a8afab2" (UID: "8fefbe0c-de73-415c-a42f-77742a8afab2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:00:44 crc kubenswrapper[4715]: I1009 08:00:44.501277 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fefbe0c-de73-415c-a42f-77742a8afab2-kube-api-access-jp2gm" (OuterVolumeSpecName: "kube-api-access-jp2gm") pod "8fefbe0c-de73-415c-a42f-77742a8afab2" (UID: "8fefbe0c-de73-415c-a42f-77742a8afab2"). InnerVolumeSpecName "kube-api-access-jp2gm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:00:44 crc kubenswrapper[4715]: I1009 08:00:44.519943 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fefbe0c-de73-415c-a42f-77742a8afab2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8fefbe0c-de73-415c-a42f-77742a8afab2" (UID: "8fefbe0c-de73-415c-a42f-77742a8afab2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:00:44 crc kubenswrapper[4715]: I1009 08:00:44.597777 4715 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fefbe0c-de73-415c-a42f-77742a8afab2-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 08:00:44 crc kubenswrapper[4715]: I1009 08:00:44.597807 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jp2gm\" (UniqueName: \"kubernetes.io/projected/8fefbe0c-de73-415c-a42f-77742a8afab2-kube-api-access-jp2gm\") on node \"crc\" DevicePath \"\"" Oct 09 08:00:44 crc kubenswrapper[4715]: I1009 08:00:44.597820 4715 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fefbe0c-de73-415c-a42f-77742a8afab2-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 08:00:45 crc kubenswrapper[4715]: I1009 08:00:45.080711 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxpbc" event={"ID":"8fefbe0c-de73-415c-a42f-77742a8afab2","Type":"ContainerDied","Data":"95b460695b69283a152ffd32cc9c5cd6c07eb50dd396d9b5b4d94dfd33bd5357"} Oct 09 08:00:45 crc kubenswrapper[4715]: I1009 08:00:45.081650 4715 scope.go:117] "RemoveContainer" containerID="58966667b463a08bf7ffe50acfb91f4787de5090fb3db0649d953600f415cf65" Oct 09 08:00:45 crc kubenswrapper[4715]: I1009 08:00:45.080974 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sxpbc" Oct 09 08:00:45 crc kubenswrapper[4715]: I1009 08:00:45.113873 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sxpbc"] Oct 09 08:00:45 crc kubenswrapper[4715]: I1009 08:00:45.118792 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sxpbc"] Oct 09 08:00:45 crc kubenswrapper[4715]: I1009 08:00:45.148276 4715 scope.go:117] "RemoveContainer" containerID="bd3a9032a83368791b9f9dba116e2a08acd8f39a40aa9224e051e7d070e79e3e" Oct 09 08:00:45 crc kubenswrapper[4715]: I1009 08:00:45.267740 4715 scope.go:117] "RemoveContainer" containerID="0a9457d18a0f2331b6fd8264530ec35a667ad4f04cc4e7aea2cf59657d325a8f" Oct 09 08:00:45 crc kubenswrapper[4715]: E1009 08:00:45.399728 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-59578bc799-6zczp" podUID="9cae911a-5b69-4cf4-aa26-4adb4457eec4" Oct 09 08:00:45 crc kubenswrapper[4715]: E1009 08:00:45.400381 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dl2tll" podUID="e9335457-1cad-453a-9539-d73dc2c77021" Oct 09 08:00:45 crc kubenswrapper[4715]: E1009 08:00:45.455556 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-6xd27" podUID="6d11d372-6981-432f-a2b0-364cb9b24f63" Oct 09 08:00:45 crc kubenswrapper[4715]: E1009 
08:00:45.457269 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-rt6nt" podUID="ba259ec1-9157-4cd9-8c21-11915efe5dde" Oct 09 08:00:46 crc kubenswrapper[4715]: E1009 08:00:46.157572 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:582f7b1e411961b69f2e3c6b346aa25759b89f7720ed3fade1d363bf5d2dffc8\\\"\"" pod="openstack-operators/manila-operator-controller-manager-59578bc799-6zczp" podUID="9cae911a-5b69-4cf4-aa26-4adb4457eec4" Oct 09 08:00:46 crc kubenswrapper[4715]: I1009 08:00:46.167187 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fefbe0c-de73-415c-a42f-77742a8afab2" path="/var/lib/kubelet/pods/8fefbe0c-de73-415c-a42f-77742a8afab2/volumes" Oct 09 08:00:46 crc kubenswrapper[4715]: I1009 08:00:46.167925 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-59578bc799-6zczp" event={"ID":"9cae911a-5b69-4cf4-aa26-4adb4457eec4","Type":"ContainerStarted","Data":"d43cac281807c6d60663c671ab61e0736d7c7f89136647510a753251a830cf92"} Oct 09 08:00:46 crc kubenswrapper[4715]: I1009 08:00:46.171143 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dl2tll" event={"ID":"e9335457-1cad-453a-9539-d73dc2c77021","Type":"ContainerStarted","Data":"00fcf42eab6162d6227ca6ac9c1f8a195a97ad0301c8b47cc87ba5e7d7f775c0"} Oct 09 08:00:46 crc kubenswrapper[4715]: I1009 08:00:46.197393 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-6xd27" 
event={"ID":"6d11d372-6981-432f-a2b0-364cb9b24f63","Type":"ContainerStarted","Data":"277fd44fecfbb0c8e4e4af7122940c74c2247330eeda175c91c70068885b8a9a"} Oct 09 08:00:46 crc kubenswrapper[4715]: I1009 08:00:46.206473 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-qjwrk" event={"ID":"32b6325f-e041-492d-a113-638dcef15310","Type":"ContainerStarted","Data":"1d4996e22f63848201b3ca708582f541b58fb89da30583d1b5df989fb86f96bc"} Oct 09 08:00:46 crc kubenswrapper[4715]: I1009 08:00:46.210136 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-ggwkb" event={"ID":"ad178d55-a5d5-40b5-9364-0a9af0718f46","Type":"ContainerStarted","Data":"dae99d5146f9613c3e5928c737779abd4cb71dab983bc80395f72e34c676d1ee"} Oct 09 08:00:46 crc kubenswrapper[4715]: I1009 08:00:46.212057 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-7zmwl" event={"ID":"9657b932-fe63-4417-8463-8af21e9c9790","Type":"ContainerStarted","Data":"d0e929bdd86ceb90e15f761eccef950f651041134b62c3b8dff386000e2cbed1"} Oct 09 08:00:46 crc kubenswrapper[4715]: E1009 08:00:46.212406 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:a17fc270857869fd1efe5020b2a1cb8c2abbd838f08de88f3a6a59e8754ec351\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dl2tll" podUID="e9335457-1cad-453a-9539-d73dc2c77021" Oct 09 08:00:46 crc kubenswrapper[4715]: E1009 08:00:46.212592 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/nova-operator@sha256:b2e9acf568a48c28cf2aed6012e432eeeb7d5f0eb11878fc91b62bc34cba10cd\\\"\"" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-6xd27" podUID="6d11d372-6981-432f-a2b0-364cb9b24f63" Oct 09 08:00:46 crc kubenswrapper[4715]: I1009 08:00:46.217759 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-cfkg2" event={"ID":"e11fc796-233e-4c17-b953-1c6211f0c679","Type":"ContainerStarted","Data":"5382056c5a063420781a9f1ff08a45d449407a3b1b129493ae96426b1b9bb210"} Oct 09 08:00:46 crc kubenswrapper[4715]: I1009 08:00:46.242204 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-kqhg2" event={"ID":"ecf88dec-957f-4221-8ded-d779392c2793","Type":"ContainerStarted","Data":"da56a8db27b9aebf4f9d5e36ce17e925cf6f71ce63a85b19123e4527b44133db"} Oct 09 08:00:46 crc kubenswrapper[4715]: I1009 08:00:46.260449 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-pxmc4" event={"ID":"c990a4aa-4a8e-499b-bf58-99c469af523e","Type":"ContainerStarted","Data":"6fda784a80160619339099490937914f597e63594237c68f3c298610147490fb"} Oct 09 08:00:46 crc kubenswrapper[4715]: I1009 08:00:46.269808 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-ckx8h" event={"ID":"b39f3e52-f97a-4bf4-934d-88267bddae91","Type":"ContainerStarted","Data":"95565c4d5206429c284f202cf1d6788a945403d0506488606d65f2b7e3db9417"} Oct 09 08:00:46 crc kubenswrapper[4715]: I1009 08:00:46.284622 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-jps2w" event={"ID":"2730bf5c-42b9-4739-a2bc-6250bfcb997a","Type":"ContainerStarted","Data":"265c888d8d2b2b9a6ed673098d328b586579dc47131cd13c3924a2d4c78b4efc"} Oct 09 
08:00:46 crc kubenswrapper[4715]: I1009 08:00:46.317135 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-rt6nt" event={"ID":"ba259ec1-9157-4cd9-8c21-11915efe5dde","Type":"ContainerStarted","Data":"aaf94e25650d70f8e8580aab6c3f072782ccdce4acedc9854d3ff742b73f7c7c"} Oct 09 08:00:46 crc kubenswrapper[4715]: E1009 08:00:46.329556 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:7e584b1c430441c8b6591dadeff32e065de8a185ad37ef90d2e08d37e59aab4a\\\"\"" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-rt6nt" podUID="ba259ec1-9157-4cd9-8c21-11915efe5dde" Oct 09 08:00:46 crc kubenswrapper[4715]: I1009 08:00:46.353225 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-646675d848-bgwc8" event={"ID":"666e7073-bf77-46f3-99da-5ad2013835a9","Type":"ContainerStarted","Data":"5adbde65bd848ea381ea60a7d4d6c50cc7bb58dfd3cf6e7c00d81fa4708af731"} Oct 09 08:00:46 crc kubenswrapper[4715]: I1009 08:00:46.370410 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6648b66598-cvsqm" event={"ID":"9b402da9-cbb2-473b-beee-7064e06acb73","Type":"ContainerStarted","Data":"cfc105bb8d5eb9e54783c92451a9a0cbc10864e347a4a85f67296560c41bd13e"} Oct 09 08:00:47 crc kubenswrapper[4715]: I1009 08:00:47.384352 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-rsthg" event={"ID":"e4603d13-cf9d-4d8d-82db-3b182aa42e74","Type":"ContainerStarted","Data":"ebc7dd6083fa4a0547a404b99a15dc4cdc393c6113e1510ed1a226de234cce71"} Oct 09 08:00:47 crc kubenswrapper[4715]: I1009 08:00:47.387474 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-5plxn" event={"ID":"6d1ea812-36f3-4478-9b78-aed194390313","Type":"ContainerStarted","Data":"de50673c1c6918a1c111a3c4ac1d9e52dac4245f7f36490604793eb2248a5989"} Oct 09 08:00:47 crc kubenswrapper[4715]: I1009 08:00:47.387676 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-5plxn" Oct 09 08:00:47 crc kubenswrapper[4715]: I1009 08:00:47.395339 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-cfkg2" event={"ID":"e11fc796-233e-4c17-b953-1c6211f0c679","Type":"ContainerStarted","Data":"25e0137eaf0ba23e4a8d7f65bc339d55a9b53da0da8446364bf19621a6746407"} Oct 09 08:00:47 crc kubenswrapper[4715]: I1009 08:00:47.396361 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-cfkg2" Oct 09 08:00:47 crc kubenswrapper[4715]: I1009 08:00:47.399049 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-kqhg2" event={"ID":"ecf88dec-957f-4221-8ded-d779392c2793","Type":"ContainerStarted","Data":"1fc8b6841d6bd88858977679a26ddc938829bc68f3b75b9db5ad1ac53eb0e391"} Oct 09 08:00:47 crc kubenswrapper[4715]: I1009 08:00:47.399506 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-kqhg2" Oct 09 08:00:47 crc kubenswrapper[4715]: I1009 08:00:47.409872 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-5plxn" podStartSLOduration=5.030466678 podStartE2EDuration="21.409846941s" podCreationTimestamp="2025-10-09 08:00:26 +0000 UTC" firstStartedPulling="2025-10-09 08:00:28.91627478 +0000 UTC m=+859.609078788" lastFinishedPulling="2025-10-09 
08:00:45.295655043 +0000 UTC m=+875.988459051" observedRunningTime="2025-10-09 08:00:47.409231743 +0000 UTC m=+878.102035751" watchObservedRunningTime="2025-10-09 08:00:47.409846941 +0000 UTC m=+878.102650959" Oct 09 08:00:47 crc kubenswrapper[4715]: I1009 08:00:47.411030 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6648b66598-cvsqm" event={"ID":"9b402da9-cbb2-473b-beee-7064e06acb73","Type":"ContainerStarted","Data":"ce30961b79c61e4a3d662e3896982d7e6363806e4b37c4f66e97d6e6fa01a033"} Oct 09 08:00:47 crc kubenswrapper[4715]: I1009 08:00:47.412261 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6648b66598-cvsqm" Oct 09 08:00:47 crc kubenswrapper[4715]: I1009 08:00:47.420813 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-pxmc4" event={"ID":"c990a4aa-4a8e-499b-bf58-99c469af523e","Type":"ContainerStarted","Data":"874571dd9e811207883bf1876a49f5517e4c3db0789a36b61d5846ce6f2c3c51"} Oct 09 08:00:47 crc kubenswrapper[4715]: I1009 08:00:47.420976 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-pxmc4" Oct 09 08:00:47 crc kubenswrapper[4715]: I1009 08:00:47.434800 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-kqhg2" podStartSLOduration=4.956019338 podStartE2EDuration="21.434730927s" podCreationTimestamp="2025-10-09 08:00:26 +0000 UTC" firstStartedPulling="2025-10-09 08:00:28.465134673 +0000 UTC m=+859.157938681" lastFinishedPulling="2025-10-09 08:00:44.943846262 +0000 UTC m=+875.636650270" observedRunningTime="2025-10-09 08:00:47.427048483 +0000 UTC m=+878.119852511" watchObservedRunningTime="2025-10-09 08:00:47.434730927 +0000 UTC m=+878.127534965" Oct 09 
08:00:47 crc kubenswrapper[4715]: I1009 08:00:47.438259 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-ggwkb" event={"ID":"ad178d55-a5d5-40b5-9364-0a9af0718f46","Type":"ContainerStarted","Data":"4d19e893e825f833eb2888c400e3d193152200c8f0da95ac5407b2c83b4213cc"} Oct 09 08:00:47 crc kubenswrapper[4715]: I1009 08:00:47.438546 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-ggwkb" Oct 09 08:00:47 crc kubenswrapper[4715]: I1009 08:00:47.440884 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-7zmwl" event={"ID":"9657b932-fe63-4417-8463-8af21e9c9790","Type":"ContainerStarted","Data":"cb8fb1151f059810ecbaf305155f44feec7304f7ecc94ce8497ec0ced2d60dd3"} Oct 09 08:00:47 crc kubenswrapper[4715]: I1009 08:00:47.441126 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-7zmwl" Oct 09 08:00:47 crc kubenswrapper[4715]: I1009 08:00:47.445061 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-gqdw4" event={"ID":"68110204-494d-4a10-b25d-0996c9dd1c6f","Type":"ContainerStarted","Data":"b47992cb8f67fe967215c0bfdfb265a18d6ea58b31973d5bb2fda2d58bc35db6"} Oct 09 08:00:47 crc kubenswrapper[4715]: I1009 08:00:47.447977 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-jps2w" event={"ID":"2730bf5c-42b9-4739-a2bc-6250bfcb997a","Type":"ContainerStarted","Data":"23c943ddb23d4495e743f3845f70f371ff5e09f855a8d726de13df256f8c9c0d"} Oct 09 08:00:47 crc kubenswrapper[4715]: I1009 08:00:47.448156 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-jps2w" Oct 09 08:00:47 crc kubenswrapper[4715]: I1009 08:00:47.453214 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-v8zt5" event={"ID":"675a8b37-dcfc-414e-9218-7741ce9ec2d5","Type":"ContainerStarted","Data":"43fd3e687259b0c1e3a513b192ca18f85f76abd908a62eb5cc338743f16099b9"} Oct 09 08:00:47 crc kubenswrapper[4715]: I1009 08:00:47.454543 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-cfkg2" podStartSLOduration=4.676365502 podStartE2EDuration="21.454520004s" podCreationTimestamp="2025-10-09 08:00:26 +0000 UTC" firstStartedPulling="2025-10-09 08:00:28.159618713 +0000 UTC m=+858.852422721" lastFinishedPulling="2025-10-09 08:00:44.937773215 +0000 UTC m=+875.630577223" observedRunningTime="2025-10-09 08:00:47.450580419 +0000 UTC m=+878.143384447" watchObservedRunningTime="2025-10-09 08:00:47.454520004 +0000 UTC m=+878.147324012" Oct 09 08:00:47 crc kubenswrapper[4715]: E1009 08:00:47.457713 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:7e584b1c430441c8b6591dadeff32e065de8a185ad37ef90d2e08d37e59aab4a\\\"\"" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-rt6nt" podUID="ba259ec1-9157-4cd9-8c21-11915efe5dde" Oct 09 08:00:47 crc kubenswrapper[4715]: E1009 08:00:47.457743 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:b2e9acf568a48c28cf2aed6012e432eeeb7d5f0eb11878fc91b62bc34cba10cd\\\"\"" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-6xd27" 
podUID="6d11d372-6981-432f-a2b0-364cb9b24f63" Oct 09 08:00:47 crc kubenswrapper[4715]: E1009 08:00:47.457743 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:582f7b1e411961b69f2e3c6b346aa25759b89f7720ed3fade1d363bf5d2dffc8\\\"\"" pod="openstack-operators/manila-operator-controller-manager-59578bc799-6zczp" podUID="9cae911a-5b69-4cf4-aa26-4adb4457eec4" Oct 09 08:00:47 crc kubenswrapper[4715]: E1009 08:00:47.457806 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:a17fc270857869fd1efe5020b2a1cb8c2abbd838f08de88f3a6a59e8754ec351\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dl2tll" podUID="e9335457-1cad-453a-9539-d73dc2c77021" Oct 09 08:00:47 crc kubenswrapper[4715]: I1009 08:00:47.480346 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-jps2w" podStartSLOduration=4.723535608 podStartE2EDuration="21.480308496s" podCreationTimestamp="2025-10-09 08:00:26 +0000 UTC" firstStartedPulling="2025-10-09 08:00:28.185808597 +0000 UTC m=+858.878612605" lastFinishedPulling="2025-10-09 08:00:44.942581485 +0000 UTC m=+875.635385493" observedRunningTime="2025-10-09 08:00:47.471798078 +0000 UTC m=+878.164602106" watchObservedRunningTime="2025-10-09 08:00:47.480308496 +0000 UTC m=+878.173112504" Oct 09 08:00:47 crc kubenswrapper[4715]: I1009 08:00:47.498634 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-pxmc4" podStartSLOduration=4.791608193 podStartE2EDuration="21.49860541s" podCreationTimestamp="2025-10-09 08:00:26 
+0000 UTC" firstStartedPulling="2025-10-09 08:00:28.275150432 +0000 UTC m=+858.967954430" lastFinishedPulling="2025-10-09 08:00:44.982147639 +0000 UTC m=+875.674951647" observedRunningTime="2025-10-09 08:00:47.489586317 +0000 UTC m=+878.182390335" watchObservedRunningTime="2025-10-09 08:00:47.49860541 +0000 UTC m=+878.191409418" Oct 09 08:00:47 crc kubenswrapper[4715]: I1009 08:00:47.508692 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-7zmwl" podStartSLOduration=4.725156765 podStartE2EDuration="21.508666963s" podCreationTimestamp="2025-10-09 08:00:26 +0000 UTC" firstStartedPulling="2025-10-09 08:00:28.159209811 +0000 UTC m=+858.852013819" lastFinishedPulling="2025-10-09 08:00:44.942720009 +0000 UTC m=+875.635524017" observedRunningTime="2025-10-09 08:00:47.503373709 +0000 UTC m=+878.196177727" watchObservedRunningTime="2025-10-09 08:00:47.508666963 +0000 UTC m=+878.201470971" Oct 09 08:00:47 crc kubenswrapper[4715]: I1009 08:00:47.529106 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6648b66598-cvsqm" podStartSLOduration=5.473212601 podStartE2EDuration="21.529080028s" podCreationTimestamp="2025-10-09 08:00:26 +0000 UTC" firstStartedPulling="2025-10-09 08:00:28.887980675 +0000 UTC m=+859.580784683" lastFinishedPulling="2025-10-09 08:00:44.943848102 +0000 UTC m=+875.636652110" observedRunningTime="2025-10-09 08:00:47.521866278 +0000 UTC m=+878.214670286" watchObservedRunningTime="2025-10-09 08:00:47.529080028 +0000 UTC m=+878.221884036" Oct 09 08:00:47 crc kubenswrapper[4715]: I1009 08:00:47.545490 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-ggwkb" podStartSLOduration=5.087662587 podStartE2EDuration="21.545467886s" podCreationTimestamp="2025-10-09 08:00:26 +0000 UTC" 
firstStartedPulling="2025-10-09 08:00:28.486084404 +0000 UTC m=+859.178888412" lastFinishedPulling="2025-10-09 08:00:44.943889703 +0000 UTC m=+875.636693711" observedRunningTime="2025-10-09 08:00:47.538980497 +0000 UTC m=+878.231784525" watchObservedRunningTime="2025-10-09 08:00:47.545467886 +0000 UTC m=+878.238271894" Oct 09 08:00:54 crc kubenswrapper[4715]: I1009 08:00:53.501458 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-ckx8h" event={"ID":"b39f3e52-f97a-4bf4-934d-88267bddae91","Type":"ContainerStarted","Data":"c1ca14ac0575db57bdd94fd48a0ba09d16f3375c76777fe101a87f7a8279395d"} Oct 09 08:00:54 crc kubenswrapper[4715]: I1009 08:00:53.502079 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-ckx8h" Oct 09 08:00:54 crc kubenswrapper[4715]: I1009 08:00:53.507219 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-ckx8h" Oct 09 08:00:54 crc kubenswrapper[4715]: I1009 08:00:53.525642 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-ckx8h" podStartSLOduration=11.571825603 podStartE2EDuration="27.525619574s" podCreationTimestamp="2025-10-09 08:00:26 +0000 UTC" firstStartedPulling="2025-10-09 08:00:28.984788648 +0000 UTC m=+859.677592656" lastFinishedPulling="2025-10-09 08:00:44.938582609 +0000 UTC m=+875.631386627" observedRunningTime="2025-10-09 08:00:53.518982211 +0000 UTC m=+884.211786219" watchObservedRunningTime="2025-10-09 08:00:53.525619574 +0000 UTC m=+884.218423582" Oct 09 08:00:55 crc kubenswrapper[4715]: I1009 08:00:55.528198 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-qjwrk" 
event={"ID":"32b6325f-e041-492d-a113-638dcef15310","Type":"ContainerStarted","Data":"11d7729076c338cef51effe21426950acfa0cea41ac2b4678c9475d7a3c8416c"} Oct 09 08:00:55 crc kubenswrapper[4715]: I1009 08:00:55.528732 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-qjwrk" Oct 09 08:00:55 crc kubenswrapper[4715]: I1009 08:00:55.529926 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-646675d848-bgwc8" event={"ID":"666e7073-bf77-46f3-99da-5ad2013835a9","Type":"ContainerStarted","Data":"cbf6bf8834a0d603bcb468f9ea1832ee643afcc652a6f038041d5beade3ed9b4"} Oct 09 08:00:55 crc kubenswrapper[4715]: I1009 08:00:55.530315 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-646675d848-bgwc8" Oct 09 08:00:55 crc kubenswrapper[4715]: I1009 08:00:55.530381 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-qjwrk" Oct 09 08:00:55 crc kubenswrapper[4715]: I1009 08:00:55.533508 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-646675d848-bgwc8" Oct 09 08:00:55 crc kubenswrapper[4715]: I1009 08:00:55.554160 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-qjwrk" podStartSLOduration=12.881990763 podStartE2EDuration="29.554136544s" podCreationTimestamp="2025-10-09 08:00:26 +0000 UTC" firstStartedPulling="2025-10-09 08:00:28.273512234 +0000 UTC m=+858.966316242" lastFinishedPulling="2025-10-09 08:00:44.945658015 +0000 UTC m=+875.638462023" observedRunningTime="2025-10-09 08:00:55.545139662 +0000 UTC m=+886.237943670" watchObservedRunningTime="2025-10-09 08:00:55.554136544 +0000 UTC 
m=+886.246940552"
Oct 09 08:00:55 crc kubenswrapper[4715]: I1009 08:00:55.590042 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-646675d848-bgwc8" podStartSLOduration=13.522302707 podStartE2EDuration="29.59001212s" podCreationTimestamp="2025-10-09 08:00:26 +0000 UTC" firstStartedPulling="2025-10-09 08:00:28.881500736 +0000 UTC m=+859.574304744" lastFinishedPulling="2025-10-09 08:00:44.949210159 +0000 UTC m=+875.642014157" observedRunningTime="2025-10-09 08:00:55.576578869 +0000 UTC m=+886.269382887" watchObservedRunningTime="2025-10-09 08:00:55.59001212 +0000 UTC m=+886.282816128"
Oct 09 08:00:56 crc kubenswrapper[4715]: I1009 08:00:56.538396 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-vqb9s" event={"ID":"48619024-da5f-4b28-8724-3707961de8ce","Type":"ContainerStarted","Data":"45a7326df3be10c4e6582275bac86bf085dbda4253903707710a21fbea6688c6"}
Oct 09 08:00:56 crc kubenswrapper[4715]: I1009 08:00:56.540202 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-v8zt5" event={"ID":"675a8b37-dcfc-414e-9218-7741ce9ec2d5","Type":"ContainerStarted","Data":"7eac055129d7d06bbb5aad7e281bba2687f712084bf343bb9dd8c2d77f9e2d7e"}
Oct 09 08:00:56 crc kubenswrapper[4715]: I1009 08:00:56.540405 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-v8zt5"
Oct 09 08:00:56 crc kubenswrapper[4715]: I1009 08:00:56.542066 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-gqdw4" event={"ID":"68110204-494d-4a10-b25d-0996c9dd1c6f","Type":"ContainerStarted","Data":"bfbc16fccfc9b2a716843968032acc07d6c1916b13104c5657e6494536bab738"}
Oct 09 08:00:56 crc kubenswrapper[4715]: I1009 08:00:56.542334 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-gqdw4"
Oct 09 08:00:56 crc kubenswrapper[4715]: I1009 08:00:56.543262 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-v8zt5"
Oct 09 08:00:56 crc kubenswrapper[4715]: I1009 08:00:56.544027 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-rsthg" event={"ID":"e4603d13-cf9d-4d8d-82db-3b182aa42e74","Type":"ContainerStarted","Data":"c65a2f6b86e052d347c0a07c3d940a178d0d0a21ede50fed60f4006f2d3160f3"}
Oct 09 08:00:56 crc kubenswrapper[4715]: I1009 08:00:56.544222 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-rsthg"
Oct 09 08:00:56 crc kubenswrapper[4715]: I1009 08:00:56.545786 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-xmq4r" event={"ID":"619ad411-d5d7-431b-9bb6-6cf084134aaf","Type":"ContainerStarted","Data":"9cd6c49713c54f7d3bfe23fd2484fa2061713118c2408b670d225afb315b2f69"}
Oct 09 08:00:56 crc kubenswrapper[4715]: I1009 08:00:56.546193 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-xmq4r"
Oct 09 08:00:56 crc kubenswrapper[4715]: I1009 08:00:56.546436 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-rsthg"
Oct 09 08:00:56 crc kubenswrapper[4715]: I1009 08:00:56.547224 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-664664cb68-xwt7q" event={"ID":"fc37f3a9-94a5-4957-939a-a0b0a7a567bb","Type":"ContainerStarted","Data":"3747aec8652b4729e3da991f5e94ac9c33bc01a7e744265b54e3bb97698e83c2"}
Oct 09 08:00:56 crc kubenswrapper[4715]: I1009 08:00:56.547745 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-664664cb68-xwt7q"
Oct 09 08:00:56 crc kubenswrapper[4715]: I1009 08:00:56.551865 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-rcm5h" event={"ID":"4b8010cb-d8af-4b7c-9530-fe143bbf1ddb","Type":"ContainerStarted","Data":"2089f659d3ba46eac14ca9fc3c2a060c9d018f5b0f89f94ef8f21c8f57f3ad62"}
Oct 09 08:00:56 crc kubenswrapper[4715]: I1009 08:00:56.552362 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-rcm5h"
Oct 09 08:00:56 crc kubenswrapper[4715]: I1009 08:00:56.554462 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-gqdw4"
Oct 09 08:00:56 crc kubenswrapper[4715]: I1009 08:00:56.614378 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-vqb9s" podStartSLOduration=2.518584517 podStartE2EDuration="29.614355345s" podCreationTimestamp="2025-10-09 08:00:27 +0000 UTC" firstStartedPulling="2025-10-09 08:00:29.006987386 +0000 UTC m=+859.699791394" lastFinishedPulling="2025-10-09 08:00:56.102758214 +0000 UTC m=+886.795562222" observedRunningTime="2025-10-09 08:00:56.57201983 +0000 UTC m=+887.264823838" watchObservedRunningTime="2025-10-09 08:00:56.614355345 +0000 UTC m=+887.307159353"
Oct 09 08:00:56 crc kubenswrapper[4715]: I1009 08:00:56.615315 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-gqdw4" podStartSLOduration=13.302609291 podStartE2EDuration="30.615308753s" podCreationTimestamp="2025-10-09 08:00:26 +0000 UTC" firstStartedPulling="2025-10-09 08:00:27.629242345 +0000 UTC m=+858.322046353" lastFinishedPulling="2025-10-09 08:00:44.941941807 +0000 UTC m=+875.634745815" observedRunningTime="2025-10-09 08:00:56.606755833 +0000 UTC m=+887.299559861" watchObservedRunningTime="2025-10-09 08:00:56.615308753 +0000 UTC m=+887.308112751"
Oct 09 08:00:56 crc kubenswrapper[4715]: I1009 08:00:56.685339 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-rcm5h" podStartSLOduration=3.523914061 podStartE2EDuration="30.685282053s" podCreationTimestamp="2025-10-09 08:00:26 +0000 UTC" firstStartedPulling="2025-10-09 08:00:28.941415293 +0000 UTC m=+859.634219301" lastFinishedPulling="2025-10-09 08:00:56.102783275 +0000 UTC m=+886.795587293" observedRunningTime="2025-10-09 08:00:56.646827702 +0000 UTC m=+887.339631710" watchObservedRunningTime="2025-10-09 08:00:56.685282053 +0000 UTC m=+887.378086061"
Oct 09 08:00:56 crc kubenswrapper[4715]: I1009 08:00:56.687094 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-xmq4r" podStartSLOduration=3.528408143 podStartE2EDuration="30.687077386s" podCreationTimestamp="2025-10-09 08:00:26 +0000 UTC" firstStartedPulling="2025-10-09 08:00:28.943329619 +0000 UTC m=+859.636133617" lastFinishedPulling="2025-10-09 08:00:56.101998862 +0000 UTC m=+886.794802860" observedRunningTime="2025-10-09 08:00:56.683493421 +0000 UTC m=+887.376297429" watchObservedRunningTime="2025-10-09 08:00:56.687077386 +0000 UTC m=+887.379881394"
Oct 09 08:00:56 crc kubenswrapper[4715]: I1009 08:00:56.707684 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-v8zt5" podStartSLOduration=14.091812647 podStartE2EDuration="30.707660246s" podCreationTimestamp="2025-10-09 08:00:26 +0000 UTC" firstStartedPulling="2025-10-09 08:00:28.328844508 +0000 UTC m=+859.021648516" lastFinishedPulling="2025-10-09 08:00:44.944692107 +0000 UTC m=+875.637496115" observedRunningTime="2025-10-09 08:00:56.706123291 +0000 UTC m=+887.398927319" watchObservedRunningTime="2025-10-09 08:00:56.707660246 +0000 UTC m=+887.400464254"
Oct 09 08:00:56 crc kubenswrapper[4715]: I1009 08:00:56.732848 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-664664cb68-xwt7q" podStartSLOduration=3.530677479 podStartE2EDuration="30.73282738s" podCreationTimestamp="2025-10-09 08:00:26 +0000 UTC" firstStartedPulling="2025-10-09 08:00:28.911829711 +0000 UTC m=+859.604633719" lastFinishedPulling="2025-10-09 08:00:56.113979602 +0000 UTC m=+886.806783620" observedRunningTime="2025-10-09 08:00:56.729231255 +0000 UTC m=+887.422035283" watchObservedRunningTime="2025-10-09 08:00:56.73282738 +0000 UTC m=+887.425631388"
Oct 09 08:00:56 crc kubenswrapper[4715]: I1009 08:00:56.760330 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-rsthg" podStartSLOduration=13.241763096 podStartE2EDuration="30.760304411s" podCreationTimestamp="2025-10-09 08:00:26 +0000 UTC" firstStartedPulling="2025-10-09 08:00:27.424002359 +0000 UTC m=+858.116806367" lastFinishedPulling="2025-10-09 08:00:44.942543674 +0000 UTC m=+875.635347682" observedRunningTime="2025-10-09 08:00:56.755895333 +0000 UTC m=+887.448699341" watchObservedRunningTime="2025-10-09 08:00:56.760304411 +0000 UTC m=+887.453108419"
Oct 09 08:00:56 crc kubenswrapper[4715]: I1009 08:00:56.804250 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-cfkg2"
Oct 09 08:00:56 crc kubenswrapper[4715]: I1009 08:00:56.872719 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-jps2w"
Oct 09 08:00:57 crc kubenswrapper[4715]: I1009 08:00:57.008092 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-pxmc4"
Oct 09 08:00:57 crc kubenswrapper[4715]: I1009 08:00:57.028616 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-7zmwl"
Oct 09 08:00:57 crc kubenswrapper[4715]: I1009 08:00:57.189079 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-kqhg2"
Oct 09 08:00:57 crc kubenswrapper[4715]: I1009 08:00:57.217680 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-ggwkb"
Oct 09 08:00:57 crc kubenswrapper[4715]: I1009 08:00:57.369555 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-6f96f8c84-5plxn"
Oct 09 08:00:57 crc kubenswrapper[4715]: I1009 08:00:57.437966 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6648b66598-cvsqm"
Oct 09 08:00:59 crc kubenswrapper[4715]: I1009 08:00:59.570915 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-rt6nt" event={"ID":"ba259ec1-9157-4cd9-8c21-11915efe5dde","Type":"ContainerStarted","Data":"309811c21936287cebf011f31866c55c57e36c399ccc0c79c6d8c2e45bb6008d"}
Oct 09 08:00:59 crc kubenswrapper[4715]: I1009 08:00:59.571491 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-rt6nt"
Oct 09 08:00:59 crc kubenswrapper[4715]: I1009 08:00:59.589390 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-rt6nt" podStartSLOduration=3.641321806 podStartE2EDuration="33.589370317s" podCreationTimestamp="2025-10-09 08:00:26 +0000 UTC" firstStartedPulling="2025-10-09 08:00:28.888065608 +0000 UTC m=+859.580869616" lastFinishedPulling="2025-10-09 08:00:58.836114119 +0000 UTC m=+889.528918127" observedRunningTime="2025-10-09 08:00:59.586985468 +0000 UTC m=+890.279789476" watchObservedRunningTime="2025-10-09 08:00:59.589370317 +0000 UTC m=+890.282174325"
Oct 09 08:01:02 crc kubenswrapper[4715]: I1009 08:01:02.592559 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-59578bc799-6zczp" event={"ID":"9cae911a-5b69-4cf4-aa26-4adb4457eec4","Type":"ContainerStarted","Data":"54a6bd22e5ab6c4f0b3ed442f6ccda26794ab8c50440d7ebc07ead941d416bbb"}
Oct 09 08:01:02 crc kubenswrapper[4715]: I1009 08:01:02.593223 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-59578bc799-6zczp"
Oct 09 08:01:02 crc kubenswrapper[4715]: I1009 08:01:02.614547 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-59578bc799-6zczp" podStartSLOduration=2.964386353 podStartE2EDuration="36.614525833s" podCreationTimestamp="2025-10-09 08:00:26 +0000 UTC" firstStartedPulling="2025-10-09 08:00:28.458396076 +0000 UTC m=+859.151200104" lastFinishedPulling="2025-10-09 08:01:02.108535576 +0000 UTC m=+892.801339584" observedRunningTime="2025-10-09 08:01:02.613995588 +0000 UTC m=+893.306799606" watchObservedRunningTime="2025-10-09 08:01:02.614525833 +0000 UTC m=+893.307329841"
Oct 09 08:01:03 crc kubenswrapper[4715]: I1009 08:01:03.604001 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dl2tll" event={"ID":"e9335457-1cad-453a-9539-d73dc2c77021","Type":"ContainerStarted","Data":"2c01a91d9576181e332ae629621f97b034b33528f76f37c4c752829e72610a72"}
Oct 09 08:01:03 crc kubenswrapper[4715]: I1009 08:01:03.604491 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dl2tll"
Oct 09 08:01:03 crc kubenswrapper[4715]: I1009 08:01:03.641375 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dl2tll" podStartSLOduration=4.282860584 podStartE2EDuration="37.641351659s" podCreationTimestamp="2025-10-09 08:00:26 +0000 UTC" firstStartedPulling="2025-10-09 08:00:29.502813915 +0000 UTC m=+860.195617923" lastFinishedPulling="2025-10-09 08:01:02.86130499 +0000 UTC m=+893.554108998" observedRunningTime="2025-10-09 08:01:03.640789163 +0000 UTC m=+894.333593201" watchObservedRunningTime="2025-10-09 08:01:03.641351659 +0000 UTC m=+894.334155677"
Oct 09 08:01:04 crc kubenswrapper[4715]: I1009 08:01:04.612563 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-6xd27" event={"ID":"6d11d372-6981-432f-a2b0-364cb9b24f63","Type":"ContainerStarted","Data":"effc94c84dbd7f83bc2e8687d44bac20ae0b39a32000e31a812219ed3b585168"}
Oct 09 08:01:04 crc kubenswrapper[4715]: I1009 08:01:04.612813 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-6xd27"
Oct 09 08:01:04 crc kubenswrapper[4715]: I1009 08:01:04.629079 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-6xd27" podStartSLOduration=3.477083246 podStartE2EDuration="38.629046925s" podCreationTimestamp="2025-10-09 08:00:26 +0000 UTC" firstStartedPulling="2025-10-09 08:00:28.875840821 +0000 UTC m=+859.568644829" lastFinishedPulling="2025-10-09 08:01:04.0278045 +0000 UTC m=+894.720608508" observedRunningTime="2025-10-09 08:01:04.628823948 +0000 UTC m=+895.321627976" watchObservedRunningTime="2025-10-09 08:01:04.629046925 +0000 UTC m=+895.321850963"
Oct 09 08:01:07 crc kubenswrapper[4715]: I1009 08:01:07.146501 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-59578bc799-6zczp"
Oct 09 08:01:07 crc kubenswrapper[4715]: I1009 08:01:07.312392 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-xmq4r"
Oct 09 08:01:07 crc kubenswrapper[4715]: I1009 08:01:07.406073 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-664664cb68-xwt7q"
Oct 09 08:01:07 crc kubenswrapper[4715]: I1009 08:01:07.435246 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-rcm5h"
Oct 09 08:01:07 crc kubenswrapper[4715]: I1009 08:01:07.484697 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-rt6nt"
Oct 09 08:01:08 crc kubenswrapper[4715]: I1009 08:01:08.844685 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757dl2tll"
Oct 09 08:01:17 crc kubenswrapper[4715]: I1009 08:01:17.261476 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-6xd27"
Oct 09 08:01:31 crc kubenswrapper[4715]: I1009 08:01:31.690331 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-f6sks"]
Oct 09 08:01:31 crc kubenswrapper[4715]: E1009 08:01:31.692308 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fefbe0c-de73-415c-a42f-77742a8afab2" containerName="extract-content"
Oct 09 08:01:31 crc kubenswrapper[4715]: I1009 08:01:31.692327 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fefbe0c-de73-415c-a42f-77742a8afab2" containerName="extract-content"
Oct 09 08:01:31 crc kubenswrapper[4715]: E1009 08:01:31.692351 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fefbe0c-de73-415c-a42f-77742a8afab2" containerName="extract-utilities"
Oct 09 08:01:31 crc kubenswrapper[4715]: I1009 08:01:31.692359 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fefbe0c-de73-415c-a42f-77742a8afab2" containerName="extract-utilities"
Oct 09 08:01:31 crc kubenswrapper[4715]: E1009 08:01:31.692383 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fefbe0c-de73-415c-a42f-77742a8afab2" containerName="registry-server"
Oct 09 08:01:31 crc kubenswrapper[4715]: I1009 08:01:31.692393 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fefbe0c-de73-415c-a42f-77742a8afab2" containerName="registry-server"
Oct 09 08:01:31 crc kubenswrapper[4715]: I1009 08:01:31.692612 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fefbe0c-de73-415c-a42f-77742a8afab2" containerName="registry-server"
Oct 09 08:01:31 crc kubenswrapper[4715]: I1009 08:01:31.693449 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-f6sks"
Oct 09 08:01:31 crc kubenswrapper[4715]: I1009 08:01:31.695633 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Oct 09 08:01:31 crc kubenswrapper[4715]: I1009 08:01:31.695889 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-2k8kn"
Oct 09 08:01:31 crc kubenswrapper[4715]: I1009 08:01:31.696006 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Oct 09 08:01:31 crc kubenswrapper[4715]: I1009 08:01:31.696178 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Oct 09 08:01:31 crc kubenswrapper[4715]: I1009 08:01:31.714086 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-f6sks"]
Oct 09 08:01:31 crc kubenswrapper[4715]: I1009 08:01:31.736634 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5dq2\" (UniqueName: \"kubernetes.io/projected/7d910773-a4a7-4e14-99ae-4da6281bff35-kube-api-access-d5dq2\") pod \"dnsmasq-dns-675f4bcbfc-f6sks\" (UID: \"7d910773-a4a7-4e14-99ae-4da6281bff35\") " pod="openstack/dnsmasq-dns-675f4bcbfc-f6sks"
Oct 09 08:01:31 crc kubenswrapper[4715]: I1009 08:01:31.736700 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d910773-a4a7-4e14-99ae-4da6281bff35-config\") pod \"dnsmasq-dns-675f4bcbfc-f6sks\" (UID: \"7d910773-a4a7-4e14-99ae-4da6281bff35\") " pod="openstack/dnsmasq-dns-675f4bcbfc-f6sks"
Oct 09 08:01:31 crc kubenswrapper[4715]: I1009 08:01:31.793043 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-dgflt"]
Oct 09 08:01:31 crc kubenswrapper[4715]: I1009 08:01:31.796799 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-dgflt"
Oct 09 08:01:31 crc kubenswrapper[4715]: I1009 08:01:31.805879 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Oct 09 08:01:31 crc kubenswrapper[4715]: I1009 08:01:31.807307 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-dgflt"]
Oct 09 08:01:31 crc kubenswrapper[4715]: I1009 08:01:31.842925 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ae4d644-5199-4e6e-82c9-b1d784ee6b3d-config\") pod \"dnsmasq-dns-78dd6ddcc-dgflt\" (UID: \"7ae4d644-5199-4e6e-82c9-b1d784ee6b3d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-dgflt"
Oct 09 08:01:31 crc kubenswrapper[4715]: I1009 08:01:31.842966 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cqzs\" (UniqueName: \"kubernetes.io/projected/7ae4d644-5199-4e6e-82c9-b1d784ee6b3d-kube-api-access-5cqzs\") pod \"dnsmasq-dns-78dd6ddcc-dgflt\" (UID: \"7ae4d644-5199-4e6e-82c9-b1d784ee6b3d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-dgflt"
Oct 09 08:01:31 crc kubenswrapper[4715]: I1009 08:01:31.842992 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5dq2\" (UniqueName: \"kubernetes.io/projected/7d910773-a4a7-4e14-99ae-4da6281bff35-kube-api-access-d5dq2\") pod \"dnsmasq-dns-675f4bcbfc-f6sks\" (UID: \"7d910773-a4a7-4e14-99ae-4da6281bff35\") " pod="openstack/dnsmasq-dns-675f4bcbfc-f6sks"
Oct 09 08:01:31 crc kubenswrapper[4715]: I1009 08:01:31.843020 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d910773-a4a7-4e14-99ae-4da6281bff35-config\") pod \"dnsmasq-dns-675f4bcbfc-f6sks\" (UID: \"7d910773-a4a7-4e14-99ae-4da6281bff35\") " pod="openstack/dnsmasq-dns-675f4bcbfc-f6sks"
Oct 09 08:01:31 crc kubenswrapper[4715]: I1009 08:01:31.843064 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ae4d644-5199-4e6e-82c9-b1d784ee6b3d-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-dgflt\" (UID: \"7ae4d644-5199-4e6e-82c9-b1d784ee6b3d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-dgflt"
Oct 09 08:01:31 crc kubenswrapper[4715]: I1009 08:01:31.844126 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d910773-a4a7-4e14-99ae-4da6281bff35-config\") pod \"dnsmasq-dns-675f4bcbfc-f6sks\" (UID: \"7d910773-a4a7-4e14-99ae-4da6281bff35\") " pod="openstack/dnsmasq-dns-675f4bcbfc-f6sks"
Oct 09 08:01:31 crc kubenswrapper[4715]: I1009 08:01:31.893411 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5dq2\" (UniqueName: \"kubernetes.io/projected/7d910773-a4a7-4e14-99ae-4da6281bff35-kube-api-access-d5dq2\") pod \"dnsmasq-dns-675f4bcbfc-f6sks\" (UID: \"7d910773-a4a7-4e14-99ae-4da6281bff35\") " pod="openstack/dnsmasq-dns-675f4bcbfc-f6sks"
Oct 09 08:01:31 crc kubenswrapper[4715]: I1009 08:01:31.944009 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ae4d644-5199-4e6e-82c9-b1d784ee6b3d-config\") pod \"dnsmasq-dns-78dd6ddcc-dgflt\" (UID: \"7ae4d644-5199-4e6e-82c9-b1d784ee6b3d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-dgflt"
Oct 09 08:01:31 crc kubenswrapper[4715]: I1009 08:01:31.944398 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cqzs\" (UniqueName: \"kubernetes.io/projected/7ae4d644-5199-4e6e-82c9-b1d784ee6b3d-kube-api-access-5cqzs\") pod \"dnsmasq-dns-78dd6ddcc-dgflt\" (UID: \"7ae4d644-5199-4e6e-82c9-b1d784ee6b3d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-dgflt"
Oct 09 08:01:31 crc kubenswrapper[4715]: I1009 08:01:31.944780 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ae4d644-5199-4e6e-82c9-b1d784ee6b3d-config\") pod \"dnsmasq-dns-78dd6ddcc-dgflt\" (UID: \"7ae4d644-5199-4e6e-82c9-b1d784ee6b3d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-dgflt"
Oct 09 08:01:31 crc kubenswrapper[4715]: I1009 08:01:31.946138 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ae4d644-5199-4e6e-82c9-b1d784ee6b3d-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-dgflt\" (UID: \"7ae4d644-5199-4e6e-82c9-b1d784ee6b3d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-dgflt"
Oct 09 08:01:31 crc kubenswrapper[4715]: I1009 08:01:31.947756 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ae4d644-5199-4e6e-82c9-b1d784ee6b3d-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-dgflt\" (UID: \"7ae4d644-5199-4e6e-82c9-b1d784ee6b3d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-dgflt"
Oct 09 08:01:31 crc kubenswrapper[4715]: I1009 08:01:31.969682 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cqzs\" (UniqueName: \"kubernetes.io/projected/7ae4d644-5199-4e6e-82c9-b1d784ee6b3d-kube-api-access-5cqzs\") pod \"dnsmasq-dns-78dd6ddcc-dgflt\" (UID: \"7ae4d644-5199-4e6e-82c9-b1d784ee6b3d\") " pod="openstack/dnsmasq-dns-78dd6ddcc-dgflt"
Oct 09 08:01:32 crc kubenswrapper[4715]: I1009 08:01:32.036818 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-f6sks"
Oct 09 08:01:32 crc kubenswrapper[4715]: I1009 08:01:32.127820 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-dgflt"
Oct 09 08:01:32 crc kubenswrapper[4715]: I1009 08:01:32.491653 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-f6sks"]
Oct 09 08:01:32 crc kubenswrapper[4715]: I1009 08:01:32.496819 4715 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 09 08:01:32 crc kubenswrapper[4715]: I1009 08:01:32.599454 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-dgflt"]
Oct 09 08:01:32 crc kubenswrapper[4715]: W1009 08:01:32.602298 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ae4d644_5199_4e6e_82c9_b1d784ee6b3d.slice/crio-bd4aa45cdea90b0407652f0376e5ab6ad831090e6bd16b28a406685edda31148 WatchSource:0}: Error finding container bd4aa45cdea90b0407652f0376e5ab6ad831090e6bd16b28a406685edda31148: Status 404 returned error can't find the container with id bd4aa45cdea90b0407652f0376e5ab6ad831090e6bd16b28a406685edda31148
Oct 09 08:01:32 crc kubenswrapper[4715]: I1009 08:01:32.859831 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-dgflt" event={"ID":"7ae4d644-5199-4e6e-82c9-b1d784ee6b3d","Type":"ContainerStarted","Data":"bd4aa45cdea90b0407652f0376e5ab6ad831090e6bd16b28a406685edda31148"}
Oct 09 08:01:32 crc kubenswrapper[4715]: I1009 08:01:32.861325 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-f6sks" event={"ID":"7d910773-a4a7-4e14-99ae-4da6281bff35","Type":"ContainerStarted","Data":"336181be3ea3962cb516fb033bc8546134eb69a0995e9ca9b13fed4aacf3cdf7"}
Oct 09 08:01:35 crc kubenswrapper[4715]: I1009 08:01:35.140835 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-f6sks"]
Oct 09 08:01:35 crc kubenswrapper[4715]: I1009 08:01:35.219689 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-x5868"]
Oct 09 08:01:35 crc kubenswrapper[4715]: I1009 08:01:35.229251 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-x5868"
Oct 09 08:01:35 crc kubenswrapper[4715]: I1009 08:01:35.249086 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-x5868"]
Oct 09 08:01:35 crc kubenswrapper[4715]: I1009 08:01:35.311456 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rwxv\" (UniqueName: \"kubernetes.io/projected/a71e26eb-6d9c-40d9-8960-a6d1ba76f08c-kube-api-access-6rwxv\") pod \"dnsmasq-dns-666b6646f7-x5868\" (UID: \"a71e26eb-6d9c-40d9-8960-a6d1ba76f08c\") " pod="openstack/dnsmasq-dns-666b6646f7-x5868"
Oct 09 08:01:35 crc kubenswrapper[4715]: I1009 08:01:35.311555 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a71e26eb-6d9c-40d9-8960-a6d1ba76f08c-dns-svc\") pod \"dnsmasq-dns-666b6646f7-x5868\" (UID: \"a71e26eb-6d9c-40d9-8960-a6d1ba76f08c\") " pod="openstack/dnsmasq-dns-666b6646f7-x5868"
Oct 09 08:01:35 crc kubenswrapper[4715]: I1009 08:01:35.311628 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a71e26eb-6d9c-40d9-8960-a6d1ba76f08c-config\") pod \"dnsmasq-dns-666b6646f7-x5868\" (UID: \"a71e26eb-6d9c-40d9-8960-a6d1ba76f08c\") " pod="openstack/dnsmasq-dns-666b6646f7-x5868"
Oct 09 08:01:35 crc kubenswrapper[4715]: I1009 08:01:35.413398 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rwxv\" (UniqueName: \"kubernetes.io/projected/a71e26eb-6d9c-40d9-8960-a6d1ba76f08c-kube-api-access-6rwxv\") pod \"dnsmasq-dns-666b6646f7-x5868\" (UID: \"a71e26eb-6d9c-40d9-8960-a6d1ba76f08c\") " pod="openstack/dnsmasq-dns-666b6646f7-x5868"
Oct 09 08:01:35 crc kubenswrapper[4715]: I1009 08:01:35.413513 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a71e26eb-6d9c-40d9-8960-a6d1ba76f08c-dns-svc\") pod \"dnsmasq-dns-666b6646f7-x5868\" (UID: \"a71e26eb-6d9c-40d9-8960-a6d1ba76f08c\") " pod="openstack/dnsmasq-dns-666b6646f7-x5868"
Oct 09 08:01:35 crc kubenswrapper[4715]: I1009 08:01:35.413599 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a71e26eb-6d9c-40d9-8960-a6d1ba76f08c-config\") pod \"dnsmasq-dns-666b6646f7-x5868\" (UID: \"a71e26eb-6d9c-40d9-8960-a6d1ba76f08c\") " pod="openstack/dnsmasq-dns-666b6646f7-x5868"
Oct 09 08:01:35 crc kubenswrapper[4715]: I1009 08:01:35.414741 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a71e26eb-6d9c-40d9-8960-a6d1ba76f08c-config\") pod \"dnsmasq-dns-666b6646f7-x5868\" (UID: \"a71e26eb-6d9c-40d9-8960-a6d1ba76f08c\") " pod="openstack/dnsmasq-dns-666b6646f7-x5868"
Oct 09 08:01:35 crc kubenswrapper[4715]: I1009 08:01:35.415557 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a71e26eb-6d9c-40d9-8960-a6d1ba76f08c-dns-svc\") pod \"dnsmasq-dns-666b6646f7-x5868\" (UID: \"a71e26eb-6d9c-40d9-8960-a6d1ba76f08c\") " pod="openstack/dnsmasq-dns-666b6646f7-x5868"
Oct 09 08:01:35 crc kubenswrapper[4715]: I1009 08:01:35.476121 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rwxv\" (UniqueName: \"kubernetes.io/projected/a71e26eb-6d9c-40d9-8960-a6d1ba76f08c-kube-api-access-6rwxv\") pod \"dnsmasq-dns-666b6646f7-x5868\" (UID: \"a71e26eb-6d9c-40d9-8960-a6d1ba76f08c\") " pod="openstack/dnsmasq-dns-666b6646f7-x5868"
Oct 09 08:01:35 crc kubenswrapper[4715]: I1009 08:01:35.493573 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-dgflt"]
Oct 09 08:01:35 crc kubenswrapper[4715]: I1009 08:01:35.515759 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rcs9s"]
Oct 09 08:01:35 crc kubenswrapper[4715]: I1009 08:01:35.520250 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-rcs9s"
Oct 09 08:01:35 crc kubenswrapper[4715]: I1009 08:01:35.540637 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rcs9s"]
Oct 09 08:01:35 crc kubenswrapper[4715]: I1009 08:01:35.579217 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-x5868"
Oct 09 08:01:35 crc kubenswrapper[4715]: I1009 08:01:35.617060 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msvll\" (UniqueName: \"kubernetes.io/projected/5c8600a7-d97c-4baa-9aec-1e3762af0e69-kube-api-access-msvll\") pod \"dnsmasq-dns-57d769cc4f-rcs9s\" (UID: \"5c8600a7-d97c-4baa-9aec-1e3762af0e69\") " pod="openstack/dnsmasq-dns-57d769cc4f-rcs9s"
Oct 09 08:01:35 crc kubenswrapper[4715]: I1009 08:01:35.617475 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c8600a7-d97c-4baa-9aec-1e3762af0e69-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-rcs9s\" (UID: \"5c8600a7-d97c-4baa-9aec-1e3762af0e69\") " pod="openstack/dnsmasq-dns-57d769cc4f-rcs9s"
Oct 09 08:01:35 crc kubenswrapper[4715]: I1009 08:01:35.617524 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c8600a7-d97c-4baa-9aec-1e3762af0e69-config\") pod \"dnsmasq-dns-57d769cc4f-rcs9s\" (UID: \"5c8600a7-d97c-4baa-9aec-1e3762af0e69\") " pod="openstack/dnsmasq-dns-57d769cc4f-rcs9s"
Oct 09 08:01:35 crc kubenswrapper[4715]: I1009 08:01:35.718457 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msvll\" (UniqueName: \"kubernetes.io/projected/5c8600a7-d97c-4baa-9aec-1e3762af0e69-kube-api-access-msvll\") pod \"dnsmasq-dns-57d769cc4f-rcs9s\" (UID: \"5c8600a7-d97c-4baa-9aec-1e3762af0e69\") " pod="openstack/dnsmasq-dns-57d769cc4f-rcs9s"
Oct 09 08:01:35 crc kubenswrapper[4715]: I1009 08:01:35.718828 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c8600a7-d97c-4baa-9aec-1e3762af0e69-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-rcs9s\" (UID: \"5c8600a7-d97c-4baa-9aec-1e3762af0e69\") " pod="openstack/dnsmasq-dns-57d769cc4f-rcs9s"
Oct 09 08:01:35 crc kubenswrapper[4715]: I1009 08:01:35.718887 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c8600a7-d97c-4baa-9aec-1e3762af0e69-config\") pod \"dnsmasq-dns-57d769cc4f-rcs9s\" (UID: \"5c8600a7-d97c-4baa-9aec-1e3762af0e69\") " pod="openstack/dnsmasq-dns-57d769cc4f-rcs9s"
Oct 09 08:01:35 crc kubenswrapper[4715]: I1009 08:01:35.720072 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c8600a7-d97c-4baa-9aec-1e3762af0e69-config\") pod \"dnsmasq-dns-57d769cc4f-rcs9s\" (UID: \"5c8600a7-d97c-4baa-9aec-1e3762af0e69\") " pod="openstack/dnsmasq-dns-57d769cc4f-rcs9s"
Oct 09 08:01:35 crc kubenswrapper[4715]: I1009 08:01:35.720622 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c8600a7-d97c-4baa-9aec-1e3762af0e69-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-rcs9s\" (UID: \"5c8600a7-d97c-4baa-9aec-1e3762af0e69\") " pod="openstack/dnsmasq-dns-57d769cc4f-rcs9s"
Oct 09 08:01:35 crc kubenswrapper[4715]: I1009 08:01:35.761337 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msvll\" (UniqueName: \"kubernetes.io/projected/5c8600a7-d97c-4baa-9aec-1e3762af0e69-kube-api-access-msvll\") pod \"dnsmasq-dns-57d769cc4f-rcs9s\" (UID: \"5c8600a7-d97c-4baa-9aec-1e3762af0e69\") " pod="openstack/dnsmasq-dns-57d769cc4f-rcs9s"
Oct 09 08:01:35 crc kubenswrapper[4715]: I1009 08:01:35.844733 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-rcs9s"
Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.069954 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-x5868"]
Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.197192 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rcs9s"]
Oct 09 08:01:36 crc kubenswrapper[4715]: W1009 08:01:36.226531 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c8600a7_d97c_4baa_9aec_1e3762af0e69.slice/crio-99a7b0c5116c3d6b1e2a94dd43561db276080844baddfe3175323fafdf04dc7f WatchSource:0}: Error finding container 99a7b0c5116c3d6b1e2a94dd43561db276080844baddfe3175323fafdf04dc7f: Status 404 returned error can't find the container with id 99a7b0c5116c3d6b1e2a94dd43561db276080844baddfe3175323fafdf04dc7f
Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.356521 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.357944 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.360020 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.360068 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.361392 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.361761 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.361927 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.362105 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-pghpr"
Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.364273 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.383237 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.436636 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1673772c-a772-4ad8-85c3-f68268965d4b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") " pod="openstack/rabbitmq-server-0"
Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.436729 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/1673772c-a772-4ad8-85c3-f68268965d4b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") " pod="openstack/rabbitmq-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.436763 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1673772c-a772-4ad8-85c3-f68268965d4b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") " pod="openstack/rabbitmq-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.436789 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1673772c-a772-4ad8-85c3-f68268965d4b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") " pod="openstack/rabbitmq-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.436840 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9dzr\" (UniqueName: \"kubernetes.io/projected/1673772c-a772-4ad8-85c3-f68268965d4b-kube-api-access-w9dzr\") pod \"rabbitmq-server-0\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") " pod="openstack/rabbitmq-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.436862 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1673772c-a772-4ad8-85c3-f68268965d4b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") " pod="openstack/rabbitmq-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.436895 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") " pod="openstack/rabbitmq-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.436949 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1673772c-a772-4ad8-85c3-f68268965d4b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") " pod="openstack/rabbitmq-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.436976 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1673772c-a772-4ad8-85c3-f68268965d4b-config-data\") pod \"rabbitmq-server-0\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") " pod="openstack/rabbitmq-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.437011 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1673772c-a772-4ad8-85c3-f68268965d4b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") " pod="openstack/rabbitmq-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.437040 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1673772c-a772-4ad8-85c3-f68268965d4b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") " pod="openstack/rabbitmq-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.538619 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1673772c-a772-4ad8-85c3-f68268965d4b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: 
\"1673772c-a772-4ad8-85c3-f68268965d4b\") " pod="openstack/rabbitmq-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.538776 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9dzr\" (UniqueName: \"kubernetes.io/projected/1673772c-a772-4ad8-85c3-f68268965d4b-kube-api-access-w9dzr\") pod \"rabbitmq-server-0\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") " pod="openstack/rabbitmq-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.538850 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") " pod="openstack/rabbitmq-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.538870 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1673772c-a772-4ad8-85c3-f68268965d4b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") " pod="openstack/rabbitmq-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.538899 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1673772c-a772-4ad8-85c3-f68268965d4b-config-data\") pod \"rabbitmq-server-0\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") " pod="openstack/rabbitmq-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.538955 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1673772c-a772-4ad8-85c3-f68268965d4b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") " pod="openstack/rabbitmq-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.538962 4715 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1673772c-a772-4ad8-85c3-f68268965d4b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") " pod="openstack/rabbitmq-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.538988 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1673772c-a772-4ad8-85c3-f68268965d4b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") " pod="openstack/rabbitmq-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.539011 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1673772c-a772-4ad8-85c3-f68268965d4b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") " pod="openstack/rabbitmq-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.539096 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1673772c-a772-4ad8-85c3-f68268965d4b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") " pod="openstack/rabbitmq-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.539132 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1673772c-a772-4ad8-85c3-f68268965d4b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") " pod="openstack/rabbitmq-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.539156 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/1673772c-a772-4ad8-85c3-f68268965d4b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") " pod="openstack/rabbitmq-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.540562 4715 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.545392 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1673772c-a772-4ad8-85c3-f68268965d4b-config-data\") pod \"rabbitmq-server-0\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") " pod="openstack/rabbitmq-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.545585 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1673772c-a772-4ad8-85c3-f68268965d4b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") " pod="openstack/rabbitmq-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.546011 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1673772c-a772-4ad8-85c3-f68268965d4b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") " pod="openstack/rabbitmq-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.552380 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1673772c-a772-4ad8-85c3-f68268965d4b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") " pod="openstack/rabbitmq-server-0" Oct 09 08:01:36 
crc kubenswrapper[4715]: I1009 08:01:36.552789 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1673772c-a772-4ad8-85c3-f68268965d4b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") " pod="openstack/rabbitmq-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.564062 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1673772c-a772-4ad8-85c3-f68268965d4b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") " pod="openstack/rabbitmq-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.565577 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1673772c-a772-4ad8-85c3-f68268965d4b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") " pod="openstack/rabbitmq-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.566029 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1673772c-a772-4ad8-85c3-f68268965d4b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") " pod="openstack/rabbitmq-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.566271 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") " pod="openstack/rabbitmq-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.568797 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9dzr\" (UniqueName: 
\"kubernetes.io/projected/1673772c-a772-4ad8-85c3-f68268965d4b-kube-api-access-w9dzr\") pod \"rabbitmq-server-0\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") " pod="openstack/rabbitmq-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.661526 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.669258 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.674090 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.677835 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.678105 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.678287 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.679941 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.680284 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-9x8lg" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.681164 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.681336 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.685303 4715 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.741364 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a4714af0-14ef-4513-ac5e-dbf4aa99079b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.741474 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a4714af0-14ef-4513-ac5e-dbf4aa99079b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.741517 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a4714af0-14ef-4513-ac5e-dbf4aa99079b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.741537 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a4714af0-14ef-4513-ac5e-dbf4aa99079b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.741926 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a4714af0-14ef-4513-ac5e-dbf4aa99079b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.742001 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a4714af0-14ef-4513-ac5e-dbf4aa99079b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.742040 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a4714af0-14ef-4513-ac5e-dbf4aa99079b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.742059 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a4714af0-14ef-4513-ac5e-dbf4aa99079b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.742081 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4714af0-14ef-4513-ac5e-dbf4aa99079b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.742117 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcmtl\" (UniqueName: \"kubernetes.io/projected/a4714af0-14ef-4513-ac5e-dbf4aa99079b-kube-api-access-dcmtl\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.742149 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.845676 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a4714af0-14ef-4513-ac5e-dbf4aa99079b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.845761 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a4714af0-14ef-4513-ac5e-dbf4aa99079b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.845780 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a4714af0-14ef-4513-ac5e-dbf4aa99079b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.845862 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a4714af0-14ef-4513-ac5e-dbf4aa99079b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 
08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.845919 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a4714af0-14ef-4513-ac5e-dbf4aa99079b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.845948 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a4714af0-14ef-4513-ac5e-dbf4aa99079b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.845988 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a4714af0-14ef-4513-ac5e-dbf4aa99079b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.846010 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4714af0-14ef-4513-ac5e-dbf4aa99079b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.846057 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcmtl\" (UniqueName: \"kubernetes.io/projected/a4714af0-14ef-4513-ac5e-dbf4aa99079b-kube-api-access-dcmtl\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.846090 4715 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.846193 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a4714af0-14ef-4513-ac5e-dbf4aa99079b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.846922 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a4714af0-14ef-4513-ac5e-dbf4aa99079b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.847839 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a4714af0-14ef-4513-ac5e-dbf4aa99079b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.847977 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4714af0-14ef-4513-ac5e-dbf4aa99079b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.848485 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/a4714af0-14ef-4513-ac5e-dbf4aa99079b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.848704 4715 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.850127 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a4714af0-14ef-4513-ac5e-dbf4aa99079b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.855307 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a4714af0-14ef-4513-ac5e-dbf4aa99079b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.856833 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a4714af0-14ef-4513-ac5e-dbf4aa99079b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.858925 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a4714af0-14ef-4513-ac5e-dbf4aa99079b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.863300 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a4714af0-14ef-4513-ac5e-dbf4aa99079b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.876815 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcmtl\" (UniqueName: \"kubernetes.io/projected/a4714af0-14ef-4513-ac5e-dbf4aa99079b-kube-api-access-dcmtl\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.900251 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-x5868" event={"ID":"a71e26eb-6d9c-40d9-8960-a6d1ba76f08c","Type":"ContainerStarted","Data":"9d7ee5b11abe9118b258a79d1838d54da3ae95534c42d9fe75ca62ea636909bd"} Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.901470 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:01:36 crc kubenswrapper[4715]: I1009 08:01:36.903448 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-rcs9s" event={"ID":"5c8600a7-d97c-4baa-9aec-1e3762af0e69","Type":"ContainerStarted","Data":"99a7b0c5116c3d6b1e2a94dd43561db276080844baddfe3175323fafdf04dc7f"} Oct 09 08:01:37 crc kubenswrapper[4715]: I1009 08:01:37.006614 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:01:38 crc kubenswrapper[4715]: I1009 08:01:38.108127 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 09 08:01:38 crc kubenswrapper[4715]: I1009 08:01:38.109776 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 09 08:01:38 crc kubenswrapper[4715]: I1009 08:01:38.114695 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-649hb" Oct 09 08:01:38 crc kubenswrapper[4715]: I1009 08:01:38.114910 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 09 08:01:38 crc kubenswrapper[4715]: I1009 08:01:38.129508 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 09 08:01:38 crc kubenswrapper[4715]: I1009 08:01:38.131276 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 09 08:01:38 crc kubenswrapper[4715]: I1009 08:01:38.134158 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 09 08:01:38 crc kubenswrapper[4715]: I1009 08:01:38.138945 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 09 08:01:38 crc kubenswrapper[4715]: I1009 08:01:38.180579 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/895dedfd-5a74-43a4-81d1-6365aa67ed6a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"895dedfd-5a74-43a4-81d1-6365aa67ed6a\") " pod="openstack/openstack-galera-0" Oct 09 08:01:38 crc kubenswrapper[4715]: I1009 08:01:38.180641 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/895dedfd-5a74-43a4-81d1-6365aa67ed6a-kolla-config\") pod \"openstack-galera-0\" (UID: \"895dedfd-5a74-43a4-81d1-6365aa67ed6a\") " pod="openstack/openstack-galera-0" Oct 09 08:01:38 crc kubenswrapper[4715]: I1009 08:01:38.180667 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/895dedfd-5a74-43a4-81d1-6365aa67ed6a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"895dedfd-5a74-43a4-81d1-6365aa67ed6a\") " pod="openstack/openstack-galera-0" Oct 09 08:01:38 crc kubenswrapper[4715]: I1009 08:01:38.180724 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"895dedfd-5a74-43a4-81d1-6365aa67ed6a\") " pod="openstack/openstack-galera-0" Oct 09 08:01:38 crc kubenswrapper[4715]: I1009 08:01:38.180769 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/895dedfd-5a74-43a4-81d1-6365aa67ed6a-config-data-default\") pod \"openstack-galera-0\" (UID: \"895dedfd-5a74-43a4-81d1-6365aa67ed6a\") " pod="openstack/openstack-galera-0" Oct 09 08:01:38 crc kubenswrapper[4715]: I1009 08:01:38.180793 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/895dedfd-5a74-43a4-81d1-6365aa67ed6a-secrets\") pod \"openstack-galera-0\" (UID: \"895dedfd-5a74-43a4-81d1-6365aa67ed6a\") " pod="openstack/openstack-galera-0" Oct 09 08:01:38 crc kubenswrapper[4715]: I1009 08:01:38.180818 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/895dedfd-5a74-43a4-81d1-6365aa67ed6a-operator-scripts\") 
pod \"openstack-galera-0\" (UID: \"895dedfd-5a74-43a4-81d1-6365aa67ed6a\") " pod="openstack/openstack-galera-0" Oct 09 08:01:38 crc kubenswrapper[4715]: I1009 08:01:38.180875 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/895dedfd-5a74-43a4-81d1-6365aa67ed6a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"895dedfd-5a74-43a4-81d1-6365aa67ed6a\") " pod="openstack/openstack-galera-0" Oct 09 08:01:38 crc kubenswrapper[4715]: I1009 08:01:38.181076 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxkfg\" (UniqueName: \"kubernetes.io/projected/895dedfd-5a74-43a4-81d1-6365aa67ed6a-kube-api-access-lxkfg\") pod \"openstack-galera-0\" (UID: \"895dedfd-5a74-43a4-81d1-6365aa67ed6a\") " pod="openstack/openstack-galera-0" Oct 09 08:01:38 crc kubenswrapper[4715]: I1009 08:01:38.183016 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 09 08:01:38 crc kubenswrapper[4715]: I1009 08:01:38.282488 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/895dedfd-5a74-43a4-81d1-6365aa67ed6a-config-data-default\") pod \"openstack-galera-0\" (UID: \"895dedfd-5a74-43a4-81d1-6365aa67ed6a\") " pod="openstack/openstack-galera-0" Oct 09 08:01:38 crc kubenswrapper[4715]: I1009 08:01:38.282542 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/895dedfd-5a74-43a4-81d1-6365aa67ed6a-secrets\") pod \"openstack-galera-0\" (UID: \"895dedfd-5a74-43a4-81d1-6365aa67ed6a\") " pod="openstack/openstack-galera-0" Oct 09 08:01:38 crc kubenswrapper[4715]: I1009 08:01:38.282565 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/895dedfd-5a74-43a4-81d1-6365aa67ed6a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"895dedfd-5a74-43a4-81d1-6365aa67ed6a\") " pod="openstack/openstack-galera-0" Oct 09 08:01:38 crc kubenswrapper[4715]: I1009 08:01:38.282606 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/895dedfd-5a74-43a4-81d1-6365aa67ed6a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"895dedfd-5a74-43a4-81d1-6365aa67ed6a\") " pod="openstack/openstack-galera-0" Oct 09 08:01:38 crc kubenswrapper[4715]: I1009 08:01:38.282667 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxkfg\" (UniqueName: \"kubernetes.io/projected/895dedfd-5a74-43a4-81d1-6365aa67ed6a-kube-api-access-lxkfg\") pod \"openstack-galera-0\" (UID: \"895dedfd-5a74-43a4-81d1-6365aa67ed6a\") " pod="openstack/openstack-galera-0" Oct 09 08:01:38 crc kubenswrapper[4715]: I1009 08:01:38.282721 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/895dedfd-5a74-43a4-81d1-6365aa67ed6a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"895dedfd-5a74-43a4-81d1-6365aa67ed6a\") " pod="openstack/openstack-galera-0" Oct 09 08:01:38 crc kubenswrapper[4715]: I1009 08:01:38.282741 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/895dedfd-5a74-43a4-81d1-6365aa67ed6a-kolla-config\") pod \"openstack-galera-0\" (UID: \"895dedfd-5a74-43a4-81d1-6365aa67ed6a\") " pod="openstack/openstack-galera-0" Oct 09 08:01:38 crc kubenswrapper[4715]: I1009 08:01:38.282757 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/895dedfd-5a74-43a4-81d1-6365aa67ed6a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: 
\"895dedfd-5a74-43a4-81d1-6365aa67ed6a\") " pod="openstack/openstack-galera-0" Oct 09 08:01:38 crc kubenswrapper[4715]: I1009 08:01:38.282792 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"895dedfd-5a74-43a4-81d1-6365aa67ed6a\") " pod="openstack/openstack-galera-0" Oct 09 08:01:38 crc kubenswrapper[4715]: I1009 08:01:38.283010 4715 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"895dedfd-5a74-43a4-81d1-6365aa67ed6a\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-galera-0" Oct 09 08:01:38 crc kubenswrapper[4715]: I1009 08:01:38.283954 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/895dedfd-5a74-43a4-81d1-6365aa67ed6a-kolla-config\") pod \"openstack-galera-0\" (UID: \"895dedfd-5a74-43a4-81d1-6365aa67ed6a\") " pod="openstack/openstack-galera-0" Oct 09 08:01:38 crc kubenswrapper[4715]: I1009 08:01:38.284325 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/895dedfd-5a74-43a4-81d1-6365aa67ed6a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"895dedfd-5a74-43a4-81d1-6365aa67ed6a\") " pod="openstack/openstack-galera-0" Oct 09 08:01:38 crc kubenswrapper[4715]: I1009 08:01:38.284817 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/895dedfd-5a74-43a4-81d1-6365aa67ed6a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"895dedfd-5a74-43a4-81d1-6365aa67ed6a\") " pod="openstack/openstack-galera-0" Oct 09 08:01:38 crc kubenswrapper[4715]: I1009 08:01:38.285664 4715 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/895dedfd-5a74-43a4-81d1-6365aa67ed6a-config-data-default\") pod \"openstack-galera-0\" (UID: \"895dedfd-5a74-43a4-81d1-6365aa67ed6a\") " pod="openstack/openstack-galera-0" Oct 09 08:01:38 crc kubenswrapper[4715]: I1009 08:01:38.287891 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/895dedfd-5a74-43a4-81d1-6365aa67ed6a-secrets\") pod \"openstack-galera-0\" (UID: \"895dedfd-5a74-43a4-81d1-6365aa67ed6a\") " pod="openstack/openstack-galera-0" Oct 09 08:01:38 crc kubenswrapper[4715]: I1009 08:01:38.289889 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/895dedfd-5a74-43a4-81d1-6365aa67ed6a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"895dedfd-5a74-43a4-81d1-6365aa67ed6a\") " pod="openstack/openstack-galera-0" Oct 09 08:01:38 crc kubenswrapper[4715]: I1009 08:01:38.292326 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/895dedfd-5a74-43a4-81d1-6365aa67ed6a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"895dedfd-5a74-43a4-81d1-6365aa67ed6a\") " pod="openstack/openstack-galera-0" Oct 09 08:01:38 crc kubenswrapper[4715]: I1009 08:01:38.312056 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxkfg\" (UniqueName: \"kubernetes.io/projected/895dedfd-5a74-43a4-81d1-6365aa67ed6a-kube-api-access-lxkfg\") pod \"openstack-galera-0\" (UID: \"895dedfd-5a74-43a4-81d1-6365aa67ed6a\") " pod="openstack/openstack-galera-0" Oct 09 08:01:38 crc kubenswrapper[4715]: I1009 08:01:38.327867 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod 
\"openstack-galera-0\" (UID: \"895dedfd-5a74-43a4-81d1-6365aa67ed6a\") " pod="openstack/openstack-galera-0" Oct 09 08:01:38 crc kubenswrapper[4715]: I1009 08:01:38.447258 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.176274 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.177741 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.183935 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.184225 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.184312 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-z9vxd" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.184804 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.197488 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.302125 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/72f969cd-b504-4db1-832a-1e0c7f0a3b7b-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"72f969cd-b504-4db1-832a-1e0c7f0a3b7b\") " pod="openstack/openstack-cell1-galera-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.302188 4715 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/72f969cd-b504-4db1-832a-1e0c7f0a3b7b-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"72f969cd-b504-4db1-832a-1e0c7f0a3b7b\") " pod="openstack/openstack-cell1-galera-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.302240 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/72f969cd-b504-4db1-832a-1e0c7f0a3b7b-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"72f969cd-b504-4db1-832a-1e0c7f0a3b7b\") " pod="openstack/openstack-cell1-galera-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.302267 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"72f969cd-b504-4db1-832a-1e0c7f0a3b7b\") " pod="openstack/openstack-cell1-galera-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.302296 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72f969cd-b504-4db1-832a-1e0c7f0a3b7b-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"72f969cd-b504-4db1-832a-1e0c7f0a3b7b\") " pod="openstack/openstack-cell1-galera-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.302323 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/72f969cd-b504-4db1-832a-1e0c7f0a3b7b-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"72f969cd-b504-4db1-832a-1e0c7f0a3b7b\") " pod="openstack/openstack-cell1-galera-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.302547 4715 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72f969cd-b504-4db1-832a-1e0c7f0a3b7b-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"72f969cd-b504-4db1-832a-1e0c7f0a3b7b\") " pod="openstack/openstack-cell1-galera-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.302571 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwrx7\" (UniqueName: \"kubernetes.io/projected/72f969cd-b504-4db1-832a-1e0c7f0a3b7b-kube-api-access-lwrx7\") pod \"openstack-cell1-galera-0\" (UID: \"72f969cd-b504-4db1-832a-1e0c7f0a3b7b\") " pod="openstack/openstack-cell1-galera-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.302597 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/72f969cd-b504-4db1-832a-1e0c7f0a3b7b-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"72f969cd-b504-4db1-832a-1e0c7f0a3b7b\") " pod="openstack/openstack-cell1-galera-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.403986 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72f969cd-b504-4db1-832a-1e0c7f0a3b7b-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"72f969cd-b504-4db1-832a-1e0c7f0a3b7b\") " pod="openstack/openstack-cell1-galera-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.404058 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwrx7\" (UniqueName: \"kubernetes.io/projected/72f969cd-b504-4db1-832a-1e0c7f0a3b7b-kube-api-access-lwrx7\") pod \"openstack-cell1-galera-0\" (UID: \"72f969cd-b504-4db1-832a-1e0c7f0a3b7b\") " pod="openstack/openstack-cell1-galera-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.404549 4715 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/72f969cd-b504-4db1-832a-1e0c7f0a3b7b-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"72f969cd-b504-4db1-832a-1e0c7f0a3b7b\") " pod="openstack/openstack-cell1-galera-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.404701 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/72f969cd-b504-4db1-832a-1e0c7f0a3b7b-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"72f969cd-b504-4db1-832a-1e0c7f0a3b7b\") " pod="openstack/openstack-cell1-galera-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.404751 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/72f969cd-b504-4db1-832a-1e0c7f0a3b7b-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"72f969cd-b504-4db1-832a-1e0c7f0a3b7b\") " pod="openstack/openstack-cell1-galera-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.404802 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/72f969cd-b504-4db1-832a-1e0c7f0a3b7b-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"72f969cd-b504-4db1-832a-1e0c7f0a3b7b\") " pod="openstack/openstack-cell1-galera-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.404830 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"72f969cd-b504-4db1-832a-1e0c7f0a3b7b\") " pod="openstack/openstack-cell1-galera-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.404867 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/72f969cd-b504-4db1-832a-1e0c7f0a3b7b-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"72f969cd-b504-4db1-832a-1e0c7f0a3b7b\") " pod="openstack/openstack-cell1-galera-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.404911 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/72f969cd-b504-4db1-832a-1e0c7f0a3b7b-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"72f969cd-b504-4db1-832a-1e0c7f0a3b7b\") " pod="openstack/openstack-cell1-galera-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.405497 4715 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"72f969cd-b504-4db1-832a-1e0c7f0a3b7b\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-cell1-galera-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.405570 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/72f969cd-b504-4db1-832a-1e0c7f0a3b7b-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"72f969cd-b504-4db1-832a-1e0c7f0a3b7b\") " pod="openstack/openstack-cell1-galera-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.405977 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/72f969cd-b504-4db1-832a-1e0c7f0a3b7b-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"72f969cd-b504-4db1-832a-1e0c7f0a3b7b\") " pod="openstack/openstack-cell1-galera-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.406658 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/72f969cd-b504-4db1-832a-1e0c7f0a3b7b-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"72f969cd-b504-4db1-832a-1e0c7f0a3b7b\") " pod="openstack/openstack-cell1-galera-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.407168 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/72f969cd-b504-4db1-832a-1e0c7f0a3b7b-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"72f969cd-b504-4db1-832a-1e0c7f0a3b7b\") " pod="openstack/openstack-cell1-galera-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.409452 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/72f969cd-b504-4db1-832a-1e0c7f0a3b7b-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"72f969cd-b504-4db1-832a-1e0c7f0a3b7b\") " pod="openstack/openstack-cell1-galera-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.410098 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72f969cd-b504-4db1-832a-1e0c7f0a3b7b-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"72f969cd-b504-4db1-832a-1e0c7f0a3b7b\") " pod="openstack/openstack-cell1-galera-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.415120 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/72f969cd-b504-4db1-832a-1e0c7f0a3b7b-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"72f969cd-b504-4db1-832a-1e0c7f0a3b7b\") " pod="openstack/openstack-cell1-galera-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.425874 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: 
\"72f969cd-b504-4db1-832a-1e0c7f0a3b7b\") " pod="openstack/openstack-cell1-galera-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.431405 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwrx7\" (UniqueName: \"kubernetes.io/projected/72f969cd-b504-4db1-832a-1e0c7f0a3b7b-kube-api-access-lwrx7\") pod \"openstack-cell1-galera-0\" (UID: \"72f969cd-b504-4db1-832a-1e0c7f0a3b7b\") " pod="openstack/openstack-cell1-galera-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.513724 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.716621 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.717973 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.729458 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.731808 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-fdthp" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.734251 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.750026 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.825677 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d9f0338-4450-49ca-ad02-67cdda5d323f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"5d9f0338-4450-49ca-ad02-67cdda5d323f\") " 
pod="openstack/memcached-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.825745 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5d9f0338-4450-49ca-ad02-67cdda5d323f-kolla-config\") pod \"memcached-0\" (UID: \"5d9f0338-4450-49ca-ad02-67cdda5d323f\") " pod="openstack/memcached-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.825804 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhrhb\" (UniqueName: \"kubernetes.io/projected/5d9f0338-4450-49ca-ad02-67cdda5d323f-kube-api-access-zhrhb\") pod \"memcached-0\" (UID: \"5d9f0338-4450-49ca-ad02-67cdda5d323f\") " pod="openstack/memcached-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.825867 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d9f0338-4450-49ca-ad02-67cdda5d323f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5d9f0338-4450-49ca-ad02-67cdda5d323f\") " pod="openstack/memcached-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.825894 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d9f0338-4450-49ca-ad02-67cdda5d323f-config-data\") pod \"memcached-0\" (UID: \"5d9f0338-4450-49ca-ad02-67cdda5d323f\") " pod="openstack/memcached-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.927652 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d9f0338-4450-49ca-ad02-67cdda5d323f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"5d9f0338-4450-49ca-ad02-67cdda5d323f\") " pod="openstack/memcached-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.927707 4715 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5d9f0338-4450-49ca-ad02-67cdda5d323f-kolla-config\") pod \"memcached-0\" (UID: \"5d9f0338-4450-49ca-ad02-67cdda5d323f\") " pod="openstack/memcached-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.927736 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhrhb\" (UniqueName: \"kubernetes.io/projected/5d9f0338-4450-49ca-ad02-67cdda5d323f-kube-api-access-zhrhb\") pod \"memcached-0\" (UID: \"5d9f0338-4450-49ca-ad02-67cdda5d323f\") " pod="openstack/memcached-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.927787 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d9f0338-4450-49ca-ad02-67cdda5d323f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5d9f0338-4450-49ca-ad02-67cdda5d323f\") " pod="openstack/memcached-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.927805 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d9f0338-4450-49ca-ad02-67cdda5d323f-config-data\") pod \"memcached-0\" (UID: \"5d9f0338-4450-49ca-ad02-67cdda5d323f\") " pod="openstack/memcached-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.928965 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5d9f0338-4450-49ca-ad02-67cdda5d323f-kolla-config\") pod \"memcached-0\" (UID: \"5d9f0338-4450-49ca-ad02-67cdda5d323f\") " pod="openstack/memcached-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.928982 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d9f0338-4450-49ca-ad02-67cdda5d323f-config-data\") pod \"memcached-0\" (UID: \"5d9f0338-4450-49ca-ad02-67cdda5d323f\") " 
pod="openstack/memcached-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.951288 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d9f0338-4450-49ca-ad02-67cdda5d323f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"5d9f0338-4450-49ca-ad02-67cdda5d323f\") " pod="openstack/memcached-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.952361 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d9f0338-4450-49ca-ad02-67cdda5d323f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5d9f0338-4450-49ca-ad02-67cdda5d323f\") " pod="openstack/memcached-0" Oct 09 08:01:39 crc kubenswrapper[4715]: I1009 08:01:39.985376 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhrhb\" (UniqueName: \"kubernetes.io/projected/5d9f0338-4450-49ca-ad02-67cdda5d323f-kube-api-access-zhrhb\") pod \"memcached-0\" (UID: \"5d9f0338-4450-49ca-ad02-67cdda5d323f\") " pod="openstack/memcached-0" Oct 09 08:01:40 crc kubenswrapper[4715]: I1009 08:01:40.054559 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 09 08:01:41 crc kubenswrapper[4715]: I1009 08:01:41.392118 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 08:01:41 crc kubenswrapper[4715]: I1009 08:01:41.393403 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 09 08:01:41 crc kubenswrapper[4715]: I1009 08:01:41.395377 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-tdlrx" Oct 09 08:01:41 crc kubenswrapper[4715]: I1009 08:01:41.399585 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 08:01:41 crc kubenswrapper[4715]: I1009 08:01:41.563263 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7f7t\" (UniqueName: \"kubernetes.io/projected/c264ae36-c2f5-46f7-ae4f-9d4464b2a57f-kube-api-access-r7f7t\") pod \"kube-state-metrics-0\" (UID: \"c264ae36-c2f5-46f7-ae4f-9d4464b2a57f\") " pod="openstack/kube-state-metrics-0" Oct 09 08:01:41 crc kubenswrapper[4715]: I1009 08:01:41.664850 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7f7t\" (UniqueName: \"kubernetes.io/projected/c264ae36-c2f5-46f7-ae4f-9d4464b2a57f-kube-api-access-r7f7t\") pod \"kube-state-metrics-0\" (UID: \"c264ae36-c2f5-46f7-ae4f-9d4464b2a57f\") " pod="openstack/kube-state-metrics-0" Oct 09 08:01:41 crc kubenswrapper[4715]: I1009 08:01:41.682046 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7f7t\" (UniqueName: \"kubernetes.io/projected/c264ae36-c2f5-46f7-ae4f-9d4464b2a57f-kube-api-access-r7f7t\") pod \"kube-state-metrics-0\" (UID: \"c264ae36-c2f5-46f7-ae4f-9d4464b2a57f\") " pod="openstack/kube-state-metrics-0" Oct 09 08:01:41 crc kubenswrapper[4715]: I1009 08:01:41.713945 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.415306 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-xfr2w"] Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.417089 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xfr2w" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.418977 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.419896 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-p76qk" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.420170 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.426086 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-xfr2w"] Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.465705 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a1f06ac-c4c8-4884-bb1a-360fbaf03adf-scripts\") pod \"ovn-controller-xfr2w\" (UID: \"2a1f06ac-c4c8-4884-bb1a-360fbaf03adf\") " pod="openstack/ovn-controller-xfr2w" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.465785 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2a1f06ac-c4c8-4884-bb1a-360fbaf03adf-var-run\") pod \"ovn-controller-xfr2w\" (UID: \"2a1f06ac-c4c8-4884-bb1a-360fbaf03adf\") " pod="openstack/ovn-controller-xfr2w" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.465813 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2a1f06ac-c4c8-4884-bb1a-360fbaf03adf-var-log-ovn\") pod \"ovn-controller-xfr2w\" (UID: \"2a1f06ac-c4c8-4884-bb1a-360fbaf03adf\") " pod="openstack/ovn-controller-xfr2w" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.465833 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1f06ac-c4c8-4884-bb1a-360fbaf03adf-combined-ca-bundle\") pod \"ovn-controller-xfr2w\" (UID: \"2a1f06ac-c4c8-4884-bb1a-360fbaf03adf\") " pod="openstack/ovn-controller-xfr2w" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.465868 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpwvv\" (UniqueName: \"kubernetes.io/projected/2a1f06ac-c4c8-4884-bb1a-360fbaf03adf-kube-api-access-kpwvv\") pod \"ovn-controller-xfr2w\" (UID: \"2a1f06ac-c4c8-4884-bb1a-360fbaf03adf\") " pod="openstack/ovn-controller-xfr2w" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.465944 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a1f06ac-c4c8-4884-bb1a-360fbaf03adf-ovn-controller-tls-certs\") pod \"ovn-controller-xfr2w\" (UID: \"2a1f06ac-c4c8-4884-bb1a-360fbaf03adf\") " pod="openstack/ovn-controller-xfr2w" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.465967 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2a1f06ac-c4c8-4884-bb1a-360fbaf03adf-var-run-ovn\") pod \"ovn-controller-xfr2w\" (UID: \"2a1f06ac-c4c8-4884-bb1a-360fbaf03adf\") " pod="openstack/ovn-controller-xfr2w" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.469488 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-gckmt"] Oct 09 08:01:45 crc 
kubenswrapper[4715]: I1009 08:01:45.471348 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-gckmt" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.495341 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-gckmt"] Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.568545 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpwvv\" (UniqueName: \"kubernetes.io/projected/2a1f06ac-c4c8-4884-bb1a-360fbaf03adf-kube-api-access-kpwvv\") pod \"ovn-controller-xfr2w\" (UID: \"2a1f06ac-c4c8-4884-bb1a-360fbaf03adf\") " pod="openstack/ovn-controller-xfr2w" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.568635 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/2db0914e-e011-4b76-a07d-57ce73faceaa-var-log\") pod \"ovn-controller-ovs-gckmt\" (UID: \"2db0914e-e011-4b76-a07d-57ce73faceaa\") " pod="openstack/ovn-controller-ovs-gckmt" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.568682 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2db0914e-e011-4b76-a07d-57ce73faceaa-scripts\") pod \"ovn-controller-ovs-gckmt\" (UID: \"2db0914e-e011-4b76-a07d-57ce73faceaa\") " pod="openstack/ovn-controller-ovs-gckmt" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.568782 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqc5c\" (UniqueName: \"kubernetes.io/projected/2db0914e-e011-4b76-a07d-57ce73faceaa-kube-api-access-mqc5c\") pod \"ovn-controller-ovs-gckmt\" (UID: \"2db0914e-e011-4b76-a07d-57ce73faceaa\") " pod="openstack/ovn-controller-ovs-gckmt" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.568822 4715 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a1f06ac-c4c8-4884-bb1a-360fbaf03adf-ovn-controller-tls-certs\") pod \"ovn-controller-xfr2w\" (UID: \"2a1f06ac-c4c8-4884-bb1a-360fbaf03adf\") " pod="openstack/ovn-controller-xfr2w" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.568862 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2a1f06ac-c4c8-4884-bb1a-360fbaf03adf-var-run-ovn\") pod \"ovn-controller-xfr2w\" (UID: \"2a1f06ac-c4c8-4884-bb1a-360fbaf03adf\") " pod="openstack/ovn-controller-xfr2w" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.568881 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a1f06ac-c4c8-4884-bb1a-360fbaf03adf-scripts\") pod \"ovn-controller-xfr2w\" (UID: \"2a1f06ac-c4c8-4884-bb1a-360fbaf03adf\") " pod="openstack/ovn-controller-xfr2w" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.568906 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/2db0914e-e011-4b76-a07d-57ce73faceaa-var-lib\") pod \"ovn-controller-ovs-gckmt\" (UID: \"2db0914e-e011-4b76-a07d-57ce73faceaa\") " pod="openstack/ovn-controller-ovs-gckmt" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.568933 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2db0914e-e011-4b76-a07d-57ce73faceaa-var-run\") pod \"ovn-controller-ovs-gckmt\" (UID: \"2db0914e-e011-4b76-a07d-57ce73faceaa\") " pod="openstack/ovn-controller-ovs-gckmt" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.568954 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/2a1f06ac-c4c8-4884-bb1a-360fbaf03adf-var-run\") pod \"ovn-controller-xfr2w\" (UID: \"2a1f06ac-c4c8-4884-bb1a-360fbaf03adf\") " pod="openstack/ovn-controller-xfr2w" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.568976 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2a1f06ac-c4c8-4884-bb1a-360fbaf03adf-var-log-ovn\") pod \"ovn-controller-xfr2w\" (UID: \"2a1f06ac-c4c8-4884-bb1a-360fbaf03adf\") " pod="openstack/ovn-controller-xfr2w" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.568993 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1f06ac-c4c8-4884-bb1a-360fbaf03adf-combined-ca-bundle\") pod \"ovn-controller-xfr2w\" (UID: \"2a1f06ac-c4c8-4884-bb1a-360fbaf03adf\") " pod="openstack/ovn-controller-xfr2w" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.569015 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/2db0914e-e011-4b76-a07d-57ce73faceaa-etc-ovs\") pod \"ovn-controller-ovs-gckmt\" (UID: \"2db0914e-e011-4b76-a07d-57ce73faceaa\") " pod="openstack/ovn-controller-ovs-gckmt" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.569393 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2a1f06ac-c4c8-4884-bb1a-360fbaf03adf-var-run-ovn\") pod \"ovn-controller-xfr2w\" (UID: \"2a1f06ac-c4c8-4884-bb1a-360fbaf03adf\") " pod="openstack/ovn-controller-xfr2w" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.569591 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2a1f06ac-c4c8-4884-bb1a-360fbaf03adf-var-log-ovn\") pod \"ovn-controller-xfr2w\" (UID: \"2a1f06ac-c4c8-4884-bb1a-360fbaf03adf\") 
" pod="openstack/ovn-controller-xfr2w" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.570769 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2a1f06ac-c4c8-4884-bb1a-360fbaf03adf-var-run\") pod \"ovn-controller-xfr2w\" (UID: \"2a1f06ac-c4c8-4884-bb1a-360fbaf03adf\") " pod="openstack/ovn-controller-xfr2w" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.571403 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a1f06ac-c4c8-4884-bb1a-360fbaf03adf-scripts\") pod \"ovn-controller-xfr2w\" (UID: \"2a1f06ac-c4c8-4884-bb1a-360fbaf03adf\") " pod="openstack/ovn-controller-xfr2w" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.575988 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1f06ac-c4c8-4884-bb1a-360fbaf03adf-combined-ca-bundle\") pod \"ovn-controller-xfr2w\" (UID: \"2a1f06ac-c4c8-4884-bb1a-360fbaf03adf\") " pod="openstack/ovn-controller-xfr2w" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.582405 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a1f06ac-c4c8-4884-bb1a-360fbaf03adf-ovn-controller-tls-certs\") pod \"ovn-controller-xfr2w\" (UID: \"2a1f06ac-c4c8-4884-bb1a-360fbaf03adf\") " pod="openstack/ovn-controller-xfr2w" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.584393 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpwvv\" (UniqueName: \"kubernetes.io/projected/2a1f06ac-c4c8-4884-bb1a-360fbaf03adf-kube-api-access-kpwvv\") pod \"ovn-controller-xfr2w\" (UID: \"2a1f06ac-c4c8-4884-bb1a-360fbaf03adf\") " pod="openstack/ovn-controller-xfr2w" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.673102 4715 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2db0914e-e011-4b76-a07d-57ce73faceaa-var-run\") pod \"ovn-controller-ovs-gckmt\" (UID: \"2db0914e-e011-4b76-a07d-57ce73faceaa\") " pod="openstack/ovn-controller-ovs-gckmt" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.673241 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2db0914e-e011-4b76-a07d-57ce73faceaa-var-run\") pod \"ovn-controller-ovs-gckmt\" (UID: \"2db0914e-e011-4b76-a07d-57ce73faceaa\") " pod="openstack/ovn-controller-ovs-gckmt" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.673250 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/2db0914e-e011-4b76-a07d-57ce73faceaa-etc-ovs\") pod \"ovn-controller-ovs-gckmt\" (UID: \"2db0914e-e011-4b76-a07d-57ce73faceaa\") " pod="openstack/ovn-controller-ovs-gckmt" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.673382 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/2db0914e-e011-4b76-a07d-57ce73faceaa-var-log\") pod \"ovn-controller-ovs-gckmt\" (UID: \"2db0914e-e011-4b76-a07d-57ce73faceaa\") " pod="openstack/ovn-controller-ovs-gckmt" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.673485 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2db0914e-e011-4b76-a07d-57ce73faceaa-scripts\") pod \"ovn-controller-ovs-gckmt\" (UID: \"2db0914e-e011-4b76-a07d-57ce73faceaa\") " pod="openstack/ovn-controller-ovs-gckmt" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.673546 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/2db0914e-e011-4b76-a07d-57ce73faceaa-etc-ovs\") pod \"ovn-controller-ovs-gckmt\" (UID: 
\"2db0914e-e011-4b76-a07d-57ce73faceaa\") " pod="openstack/ovn-controller-ovs-gckmt" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.673573 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqc5c\" (UniqueName: \"kubernetes.io/projected/2db0914e-e011-4b76-a07d-57ce73faceaa-kube-api-access-mqc5c\") pod \"ovn-controller-ovs-gckmt\" (UID: \"2db0914e-e011-4b76-a07d-57ce73faceaa\") " pod="openstack/ovn-controller-ovs-gckmt" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.673630 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/2db0914e-e011-4b76-a07d-57ce73faceaa-var-log\") pod \"ovn-controller-ovs-gckmt\" (UID: \"2db0914e-e011-4b76-a07d-57ce73faceaa\") " pod="openstack/ovn-controller-ovs-gckmt" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.673668 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/2db0914e-e011-4b76-a07d-57ce73faceaa-var-lib\") pod \"ovn-controller-ovs-gckmt\" (UID: \"2db0914e-e011-4b76-a07d-57ce73faceaa\") " pod="openstack/ovn-controller-ovs-gckmt" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.673841 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/2db0914e-e011-4b76-a07d-57ce73faceaa-var-lib\") pod \"ovn-controller-ovs-gckmt\" (UID: \"2db0914e-e011-4b76-a07d-57ce73faceaa\") " pod="openstack/ovn-controller-ovs-gckmt" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.675958 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2db0914e-e011-4b76-a07d-57ce73faceaa-scripts\") pod \"ovn-controller-ovs-gckmt\" (UID: \"2db0914e-e011-4b76-a07d-57ce73faceaa\") " pod="openstack/ovn-controller-ovs-gckmt" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.694249 4715 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqc5c\" (UniqueName: \"kubernetes.io/projected/2db0914e-e011-4b76-a07d-57ce73faceaa-kube-api-access-mqc5c\") pod \"ovn-controller-ovs-gckmt\" (UID: \"2db0914e-e011-4b76-a07d-57ce73faceaa\") " pod="openstack/ovn-controller-ovs-gckmt" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.745456 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xfr2w" Oct 09 08:01:45 crc kubenswrapper[4715]: I1009 08:01:45.792305 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-gckmt" Oct 09 08:01:47 crc kubenswrapper[4715]: I1009 08:01:47.851534 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 09 08:01:47 crc kubenswrapper[4715]: I1009 08:01:47.853087 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 09 08:01:47 crc kubenswrapper[4715]: I1009 08:01:47.859932 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-ltj5c" Oct 09 08:01:47 crc kubenswrapper[4715]: I1009 08:01:47.860212 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 09 08:01:47 crc kubenswrapper[4715]: I1009 08:01:47.866507 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 09 08:01:47 crc kubenswrapper[4715]: I1009 08:01:47.874592 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 09 08:01:47 crc kubenswrapper[4715]: I1009 08:01:47.875069 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 09 08:01:47 crc kubenswrapper[4715]: I1009 08:01:47.875212 4715 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openstack"/"ovndbcluster-nb-scripts" Oct 09 08:01:47 crc kubenswrapper[4715]: I1009 08:01:47.917984 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m2ql\" (UniqueName: \"kubernetes.io/projected/ec6c7ac4-3535-4d45-9be1-8f6b4de9670f-kube-api-access-7m2ql\") pod \"ovsdbserver-nb-0\" (UID: \"ec6c7ac4-3535-4d45-9be1-8f6b4de9670f\") " pod="openstack/ovsdbserver-nb-0" Oct 09 08:01:47 crc kubenswrapper[4715]: I1009 08:01:47.918063 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec6c7ac4-3535-4d45-9be1-8f6b4de9670f-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ec6c7ac4-3535-4d45-9be1-8f6b4de9670f\") " pod="openstack/ovsdbserver-nb-0" Oct 09 08:01:47 crc kubenswrapper[4715]: I1009 08:01:47.918198 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec6c7ac4-3535-4d45-9be1-8f6b4de9670f-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ec6c7ac4-3535-4d45-9be1-8f6b4de9670f\") " pod="openstack/ovsdbserver-nb-0" Oct 09 08:01:47 crc kubenswrapper[4715]: I1009 08:01:47.918321 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ec6c7ac4-3535-4d45-9be1-8f6b4de9670f-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ec6c7ac4-3535-4d45-9be1-8f6b4de9670f\") " pod="openstack/ovsdbserver-nb-0" Oct 09 08:01:47 crc kubenswrapper[4715]: I1009 08:01:47.918377 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"ec6c7ac4-3535-4d45-9be1-8f6b4de9670f\") " pod="openstack/ovsdbserver-nb-0" Oct 09 08:01:47 crc 
kubenswrapper[4715]: I1009 08:01:47.918480 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec6c7ac4-3535-4d45-9be1-8f6b4de9670f-config\") pod \"ovsdbserver-nb-0\" (UID: \"ec6c7ac4-3535-4d45-9be1-8f6b4de9670f\") " pod="openstack/ovsdbserver-nb-0" Oct 09 08:01:47 crc kubenswrapper[4715]: I1009 08:01:47.918536 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ec6c7ac4-3535-4d45-9be1-8f6b4de9670f-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ec6c7ac4-3535-4d45-9be1-8f6b4de9670f\") " pod="openstack/ovsdbserver-nb-0" Oct 09 08:01:47 crc kubenswrapper[4715]: I1009 08:01:47.918594 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec6c7ac4-3535-4d45-9be1-8f6b4de9670f-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ec6c7ac4-3535-4d45-9be1-8f6b4de9670f\") " pod="openstack/ovsdbserver-nb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.019830 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec6c7ac4-3535-4d45-9be1-8f6b4de9670f-config\") pod \"ovsdbserver-nb-0\" (UID: \"ec6c7ac4-3535-4d45-9be1-8f6b4de9670f\") " pod="openstack/ovsdbserver-nb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.019904 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ec6c7ac4-3535-4d45-9be1-8f6b4de9670f-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ec6c7ac4-3535-4d45-9be1-8f6b4de9670f\") " pod="openstack/ovsdbserver-nb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.019929 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/ec6c7ac4-3535-4d45-9be1-8f6b4de9670f-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ec6c7ac4-3535-4d45-9be1-8f6b4de9670f\") " pod="openstack/ovsdbserver-nb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.019961 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m2ql\" (UniqueName: \"kubernetes.io/projected/ec6c7ac4-3535-4d45-9be1-8f6b4de9670f-kube-api-access-7m2ql\") pod \"ovsdbserver-nb-0\" (UID: \"ec6c7ac4-3535-4d45-9be1-8f6b4de9670f\") " pod="openstack/ovsdbserver-nb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.019986 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec6c7ac4-3535-4d45-9be1-8f6b4de9670f-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ec6c7ac4-3535-4d45-9be1-8f6b4de9670f\") " pod="openstack/ovsdbserver-nb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.020006 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec6c7ac4-3535-4d45-9be1-8f6b4de9670f-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ec6c7ac4-3535-4d45-9be1-8f6b4de9670f\") " pod="openstack/ovsdbserver-nb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.020032 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ec6c7ac4-3535-4d45-9be1-8f6b4de9670f-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ec6c7ac4-3535-4d45-9be1-8f6b4de9670f\") " pod="openstack/ovsdbserver-nb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.020056 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: 
\"ec6c7ac4-3535-4d45-9be1-8f6b4de9670f\") " pod="openstack/ovsdbserver-nb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.020363 4715 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"ec6c7ac4-3535-4d45-9be1-8f6b4de9670f\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/ovsdbserver-nb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.020568 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec6c7ac4-3535-4d45-9be1-8f6b4de9670f-config\") pod \"ovsdbserver-nb-0\" (UID: \"ec6c7ac4-3535-4d45-9be1-8f6b4de9670f\") " pod="openstack/ovsdbserver-nb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.020729 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ec6c7ac4-3535-4d45-9be1-8f6b4de9670f-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ec6c7ac4-3535-4d45-9be1-8f6b4de9670f\") " pod="openstack/ovsdbserver-nb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.022183 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ec6c7ac4-3535-4d45-9be1-8f6b4de9670f-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ec6c7ac4-3535-4d45-9be1-8f6b4de9670f\") " pod="openstack/ovsdbserver-nb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.028085 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec6c7ac4-3535-4d45-9be1-8f6b4de9670f-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ec6c7ac4-3535-4d45-9be1-8f6b4de9670f\") " pod="openstack/ovsdbserver-nb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.031555 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec6c7ac4-3535-4d45-9be1-8f6b4de9670f-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ec6c7ac4-3535-4d45-9be1-8f6b4de9670f\") " pod="openstack/ovsdbserver-nb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.035275 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec6c7ac4-3535-4d45-9be1-8f6b4de9670f-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ec6c7ac4-3535-4d45-9be1-8f6b4de9670f\") " pod="openstack/ovsdbserver-nb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.038910 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m2ql\" (UniqueName: \"kubernetes.io/projected/ec6c7ac4-3535-4d45-9be1-8f6b4de9670f-kube-api-access-7m2ql\") pod \"ovsdbserver-nb-0\" (UID: \"ec6c7ac4-3535-4d45-9be1-8f6b4de9670f\") " pod="openstack/ovsdbserver-nb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.050525 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"ec6c7ac4-3535-4d45-9be1-8f6b4de9670f\") " pod="openstack/ovsdbserver-nb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.081860 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.083579 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.093093 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.093142 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-28btj" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.093660 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.095003 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.095332 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.120868 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc229de4-d184-450e-805b-a8b616c8a60b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dc229de4-d184-450e-805b-a8b616c8a60b\") " pod="openstack/ovsdbserver-sb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.120919 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dc229de4-d184-450e-805b-a8b616c8a60b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"dc229de4-d184-450e-805b-a8b616c8a60b\") " pod="openstack/ovsdbserver-sb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.120944 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc229de4-d184-450e-805b-a8b616c8a60b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" 
(UID: \"dc229de4-d184-450e-805b-a8b616c8a60b\") " pod="openstack/ovsdbserver-sb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.120975 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc229de4-d184-450e-805b-a8b616c8a60b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"dc229de4-d184-450e-805b-a8b616c8a60b\") " pod="openstack/ovsdbserver-sb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.121006 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc229de4-d184-450e-805b-a8b616c8a60b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dc229de4-d184-450e-805b-a8b616c8a60b\") " pod="openstack/ovsdbserver-sb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.121030 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"dc229de4-d184-450e-805b-a8b616c8a60b\") " pod="openstack/ovsdbserver-sb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.121056 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wzwx\" (UniqueName: \"kubernetes.io/projected/dc229de4-d184-450e-805b-a8b616c8a60b-kube-api-access-4wzwx\") pod \"ovsdbserver-sb-0\" (UID: \"dc229de4-d184-450e-805b-a8b616c8a60b\") " pod="openstack/ovsdbserver-sb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.121084 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc229de4-d184-450e-805b-a8b616c8a60b-config\") pod \"ovsdbserver-sb-0\" (UID: \"dc229de4-d184-450e-805b-a8b616c8a60b\") " pod="openstack/ovsdbserver-sb-0" Oct 09 08:01:48 crc 
kubenswrapper[4715]: I1009 08:01:48.187227 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.222312 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dc229de4-d184-450e-805b-a8b616c8a60b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"dc229de4-d184-450e-805b-a8b616c8a60b\") " pod="openstack/ovsdbserver-sb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.222467 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc229de4-d184-450e-805b-a8b616c8a60b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"dc229de4-d184-450e-805b-a8b616c8a60b\") " pod="openstack/ovsdbserver-sb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.222513 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc229de4-d184-450e-805b-a8b616c8a60b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"dc229de4-d184-450e-805b-a8b616c8a60b\") " pod="openstack/ovsdbserver-sb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.222552 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc229de4-d184-450e-805b-a8b616c8a60b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dc229de4-d184-450e-805b-a8b616c8a60b\") " pod="openstack/ovsdbserver-sb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.222584 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"dc229de4-d184-450e-805b-a8b616c8a60b\") " pod="openstack/ovsdbserver-sb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: 
I1009 08:01:48.222611 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wzwx\" (UniqueName: \"kubernetes.io/projected/dc229de4-d184-450e-805b-a8b616c8a60b-kube-api-access-4wzwx\") pod \"ovsdbserver-sb-0\" (UID: \"dc229de4-d184-450e-805b-a8b616c8a60b\") " pod="openstack/ovsdbserver-sb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.222640 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc229de4-d184-450e-805b-a8b616c8a60b-config\") pod \"ovsdbserver-sb-0\" (UID: \"dc229de4-d184-450e-805b-a8b616c8a60b\") " pod="openstack/ovsdbserver-sb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.222660 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc229de4-d184-450e-805b-a8b616c8a60b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dc229de4-d184-450e-805b-a8b616c8a60b\") " pod="openstack/ovsdbserver-sb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.222899 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dc229de4-d184-450e-805b-a8b616c8a60b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"dc229de4-d184-450e-805b-a8b616c8a60b\") " pod="openstack/ovsdbserver-sb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.226577 4715 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"dc229de4-d184-450e-805b-a8b616c8a60b\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-sb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.227348 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/dc229de4-d184-450e-805b-a8b616c8a60b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"dc229de4-d184-450e-805b-a8b616c8a60b\") " pod="openstack/ovsdbserver-sb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.228458 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc229de4-d184-450e-805b-a8b616c8a60b-config\") pod \"ovsdbserver-sb-0\" (UID: \"dc229de4-d184-450e-805b-a8b616c8a60b\") " pod="openstack/ovsdbserver-sb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.229182 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc229de4-d184-450e-805b-a8b616c8a60b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dc229de4-d184-450e-805b-a8b616c8a60b\") " pod="openstack/ovsdbserver-sb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.229453 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc229de4-d184-450e-805b-a8b616c8a60b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dc229de4-d184-450e-805b-a8b616c8a60b\") " pod="openstack/ovsdbserver-sb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.229634 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc229de4-d184-450e-805b-a8b616c8a60b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"dc229de4-d184-450e-805b-a8b616c8a60b\") " pod="openstack/ovsdbserver-sb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.250299 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"dc229de4-d184-450e-805b-a8b616c8a60b\") " pod="openstack/ovsdbserver-sb-0" Oct 09 08:01:48 crc 
kubenswrapper[4715]: I1009 08:01:48.253435 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wzwx\" (UniqueName: \"kubernetes.io/projected/dc229de4-d184-450e-805b-a8b616c8a60b-kube-api-access-4wzwx\") pod \"ovsdbserver-sb-0\" (UID: \"dc229de4-d184-450e-805b-a8b616c8a60b\") " pod="openstack/ovsdbserver-sb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: E1009 08:01:48.410856 4715 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 09 08:01:48 crc kubenswrapper[4715]: E1009 08:01:48.411075 4715 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d5dq2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-f6sks_openstack(7d910773-a4a7-4e14-99ae-4da6281bff35): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 09 08:01:48 crc kubenswrapper[4715]: E1009 08:01:48.412359 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-f6sks" podUID="7d910773-a4a7-4e14-99ae-4da6281bff35" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.415945 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 09 08:01:48 crc kubenswrapper[4715]: E1009 08:01:48.483066 4715 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 09 08:01:48 crc kubenswrapper[4715]: E1009 08:01:48.483360 4715 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5cqzs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-dgflt_openstack(7ae4d644-5199-4e6e-82c9-b1d784ee6b3d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 09 08:01:48 crc kubenswrapper[4715]: E1009 08:01:48.484607 4715 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-dgflt" podUID="7ae4d644-5199-4e6e-82c9-b1d784ee6b3d" Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.965534 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 09 08:01:48 crc kubenswrapper[4715]: I1009 08:01:48.971041 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 09 08:01:49 crc kubenswrapper[4715]: I1009 08:01:49.025890 4715 generic.go:334] "Generic (PLEG): container finished" podID="a71e26eb-6d9c-40d9-8960-a6d1ba76f08c" containerID="3e6699d58e41830c40e0778b40f55cccf49c30a46be0f6346801a714282dd5d1" exitCode=0 Oct 09 08:01:49 crc kubenswrapper[4715]: I1009 08:01:49.025952 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-x5868" event={"ID":"a71e26eb-6d9c-40d9-8960-a6d1ba76f08c","Type":"ContainerDied","Data":"3e6699d58e41830c40e0778b40f55cccf49c30a46be0f6346801a714282dd5d1"} Oct 09 08:01:49 crc kubenswrapper[4715]: I1009 08:01:49.031126 4715 generic.go:334] "Generic (PLEG): container finished" podID="5c8600a7-d97c-4baa-9aec-1e3762af0e69" containerID="85d7225914fd7282cda1683bb3a64aab666706db7d7f3dd28c9da8d576346693" exitCode=0 Oct 09 08:01:49 crc kubenswrapper[4715]: I1009 08:01:49.031460 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-rcs9s" event={"ID":"5c8600a7-d97c-4baa-9aec-1e3762af0e69","Type":"ContainerDied","Data":"85d7225914fd7282cda1683bb3a64aab666706db7d7f3dd28c9da8d576346693"} Oct 09 08:01:49 crc kubenswrapper[4715]: W1009 08:01:49.035880 4715 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1673772c_a772_4ad8_85c3_f68268965d4b.slice/crio-addb4771dd9794d781c5dd544b89a7b8ca72472906966daf2fce113e5102679f WatchSource:0}: Error finding container addb4771dd9794d781c5dd544b89a7b8ca72472906966daf2fce113e5102679f: Status 404 returned error can't find the container with id addb4771dd9794d781c5dd544b89a7b8ca72472906966daf2fce113e5102679f Oct 09 08:01:49 crc kubenswrapper[4715]: I1009 08:01:49.150360 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 09 08:01:49 crc kubenswrapper[4715]: W1009 08:01:49.157181 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4714af0_14ef_4513_ac5e_dbf4aa99079b.slice/crio-7acd953faef5f022130324c5e0cc1ba1395483711566f6744e2ae956d91a44c7 WatchSource:0}: Error finding container 7acd953faef5f022130324c5e0cc1ba1395483711566f6744e2ae956d91a44c7: Status 404 returned error can't find the container with id 7acd953faef5f022130324c5e0cc1ba1395483711566f6744e2ae956d91a44c7 Oct 09 08:01:49 crc kubenswrapper[4715]: I1009 08:01:49.160745 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 08:01:49 crc kubenswrapper[4715]: I1009 08:01:49.430256 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-xfr2w"] Oct 09 08:01:49 crc kubenswrapper[4715]: I1009 08:01:49.452801 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 08:01:49 crc kubenswrapper[4715]: I1009 08:01:49.458992 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 09 08:01:49 crc kubenswrapper[4715]: W1009 08:01:49.462992 4715 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d9f0338_4450_49ca_ad02_67cdda5d323f.slice/crio-84a84d59aaa5f8379287b1dc415ceddf3a26c7852a27d19de448c1bc382b165f WatchSource:0}: Error finding container 84a84d59aaa5f8379287b1dc415ceddf3a26c7852a27d19de448c1bc382b165f: Status 404 returned error can't find the container with id 84a84d59aaa5f8379287b1dc415ceddf3a26c7852a27d19de448c1bc382b165f Oct 09 08:01:49 crc kubenswrapper[4715]: I1009 08:01:49.486169 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-f6sks" Oct 09 08:01:49 crc kubenswrapper[4715]: I1009 08:01:49.499780 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-dgflt" Oct 09 08:01:49 crc kubenswrapper[4715]: I1009 08:01:49.556834 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ae4d644-5199-4e6e-82c9-b1d784ee6b3d-dns-svc\") pod \"7ae4d644-5199-4e6e-82c9-b1d784ee6b3d\" (UID: \"7ae4d644-5199-4e6e-82c9-b1d784ee6b3d\") " Oct 09 08:01:49 crc kubenswrapper[4715]: I1009 08:01:49.557015 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ae4d644-5199-4e6e-82c9-b1d784ee6b3d-config\") pod \"7ae4d644-5199-4e6e-82c9-b1d784ee6b3d\" (UID: \"7ae4d644-5199-4e6e-82c9-b1d784ee6b3d\") " Oct 09 08:01:49 crc kubenswrapper[4715]: I1009 08:01:49.557041 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5dq2\" (UniqueName: \"kubernetes.io/projected/7d910773-a4a7-4e14-99ae-4da6281bff35-kube-api-access-d5dq2\") pod \"7d910773-a4a7-4e14-99ae-4da6281bff35\" (UID: \"7d910773-a4a7-4e14-99ae-4da6281bff35\") " Oct 09 08:01:49 crc kubenswrapper[4715]: I1009 08:01:49.557074 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-5cqzs\" (UniqueName: \"kubernetes.io/projected/7ae4d644-5199-4e6e-82c9-b1d784ee6b3d-kube-api-access-5cqzs\") pod \"7ae4d644-5199-4e6e-82c9-b1d784ee6b3d\" (UID: \"7ae4d644-5199-4e6e-82c9-b1d784ee6b3d\") " Oct 09 08:01:49 crc kubenswrapper[4715]: I1009 08:01:49.557107 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d910773-a4a7-4e14-99ae-4da6281bff35-config\") pod \"7d910773-a4a7-4e14-99ae-4da6281bff35\" (UID: \"7d910773-a4a7-4e14-99ae-4da6281bff35\") " Oct 09 08:01:49 crc kubenswrapper[4715]: I1009 08:01:49.557875 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ae4d644-5199-4e6e-82c9-b1d784ee6b3d-config" (OuterVolumeSpecName: "config") pod "7ae4d644-5199-4e6e-82c9-b1d784ee6b3d" (UID: "7ae4d644-5199-4e6e-82c9-b1d784ee6b3d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:01:49 crc kubenswrapper[4715]: I1009 08:01:49.558349 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ae4d644-5199-4e6e-82c9-b1d784ee6b3d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7ae4d644-5199-4e6e-82c9-b1d784ee6b3d" (UID: "7ae4d644-5199-4e6e-82c9-b1d784ee6b3d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:01:49 crc kubenswrapper[4715]: I1009 08:01:49.558369 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d910773-a4a7-4e14-99ae-4da6281bff35-config" (OuterVolumeSpecName: "config") pod "7d910773-a4a7-4e14-99ae-4da6281bff35" (UID: "7d910773-a4a7-4e14-99ae-4da6281bff35"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:01:49 crc kubenswrapper[4715]: I1009 08:01:49.563527 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ae4d644-5199-4e6e-82c9-b1d784ee6b3d-kube-api-access-5cqzs" (OuterVolumeSpecName: "kube-api-access-5cqzs") pod "7ae4d644-5199-4e6e-82c9-b1d784ee6b3d" (UID: "7ae4d644-5199-4e6e-82c9-b1d784ee6b3d"). InnerVolumeSpecName "kube-api-access-5cqzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:01:49 crc kubenswrapper[4715]: I1009 08:01:49.570629 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d910773-a4a7-4e14-99ae-4da6281bff35-kube-api-access-d5dq2" (OuterVolumeSpecName: "kube-api-access-d5dq2") pod "7d910773-a4a7-4e14-99ae-4da6281bff35" (UID: "7d910773-a4a7-4e14-99ae-4da6281bff35"). InnerVolumeSpecName "kube-api-access-d5dq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:01:49 crc kubenswrapper[4715]: I1009 08:01:49.575932 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 09 08:01:49 crc kubenswrapper[4715]: I1009 08:01:49.631530 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 09 08:01:49 crc kubenswrapper[4715]: W1009 08:01:49.632521 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc229de4_d184_450e_805b_a8b616c8a60b.slice/crio-dae77dbc1329c722dd18212b50bc6c116b63c4941f62773a1ec33fa7292299c0 WatchSource:0}: Error finding container dae77dbc1329c722dd18212b50bc6c116b63c4941f62773a1ec33fa7292299c0: Status 404 returned error can't find the container with id dae77dbc1329c722dd18212b50bc6c116b63c4941f62773a1ec33fa7292299c0 Oct 09 08:01:49 crc kubenswrapper[4715]: I1009 08:01:49.659217 4715 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7ae4d644-5199-4e6e-82c9-b1d784ee6b3d-config\") on node \"crc\" DevicePath \"\"" Oct 09 08:01:49 crc kubenswrapper[4715]: I1009 08:01:49.659378 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5dq2\" (UniqueName: \"kubernetes.io/projected/7d910773-a4a7-4e14-99ae-4da6281bff35-kube-api-access-d5dq2\") on node \"crc\" DevicePath \"\"" Oct 09 08:01:49 crc kubenswrapper[4715]: I1009 08:01:49.659486 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cqzs\" (UniqueName: \"kubernetes.io/projected/7ae4d644-5199-4e6e-82c9-b1d784ee6b3d-kube-api-access-5cqzs\") on node \"crc\" DevicePath \"\"" Oct 09 08:01:49 crc kubenswrapper[4715]: I1009 08:01:49.659551 4715 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d910773-a4a7-4e14-99ae-4da6281bff35-config\") on node \"crc\" DevicePath \"\"" Oct 09 08:01:49 crc kubenswrapper[4715]: I1009 08:01:49.659624 4715 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ae4d644-5199-4e6e-82c9-b1d784ee6b3d-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 08:01:50 crc kubenswrapper[4715]: I1009 08:01:50.041403 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-rcs9s" event={"ID":"5c8600a7-d97c-4baa-9aec-1e3762af0e69","Type":"ContainerStarted","Data":"6a23038758906d162356892a84a69088c8648643b34167e4039038ec52f25b6b"} Oct 09 08:01:50 crc kubenswrapper[4715]: I1009 08:01:50.041555 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-rcs9s" Oct 09 08:01:50 crc kubenswrapper[4715]: I1009 08:01:50.043499 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xfr2w" event={"ID":"2a1f06ac-c4c8-4884-bb1a-360fbaf03adf","Type":"ContainerStarted","Data":"ba1423b9c62381b9bfaad3a121fbf7cbdcd34e0a89ff1be40379cab95822dabb"} Oct 09 08:01:50 crc 
kubenswrapper[4715]: I1009 08:01:50.045694 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c264ae36-c2f5-46f7-ae4f-9d4464b2a57f","Type":"ContainerStarted","Data":"6ce20347f60d0fe26642e9d88b4c04f3dad04b1a0dd9e7db258e360a5ac59907"} Oct 09 08:01:50 crc kubenswrapper[4715]: I1009 08:01:50.054519 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"72f969cd-b504-4db1-832a-1e0c7f0a3b7b","Type":"ContainerStarted","Data":"6216f8b7714d7f129dcde96fa12f67b5c0b186c438508a900b8c0051b500c735"} Oct 09 08:01:50 crc kubenswrapper[4715]: I1009 08:01:50.055832 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"895dedfd-5a74-43a4-81d1-6365aa67ed6a","Type":"ContainerStarted","Data":"8be9ded4671f3eccf439c2439939bac6145922db271b17d02325176905fd15ba"} Oct 09 08:01:50 crc kubenswrapper[4715]: I1009 08:01:50.057003 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"dc229de4-d184-450e-805b-a8b616c8a60b","Type":"ContainerStarted","Data":"dae77dbc1329c722dd18212b50bc6c116b63c4941f62773a1ec33fa7292299c0"} Oct 09 08:01:50 crc kubenswrapper[4715]: I1009 08:01:50.058531 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-f6sks" event={"ID":"7d910773-a4a7-4e14-99ae-4da6281bff35","Type":"ContainerDied","Data":"336181be3ea3962cb516fb033bc8546134eb69a0995e9ca9b13fed4aacf3cdf7"} Oct 09 08:01:50 crc kubenswrapper[4715]: I1009 08:01:50.058657 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-f6sks" Oct 09 08:01:50 crc kubenswrapper[4715]: I1009 08:01:50.067589 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-rcs9s" podStartSLOduration=2.721635301 podStartE2EDuration="15.06756526s" podCreationTimestamp="2025-10-09 08:01:35 +0000 UTC" firstStartedPulling="2025-10-09 08:01:36.242339828 +0000 UTC m=+926.935143836" lastFinishedPulling="2025-10-09 08:01:48.588269787 +0000 UTC m=+939.281073795" observedRunningTime="2025-10-09 08:01:50.06239694 +0000 UTC m=+940.755200948" watchObservedRunningTime="2025-10-09 08:01:50.06756526 +0000 UTC m=+940.760369268" Oct 09 08:01:50 crc kubenswrapper[4715]: I1009 08:01:50.068906 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ec6c7ac4-3535-4d45-9be1-8f6b4de9670f","Type":"ContainerStarted","Data":"7ce3a9125038c093e616d72ccfa07921bbdd38b2790aaf69a373777d21b3352f"} Oct 09 08:01:50 crc kubenswrapper[4715]: I1009 08:01:50.071350 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a4714af0-14ef-4513-ac5e-dbf4aa99079b","Type":"ContainerStarted","Data":"7acd953faef5f022130324c5e0cc1ba1395483711566f6744e2ae956d91a44c7"} Oct 09 08:01:50 crc kubenswrapper[4715]: I1009 08:01:50.073166 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"5d9f0338-4450-49ca-ad02-67cdda5d323f","Type":"ContainerStarted","Data":"84a84d59aaa5f8379287b1dc415ceddf3a26c7852a27d19de448c1bc382b165f"} Oct 09 08:01:50 crc kubenswrapper[4715]: I1009 08:01:50.076372 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-x5868" event={"ID":"a71e26eb-6d9c-40d9-8960-a6d1ba76f08c","Type":"ContainerStarted","Data":"092db75bf40d237590ffe6a870c3d83a7a37215fc7d6f673f1271ff3d516e72c"} Oct 09 08:01:50 crc kubenswrapper[4715]: I1009 08:01:50.079844 4715 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1673772c-a772-4ad8-85c3-f68268965d4b","Type":"ContainerStarted","Data":"addb4771dd9794d781c5dd544b89a7b8ca72472906966daf2fce113e5102679f"} Oct 09 08:01:50 crc kubenswrapper[4715]: I1009 08:01:50.081104 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-dgflt" event={"ID":"7ae4d644-5199-4e6e-82c9-b1d784ee6b3d","Type":"ContainerDied","Data":"bd4aa45cdea90b0407652f0376e5ab6ad831090e6bd16b28a406685edda31148"} Oct 09 08:01:50 crc kubenswrapper[4715]: I1009 08:01:50.081141 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-dgflt" Oct 09 08:01:50 crc kubenswrapper[4715]: I1009 08:01:50.093461 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-x5868" podStartSLOduration=2.622591882 podStartE2EDuration="15.093404344s" podCreationTimestamp="2025-10-09 08:01:35 +0000 UTC" firstStartedPulling="2025-10-09 08:01:36.090144589 +0000 UTC m=+926.782948597" lastFinishedPulling="2025-10-09 08:01:48.560957051 +0000 UTC m=+939.253761059" observedRunningTime="2025-10-09 08:01:50.092155208 +0000 UTC m=+940.784959216" watchObservedRunningTime="2025-10-09 08:01:50.093404344 +0000 UTC m=+940.786208352" Oct 09 08:01:50 crc kubenswrapper[4715]: I1009 08:01:50.131826 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-f6sks"] Oct 09 08:01:50 crc kubenswrapper[4715]: I1009 08:01:50.171124 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-f6sks"] Oct 09 08:01:50 crc kubenswrapper[4715]: I1009 08:01:50.171167 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-dgflt"] Oct 09 08:01:50 crc kubenswrapper[4715]: I1009 08:01:50.179552 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-78dd6ddcc-dgflt"] Oct 09 08:01:50 crc kubenswrapper[4715]: I1009 08:01:50.579281 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-x5868" Oct 09 08:01:50 crc kubenswrapper[4715]: I1009 08:01:50.623713 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-gckmt"] Oct 09 08:01:50 crc kubenswrapper[4715]: W1009 08:01:50.708816 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2db0914e_e011_4b76_a07d_57ce73faceaa.slice/crio-ea70fb109f52f6279e50c0d2c2fa6770c2ddb5b04b90a9f493a64a92cacdf6b5 WatchSource:0}: Error finding container ea70fb109f52f6279e50c0d2c2fa6770c2ddb5b04b90a9f493a64a92cacdf6b5: Status 404 returned error can't find the container with id ea70fb109f52f6279e50c0d2c2fa6770c2ddb5b04b90a9f493a64a92cacdf6b5 Oct 09 08:01:51 crc kubenswrapper[4715]: I1009 08:01:51.097055 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gckmt" event={"ID":"2db0914e-e011-4b76-a07d-57ce73faceaa","Type":"ContainerStarted","Data":"ea70fb109f52f6279e50c0d2c2fa6770c2ddb5b04b90a9f493a64a92cacdf6b5"} Oct 09 08:01:52 crc kubenswrapper[4715]: I1009 08:01:52.145820 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ae4d644-5199-4e6e-82c9-b1d784ee6b3d" path="/var/lib/kubelet/pods/7ae4d644-5199-4e6e-82c9-b1d784ee6b3d/volumes" Oct 09 08:01:52 crc kubenswrapper[4715]: I1009 08:01:52.146561 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d910773-a4a7-4e14-99ae-4da6281bff35" path="/var/lib/kubelet/pods/7d910773-a4a7-4e14-99ae-4da6281bff35/volumes" Oct 09 08:01:55 crc kubenswrapper[4715]: I1009 08:01:55.581113 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-x5868" Oct 09 08:01:55 crc kubenswrapper[4715]: I1009 08:01:55.846658 4715 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-rcs9s" Oct 09 08:01:55 crc kubenswrapper[4715]: I1009 08:01:55.899057 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-x5868"] Oct 09 08:01:56 crc kubenswrapper[4715]: I1009 08:01:56.131322 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-x5868" podUID="a71e26eb-6d9c-40d9-8960-a6d1ba76f08c" containerName="dnsmasq-dns" containerID="cri-o://092db75bf40d237590ffe6a870c3d83a7a37215fc7d6f673f1271ff3d516e72c" gracePeriod=10 Oct 09 08:01:57 crc kubenswrapper[4715]: I1009 08:01:57.142603 4715 generic.go:334] "Generic (PLEG): container finished" podID="a71e26eb-6d9c-40d9-8960-a6d1ba76f08c" containerID="092db75bf40d237590ffe6a870c3d83a7a37215fc7d6f673f1271ff3d516e72c" exitCode=0 Oct 09 08:01:57 crc kubenswrapper[4715]: I1009 08:01:57.142694 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-x5868" event={"ID":"a71e26eb-6d9c-40d9-8960-a6d1ba76f08c","Type":"ContainerDied","Data":"092db75bf40d237590ffe6a870c3d83a7a37215fc7d6f673f1271ff3d516e72c"} Oct 09 08:01:57 crc kubenswrapper[4715]: I1009 08:01:57.165813 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-x5868" Oct 09 08:01:57 crc kubenswrapper[4715]: I1009 08:01:57.195266 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a71e26eb-6d9c-40d9-8960-a6d1ba76f08c-config\") pod \"a71e26eb-6d9c-40d9-8960-a6d1ba76f08c\" (UID: \"a71e26eb-6d9c-40d9-8960-a6d1ba76f08c\") " Oct 09 08:01:57 crc kubenswrapper[4715]: I1009 08:01:57.195861 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rwxv\" (UniqueName: \"kubernetes.io/projected/a71e26eb-6d9c-40d9-8960-a6d1ba76f08c-kube-api-access-6rwxv\") pod \"a71e26eb-6d9c-40d9-8960-a6d1ba76f08c\" (UID: \"a71e26eb-6d9c-40d9-8960-a6d1ba76f08c\") " Oct 09 08:01:57 crc kubenswrapper[4715]: I1009 08:01:57.196033 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a71e26eb-6d9c-40d9-8960-a6d1ba76f08c-dns-svc\") pod \"a71e26eb-6d9c-40d9-8960-a6d1ba76f08c\" (UID: \"a71e26eb-6d9c-40d9-8960-a6d1ba76f08c\") " Oct 09 08:01:57 crc kubenswrapper[4715]: I1009 08:01:57.220340 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a71e26eb-6d9c-40d9-8960-a6d1ba76f08c-kube-api-access-6rwxv" (OuterVolumeSpecName: "kube-api-access-6rwxv") pod "a71e26eb-6d9c-40d9-8960-a6d1ba76f08c" (UID: "a71e26eb-6d9c-40d9-8960-a6d1ba76f08c"). InnerVolumeSpecName "kube-api-access-6rwxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:01:57 crc kubenswrapper[4715]: I1009 08:01:57.234988 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a71e26eb-6d9c-40d9-8960-a6d1ba76f08c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a71e26eb-6d9c-40d9-8960-a6d1ba76f08c" (UID: "a71e26eb-6d9c-40d9-8960-a6d1ba76f08c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:01:57 crc kubenswrapper[4715]: I1009 08:01:57.240481 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a71e26eb-6d9c-40d9-8960-a6d1ba76f08c-config" (OuterVolumeSpecName: "config") pod "a71e26eb-6d9c-40d9-8960-a6d1ba76f08c" (UID: "a71e26eb-6d9c-40d9-8960-a6d1ba76f08c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:01:57 crc kubenswrapper[4715]: I1009 08:01:57.297242 4715 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a71e26eb-6d9c-40d9-8960-a6d1ba76f08c-config\") on node \"crc\" DevicePath \"\"" Oct 09 08:01:57 crc kubenswrapper[4715]: I1009 08:01:57.297280 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rwxv\" (UniqueName: \"kubernetes.io/projected/a71e26eb-6d9c-40d9-8960-a6d1ba76f08c-kube-api-access-6rwxv\") on node \"crc\" DevicePath \"\"" Oct 09 08:01:57 crc kubenswrapper[4715]: I1009 08:01:57.297291 4715 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a71e26eb-6d9c-40d9-8960-a6d1ba76f08c-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 08:01:58 crc kubenswrapper[4715]: I1009 08:01:58.150316 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"5d9f0338-4450-49ca-ad02-67cdda5d323f","Type":"ContainerStarted","Data":"7344d1f8925878e8821910757ea07f8e0b83e49b9704595a2ac575d30582d2a7"} Oct 09 08:01:58 crc kubenswrapper[4715]: I1009 08:01:58.151663 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 09 08:01:58 crc kubenswrapper[4715]: I1009 08:01:58.157395 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-x5868" Oct 09 08:01:58 crc kubenswrapper[4715]: I1009 08:01:58.157443 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-x5868" event={"ID":"a71e26eb-6d9c-40d9-8960-a6d1ba76f08c","Type":"ContainerDied","Data":"9d7ee5b11abe9118b258a79d1838d54da3ae95534c42d9fe75ca62ea636909bd"} Oct 09 08:01:58 crc kubenswrapper[4715]: I1009 08:01:58.157527 4715 scope.go:117] "RemoveContainer" containerID="092db75bf40d237590ffe6a870c3d83a7a37215fc7d6f673f1271ff3d516e72c" Oct 09 08:01:58 crc kubenswrapper[4715]: I1009 08:01:58.160479 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ec6c7ac4-3535-4d45-9be1-8f6b4de9670f","Type":"ContainerStarted","Data":"0e4511c3f83e7476f83225414ad76b3c4536a91c2ce5dfb59afc636e3df56f4e"} Oct 09 08:01:58 crc kubenswrapper[4715]: I1009 08:01:58.163532 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"72f969cd-b504-4db1-832a-1e0c7f0a3b7b","Type":"ContainerStarted","Data":"c214359d0e45269ee92caf744ce8c673364bb0fb674a55d8e00cc730c463ca76"} Oct 09 08:01:58 crc kubenswrapper[4715]: I1009 08:01:58.164920 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"895dedfd-5a74-43a4-81d1-6365aa67ed6a","Type":"ContainerStarted","Data":"eddf9f82b4c1de1166791c581356688f954d78066dca7908d1dc0a08cd8a4b5f"} Oct 09 08:01:58 crc kubenswrapper[4715]: I1009 08:01:58.166846 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"dc229de4-d184-450e-805b-a8b616c8a60b","Type":"ContainerStarted","Data":"340fa5b219566c177cb0cfe0648c92dca2a020baa67eba7d8f122bdb13a5c345"} Oct 09 08:01:58 crc kubenswrapper[4715]: I1009 08:01:58.184797 4715 scope.go:117] "RemoveContainer" containerID="3e6699d58e41830c40e0778b40f55cccf49c30a46be0f6346801a714282dd5d1" Oct 09 08:01:58 crc 
kubenswrapper[4715]: I1009 08:01:58.186934 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=12.452683516 podStartE2EDuration="19.186909275s" podCreationTimestamp="2025-10-09 08:01:39 +0000 UTC" firstStartedPulling="2025-10-09 08:01:49.468841999 +0000 UTC m=+940.161646007" lastFinishedPulling="2025-10-09 08:01:56.203067758 +0000 UTC m=+946.895871766" observedRunningTime="2025-10-09 08:01:58.172299519 +0000 UTC m=+948.865103527" watchObservedRunningTime="2025-10-09 08:01:58.186909275 +0000 UTC m=+948.879713283" Oct 09 08:01:58 crc kubenswrapper[4715]: I1009 08:01:58.189273 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-x5868"] Oct 09 08:01:58 crc kubenswrapper[4715]: I1009 08:01:58.195921 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-x5868"] Oct 09 08:01:59 crc kubenswrapper[4715]: I1009 08:01:59.178093 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c264ae36-c2f5-46f7-ae4f-9d4464b2a57f","Type":"ContainerStarted","Data":"dc51de515e8939922eba285d210b67814e95a1d30d3ad2733c0e3a2e7881d56c"} Oct 09 08:01:59 crc kubenswrapper[4715]: I1009 08:01:59.178980 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 09 08:01:59 crc kubenswrapper[4715]: I1009 08:01:59.190361 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a4714af0-14ef-4513-ac5e-dbf4aa99079b","Type":"ContainerStarted","Data":"1bd128cd2c654c89d405e27fdb17e74880b73ef2fcd7b473522b9790ccdfeb29"} Oct 09 08:01:59 crc kubenswrapper[4715]: I1009 08:01:59.201729 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=10.258116081 podStartE2EDuration="18.201707431s" podCreationTimestamp="2025-10-09 08:01:41 +0000 UTC" 
firstStartedPulling="2025-10-09 08:01:49.484016031 +0000 UTC m=+940.176820039" lastFinishedPulling="2025-10-09 08:01:57.427607371 +0000 UTC m=+948.120411389" observedRunningTime="2025-10-09 08:01:59.19757099 +0000 UTC m=+949.890375008" watchObservedRunningTime="2025-10-09 08:01:59.201707431 +0000 UTC m=+949.894511439" Oct 09 08:01:59 crc kubenswrapper[4715]: I1009 08:01:59.207764 4715 generic.go:334] "Generic (PLEG): container finished" podID="2db0914e-e011-4b76-a07d-57ce73faceaa" containerID="3cf2cbfdbad983117cdf6e69e896d6881a80f0546f243d33ca4d710a0ecfa40a" exitCode=0 Oct 09 08:01:59 crc kubenswrapper[4715]: I1009 08:01:59.208705 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gckmt" event={"ID":"2db0914e-e011-4b76-a07d-57ce73faceaa","Type":"ContainerDied","Data":"3cf2cbfdbad983117cdf6e69e896d6881a80f0546f243d33ca4d710a0ecfa40a"} Oct 09 08:01:59 crc kubenswrapper[4715]: I1009 08:01:59.214236 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1673772c-a772-4ad8-85c3-f68268965d4b","Type":"ContainerStarted","Data":"75a982e1d44680c2b96f722a9924ebc5c281f5c8f11940251e6f50af37404454"} Oct 09 08:01:59 crc kubenswrapper[4715]: I1009 08:01:59.218248 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xfr2w" event={"ID":"2a1f06ac-c4c8-4884-bb1a-360fbaf03adf","Type":"ContainerStarted","Data":"49333e3d59b6fd9e60b2537e24291376d89401a1b5dbd0a869332dc59bf84e07"} Oct 09 08:01:59 crc kubenswrapper[4715]: I1009 08:01:59.218295 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-xfr2w" Oct 09 08:01:59 crc kubenswrapper[4715]: I1009 08:01:59.271110 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-xfr2w" podStartSLOduration=6.586675244 podStartE2EDuration="14.271092994s" podCreationTimestamp="2025-10-09 08:01:45 +0000 UTC" firstStartedPulling="2025-10-09 
08:01:49.46886358 +0000 UTC m=+940.161667588" lastFinishedPulling="2025-10-09 08:01:57.15328133 +0000 UTC m=+947.846085338" observedRunningTime="2025-10-09 08:01:59.268476438 +0000 UTC m=+949.961280446" watchObservedRunningTime="2025-10-09 08:01:59.271092994 +0000 UTC m=+949.963897002" Oct 09 08:02:00 crc kubenswrapper[4715]: I1009 08:02:00.146324 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a71e26eb-6d9c-40d9-8960-a6d1ba76f08c" path="/var/lib/kubelet/pods/a71e26eb-6d9c-40d9-8960-a6d1ba76f08c/volumes" Oct 09 08:02:00 crc kubenswrapper[4715]: I1009 08:02:00.228207 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gckmt" event={"ID":"2db0914e-e011-4b76-a07d-57ce73faceaa","Type":"ContainerStarted","Data":"ffe1ab9bd3424f25f4c92363f6a37f227dc7c1fae05a3f65ecdc66f9e11ba4ca"} Oct 09 08:02:01 crc kubenswrapper[4715]: I1009 08:02:01.240832 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"dc229de4-d184-450e-805b-a8b616c8a60b","Type":"ContainerStarted","Data":"dc1728eb607f2ca3e01b9bf504787e68ff7a1fdd16f7ad9c2309ea0dfb95462d"} Oct 09 08:02:01 crc kubenswrapper[4715]: I1009 08:02:01.245482 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gckmt" event={"ID":"2db0914e-e011-4b76-a07d-57ce73faceaa","Type":"ContainerStarted","Data":"d6289733aad231613ff28e3cb1d3b1e504b333a4a4b231daddc9c0f158a92dfe"} Oct 09 08:02:01 crc kubenswrapper[4715]: I1009 08:02:01.245734 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-gckmt" Oct 09 08:02:01 crc kubenswrapper[4715]: I1009 08:02:01.245869 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-gckmt" Oct 09 08:02:01 crc kubenswrapper[4715]: I1009 08:02:01.259063 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"ec6c7ac4-3535-4d45-9be1-8f6b4de9670f","Type":"ContainerStarted","Data":"26fa7d04dbc86a1f1b11ac84ea35904ec5c7f0fae7b59a2f7dc1a0fb97ad478e"} Oct 09 08:02:01 crc kubenswrapper[4715]: I1009 08:02:01.268597 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.029516711 podStartE2EDuration="14.26857305s" podCreationTimestamp="2025-10-09 08:01:47 +0000 UTC" firstStartedPulling="2025-10-09 08:01:49.634731387 +0000 UTC m=+940.327535395" lastFinishedPulling="2025-10-09 08:02:00.873787726 +0000 UTC m=+951.566591734" observedRunningTime="2025-10-09 08:02:01.265024306 +0000 UTC m=+951.957828314" watchObservedRunningTime="2025-10-09 08:02:01.26857305 +0000 UTC m=+951.961377058" Oct 09 08:02:01 crc kubenswrapper[4715]: I1009 08:02:01.271103 4715 generic.go:334] "Generic (PLEG): container finished" podID="72f969cd-b504-4db1-832a-1e0c7f0a3b7b" containerID="c214359d0e45269ee92caf744ce8c673364bb0fb674a55d8e00cc730c463ca76" exitCode=0 Oct 09 08:02:01 crc kubenswrapper[4715]: I1009 08:02:01.271156 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"72f969cd-b504-4db1-832a-1e0c7f0a3b7b","Type":"ContainerDied","Data":"c214359d0e45269ee92caf744ce8c673364bb0fb674a55d8e00cc730c463ca76"} Oct 09 08:02:01 crc kubenswrapper[4715]: I1009 08:02:01.285744 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-gckmt" podStartSLOduration=10.450167551 podStartE2EDuration="16.28572207s" podCreationTimestamp="2025-10-09 08:01:45 +0000 UTC" firstStartedPulling="2025-10-09 08:01:50.715033914 +0000 UTC m=+941.407837922" lastFinishedPulling="2025-10-09 08:01:56.550588433 +0000 UTC m=+947.243392441" observedRunningTime="2025-10-09 08:02:01.285190464 +0000 UTC m=+951.977994492" watchObservedRunningTime="2025-10-09 08:02:01.28572207 +0000 UTC m=+951.978526078" Oct 09 08:02:01 crc kubenswrapper[4715]: I1009 
08:02:01.311695 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=4.018933092 podStartE2EDuration="15.311674857s" podCreationTimestamp="2025-10-09 08:01:46 +0000 UTC" firstStartedPulling="2025-10-09 08:01:49.564855049 +0000 UTC m=+940.257659057" lastFinishedPulling="2025-10-09 08:02:00.857596814 +0000 UTC m=+951.550400822" observedRunningTime="2025-10-09 08:02:01.308073242 +0000 UTC m=+952.000877260" watchObservedRunningTime="2025-10-09 08:02:01.311674857 +0000 UTC m=+952.004478865" Oct 09 08:02:02 crc kubenswrapper[4715]: I1009 08:02:02.281102 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"72f969cd-b504-4db1-832a-1e0c7f0a3b7b","Type":"ContainerStarted","Data":"99b2da1842fa8c76075aaf1cb25620a7c3e3d635c9a0947197f094db18e7d514"} Oct 09 08:02:02 crc kubenswrapper[4715]: I1009 08:02:02.284391 4715 generic.go:334] "Generic (PLEG): container finished" podID="895dedfd-5a74-43a4-81d1-6365aa67ed6a" containerID="eddf9f82b4c1de1166791c581356688f954d78066dca7908d1dc0a08cd8a4b5f" exitCode=0 Oct 09 08:02:02 crc kubenswrapper[4715]: I1009 08:02:02.284734 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"895dedfd-5a74-43a4-81d1-6365aa67ed6a","Type":"ContainerDied","Data":"eddf9f82b4c1de1166791c581356688f954d78066dca7908d1dc0a08cd8a4b5f"} Oct 09 08:02:02 crc kubenswrapper[4715]: I1009 08:02:02.315986 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=17.271769426 podStartE2EDuration="24.315963315s" podCreationTimestamp="2025-10-09 08:01:38 +0000 UTC" firstStartedPulling="2025-10-09 08:01:49.15993735 +0000 UTC m=+939.852741358" lastFinishedPulling="2025-10-09 08:01:56.204131239 +0000 UTC m=+946.896935247" observedRunningTime="2025-10-09 08:02:02.308464517 +0000 UTC m=+953.001268735" 
watchObservedRunningTime="2025-10-09 08:02:02.315963315 +0000 UTC m=+953.008767333" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.188229 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.188682 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.236337 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.293028 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"895dedfd-5a74-43a4-81d1-6365aa67ed6a","Type":"ContainerStarted","Data":"c83e68e435253c048f0b40c8517b90cd3bfdd8c68031392b77f62d78879502eb"} Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.327401 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=18.826782274 podStartE2EDuration="26.327373003s" podCreationTimestamp="2025-10-09 08:01:37 +0000 UTC" firstStartedPulling="2025-10-09 08:01:49.051523888 +0000 UTC m=+939.744327896" lastFinishedPulling="2025-10-09 08:01:56.552114617 +0000 UTC m=+947.244918625" observedRunningTime="2025-10-09 08:02:03.321702657 +0000 UTC m=+954.014506675" watchObservedRunningTime="2025-10-09 08:02:03.327373003 +0000 UTC m=+954.020177011" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.333701 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.416633 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.416819 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ovsdbserver-sb-0" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.465927 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.599106 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-kdw6l"] Oct 09 08:02:03 crc kubenswrapper[4715]: E1009 08:02:03.600217 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a71e26eb-6d9c-40d9-8960-a6d1ba76f08c" containerName="init" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.600244 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="a71e26eb-6d9c-40d9-8960-a6d1ba76f08c" containerName="init" Oct 09 08:02:03 crc kubenswrapper[4715]: E1009 08:02:03.600282 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a71e26eb-6d9c-40d9-8960-a6d1ba76f08c" containerName="dnsmasq-dns" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.600291 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="a71e26eb-6d9c-40d9-8960-a6d1ba76f08c" containerName="dnsmasq-dns" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.600501 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="a71e26eb-6d9c-40d9-8960-a6d1ba76f08c" containerName="dnsmasq-dns" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.601387 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-kdw6l" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.603826 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.615086 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-kdw6l"] Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.655473 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-ptkjm"] Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.658647 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-ptkjm" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.661552 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.663253 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-ptkjm"] Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.718694 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1532aa52-3521-43d8-9f26-18027dbd6919-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-kdw6l\" (UID: \"1532aa52-3521-43d8-9f26-18027dbd6919\") " pod="openstack/dnsmasq-dns-7fd796d7df-kdw6l" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.718790 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1532aa52-3521-43d8-9f26-18027dbd6919-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-kdw6l\" (UID: \"1532aa52-3521-43d8-9f26-18027dbd6919\") " pod="openstack/dnsmasq-dns-7fd796d7df-kdw6l" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.718815 4715 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7lhv\" (UniqueName: \"kubernetes.io/projected/1532aa52-3521-43d8-9f26-18027dbd6919-kube-api-access-h7lhv\") pod \"dnsmasq-dns-7fd796d7df-kdw6l\" (UID: \"1532aa52-3521-43d8-9f26-18027dbd6919\") " pod="openstack/dnsmasq-dns-7fd796d7df-kdw6l" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.718855 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1532aa52-3521-43d8-9f26-18027dbd6919-config\") pod \"dnsmasq-dns-7fd796d7df-kdw6l\" (UID: \"1532aa52-3521-43d8-9f26-18027dbd6919\") " pod="openstack/dnsmasq-dns-7fd796d7df-kdw6l" Oct 09 08:02:03 crc kubenswrapper[4715]: E1009 08:02:03.769449 4715 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.158:33338->38.102.83.158:33265: write tcp 38.102.83.158:33338->38.102.83.158:33265: write: broken pipe Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.820646 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1532aa52-3521-43d8-9f26-18027dbd6919-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-kdw6l\" (UID: \"1532aa52-3521-43d8-9f26-18027dbd6919\") " pod="openstack/dnsmasq-dns-7fd796d7df-kdw6l" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.820837 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f16a1c8a-dd0e-4800-8b4f-8ae0dd86dbd2-ovs-rundir\") pod \"ovn-controller-metrics-ptkjm\" (UID: \"f16a1c8a-dd0e-4800-8b4f-8ae0dd86dbd2\") " pod="openstack/ovn-controller-metrics-ptkjm" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.820923 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7lhv\" (UniqueName: 
\"kubernetes.io/projected/1532aa52-3521-43d8-9f26-18027dbd6919-kube-api-access-h7lhv\") pod \"dnsmasq-dns-7fd796d7df-kdw6l\" (UID: \"1532aa52-3521-43d8-9f26-18027dbd6919\") " pod="openstack/dnsmasq-dns-7fd796d7df-kdw6l" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.821025 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f16a1c8a-dd0e-4800-8b4f-8ae0dd86dbd2-config\") pod \"ovn-controller-metrics-ptkjm\" (UID: \"f16a1c8a-dd0e-4800-8b4f-8ae0dd86dbd2\") " pod="openstack/ovn-controller-metrics-ptkjm" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.821102 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1532aa52-3521-43d8-9f26-18027dbd6919-config\") pod \"dnsmasq-dns-7fd796d7df-kdw6l\" (UID: \"1532aa52-3521-43d8-9f26-18027dbd6919\") " pod="openstack/dnsmasq-dns-7fd796d7df-kdw6l" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.821178 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwj5b\" (UniqueName: \"kubernetes.io/projected/f16a1c8a-dd0e-4800-8b4f-8ae0dd86dbd2-kube-api-access-mwj5b\") pod \"ovn-controller-metrics-ptkjm\" (UID: \"f16a1c8a-dd0e-4800-8b4f-8ae0dd86dbd2\") " pod="openstack/ovn-controller-metrics-ptkjm" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.821251 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f16a1c8a-dd0e-4800-8b4f-8ae0dd86dbd2-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-ptkjm\" (UID: \"f16a1c8a-dd0e-4800-8b4f-8ae0dd86dbd2\") " pod="openstack/ovn-controller-metrics-ptkjm" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.821333 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f16a1c8a-dd0e-4800-8b4f-8ae0dd86dbd2-ovn-rundir\") pod \"ovn-controller-metrics-ptkjm\" (UID: \"f16a1c8a-dd0e-4800-8b4f-8ae0dd86dbd2\") " pod="openstack/ovn-controller-metrics-ptkjm" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.821406 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f16a1c8a-dd0e-4800-8b4f-8ae0dd86dbd2-combined-ca-bundle\") pod \"ovn-controller-metrics-ptkjm\" (UID: \"f16a1c8a-dd0e-4800-8b4f-8ae0dd86dbd2\") " pod="openstack/ovn-controller-metrics-ptkjm" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.821525 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1532aa52-3521-43d8-9f26-18027dbd6919-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-kdw6l\" (UID: \"1532aa52-3521-43d8-9f26-18027dbd6919\") " pod="openstack/dnsmasq-dns-7fd796d7df-kdw6l" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.822535 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1532aa52-3521-43d8-9f26-18027dbd6919-config\") pod \"dnsmasq-dns-7fd796d7df-kdw6l\" (UID: \"1532aa52-3521-43d8-9f26-18027dbd6919\") " pod="openstack/dnsmasq-dns-7fd796d7df-kdw6l" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.822671 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1532aa52-3521-43d8-9f26-18027dbd6919-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-kdw6l\" (UID: \"1532aa52-3521-43d8-9f26-18027dbd6919\") " pod="openstack/dnsmasq-dns-7fd796d7df-kdw6l" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.822898 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/1532aa52-3521-43d8-9f26-18027dbd6919-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-kdw6l\" (UID: \"1532aa52-3521-43d8-9f26-18027dbd6919\") " pod="openstack/dnsmasq-dns-7fd796d7df-kdw6l" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.841525 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7lhv\" (UniqueName: \"kubernetes.io/projected/1532aa52-3521-43d8-9f26-18027dbd6919-kube-api-access-h7lhv\") pod \"dnsmasq-dns-7fd796d7df-kdw6l\" (UID: \"1532aa52-3521-43d8-9f26-18027dbd6919\") " pod="openstack/dnsmasq-dns-7fd796d7df-kdw6l" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.920385 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-kdw6l" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.923526 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f16a1c8a-dd0e-4800-8b4f-8ae0dd86dbd2-ovs-rundir\") pod \"ovn-controller-metrics-ptkjm\" (UID: \"f16a1c8a-dd0e-4800-8b4f-8ae0dd86dbd2\") " pod="openstack/ovn-controller-metrics-ptkjm" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.923888 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f16a1c8a-dd0e-4800-8b4f-8ae0dd86dbd2-ovs-rundir\") pod \"ovn-controller-metrics-ptkjm\" (UID: \"f16a1c8a-dd0e-4800-8b4f-8ae0dd86dbd2\") " pod="openstack/ovn-controller-metrics-ptkjm" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.923931 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f16a1c8a-dd0e-4800-8b4f-8ae0dd86dbd2-config\") pod \"ovn-controller-metrics-ptkjm\" (UID: \"f16a1c8a-dd0e-4800-8b4f-8ae0dd86dbd2\") " pod="openstack/ovn-controller-metrics-ptkjm" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.923968 4715 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwj5b\" (UniqueName: \"kubernetes.io/projected/f16a1c8a-dd0e-4800-8b4f-8ae0dd86dbd2-kube-api-access-mwj5b\") pod \"ovn-controller-metrics-ptkjm\" (UID: \"f16a1c8a-dd0e-4800-8b4f-8ae0dd86dbd2\") " pod="openstack/ovn-controller-metrics-ptkjm" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.923990 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f16a1c8a-dd0e-4800-8b4f-8ae0dd86dbd2-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-ptkjm\" (UID: \"f16a1c8a-dd0e-4800-8b4f-8ae0dd86dbd2\") " pod="openstack/ovn-controller-metrics-ptkjm" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.924018 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f16a1c8a-dd0e-4800-8b4f-8ae0dd86dbd2-ovn-rundir\") pod \"ovn-controller-metrics-ptkjm\" (UID: \"f16a1c8a-dd0e-4800-8b4f-8ae0dd86dbd2\") " pod="openstack/ovn-controller-metrics-ptkjm" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.924037 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f16a1c8a-dd0e-4800-8b4f-8ae0dd86dbd2-combined-ca-bundle\") pod \"ovn-controller-metrics-ptkjm\" (UID: \"f16a1c8a-dd0e-4800-8b4f-8ae0dd86dbd2\") " pod="openstack/ovn-controller-metrics-ptkjm" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.924741 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f16a1c8a-dd0e-4800-8b4f-8ae0dd86dbd2-config\") pod \"ovn-controller-metrics-ptkjm\" (UID: \"f16a1c8a-dd0e-4800-8b4f-8ae0dd86dbd2\") " pod="openstack/ovn-controller-metrics-ptkjm" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.924820 4715 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f16a1c8a-dd0e-4800-8b4f-8ae0dd86dbd2-ovn-rundir\") pod \"ovn-controller-metrics-ptkjm\" (UID: \"f16a1c8a-dd0e-4800-8b4f-8ae0dd86dbd2\") " pod="openstack/ovn-controller-metrics-ptkjm" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.927356 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f16a1c8a-dd0e-4800-8b4f-8ae0dd86dbd2-combined-ca-bundle\") pod \"ovn-controller-metrics-ptkjm\" (UID: \"f16a1c8a-dd0e-4800-8b4f-8ae0dd86dbd2\") " pod="openstack/ovn-controller-metrics-ptkjm" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.927507 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f16a1c8a-dd0e-4800-8b4f-8ae0dd86dbd2-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-ptkjm\" (UID: \"f16a1c8a-dd0e-4800-8b4f-8ae0dd86dbd2\") " pod="openstack/ovn-controller-metrics-ptkjm" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.947540 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-kdw6l"] Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.958673 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwj5b\" (UniqueName: \"kubernetes.io/projected/f16a1c8a-dd0e-4800-8b4f-8ae0dd86dbd2-kube-api-access-mwj5b\") pod \"ovn-controller-metrics-ptkjm\" (UID: \"f16a1c8a-dd0e-4800-8b4f-8ae0dd86dbd2\") " pod="openstack/ovn-controller-metrics-ptkjm" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.982950 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-278vs"] Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.988006 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-278vs" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.990151 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 09 08:02:03 crc kubenswrapper[4715]: I1009 08:02:03.997004 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-ptkjm" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.001214 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-278vs"] Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.140365 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48jnp\" (UniqueName: \"kubernetes.io/projected/6cfc3117-277f-4c29-b4ff-f5260af4012e-kube-api-access-48jnp\") pod \"dnsmasq-dns-86db49b7ff-278vs\" (UID: \"6cfc3117-277f-4c29-b4ff-f5260af4012e\") " pod="openstack/dnsmasq-dns-86db49b7ff-278vs" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.140625 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6cfc3117-277f-4c29-b4ff-f5260af4012e-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-278vs\" (UID: \"6cfc3117-277f-4c29-b4ff-f5260af4012e\") " pod="openstack/dnsmasq-dns-86db49b7ff-278vs" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.140675 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6cfc3117-277f-4c29-b4ff-f5260af4012e-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-278vs\" (UID: \"6cfc3117-277f-4c29-b4ff-f5260af4012e\") " pod="openstack/dnsmasq-dns-86db49b7ff-278vs" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.140757 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/6cfc3117-277f-4c29-b4ff-f5260af4012e-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-278vs\" (UID: \"6cfc3117-277f-4c29-b4ff-f5260af4012e\") " pod="openstack/dnsmasq-dns-86db49b7ff-278vs" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.140854 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cfc3117-277f-4c29-b4ff-f5260af4012e-config\") pod \"dnsmasq-dns-86db49b7ff-278vs\" (UID: \"6cfc3117-277f-4c29-b4ff-f5260af4012e\") " pod="openstack/dnsmasq-dns-86db49b7ff-278vs" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.261789 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48jnp\" (UniqueName: \"kubernetes.io/projected/6cfc3117-277f-4c29-b4ff-f5260af4012e-kube-api-access-48jnp\") pod \"dnsmasq-dns-86db49b7ff-278vs\" (UID: \"6cfc3117-277f-4c29-b4ff-f5260af4012e\") " pod="openstack/dnsmasq-dns-86db49b7ff-278vs" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.261849 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6cfc3117-277f-4c29-b4ff-f5260af4012e-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-278vs\" (UID: \"6cfc3117-277f-4c29-b4ff-f5260af4012e\") " pod="openstack/dnsmasq-dns-86db49b7ff-278vs" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.261900 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6cfc3117-277f-4c29-b4ff-f5260af4012e-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-278vs\" (UID: \"6cfc3117-277f-4c29-b4ff-f5260af4012e\") " pod="openstack/dnsmasq-dns-86db49b7ff-278vs" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.261946 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/6cfc3117-277f-4c29-b4ff-f5260af4012e-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-278vs\" (UID: \"6cfc3117-277f-4c29-b4ff-f5260af4012e\") " pod="openstack/dnsmasq-dns-86db49b7ff-278vs" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.262019 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cfc3117-277f-4c29-b4ff-f5260af4012e-config\") pod \"dnsmasq-dns-86db49b7ff-278vs\" (UID: \"6cfc3117-277f-4c29-b4ff-f5260af4012e\") " pod="openstack/dnsmasq-dns-86db49b7ff-278vs" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.262853 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cfc3117-277f-4c29-b4ff-f5260af4012e-config\") pod \"dnsmasq-dns-86db49b7ff-278vs\" (UID: \"6cfc3117-277f-4c29-b4ff-f5260af4012e\") " pod="openstack/dnsmasq-dns-86db49b7ff-278vs" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.264354 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6cfc3117-277f-4c29-b4ff-f5260af4012e-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-278vs\" (UID: \"6cfc3117-277f-4c29-b4ff-f5260af4012e\") " pod="openstack/dnsmasq-dns-86db49b7ff-278vs" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.265438 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6cfc3117-277f-4c29-b4ff-f5260af4012e-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-278vs\" (UID: \"6cfc3117-277f-4c29-b4ff-f5260af4012e\") " pod="openstack/dnsmasq-dns-86db49b7ff-278vs" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.265852 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6cfc3117-277f-4c29-b4ff-f5260af4012e-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-278vs\" (UID: 
\"6cfc3117-277f-4c29-b4ff-f5260af4012e\") " pod="openstack/dnsmasq-dns-86db49b7ff-278vs" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.298993 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48jnp\" (UniqueName: \"kubernetes.io/projected/6cfc3117-277f-4c29-b4ff-f5260af4012e-kube-api-access-48jnp\") pod \"dnsmasq-dns-86db49b7ff-278vs\" (UID: \"6cfc3117-277f-4c29-b4ff-f5260af4012e\") " pod="openstack/dnsmasq-dns-86db49b7ff-278vs" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.371568 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-278vs" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.376769 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.408143 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-ptkjm"] Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.531263 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-kdw6l"] Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.599130 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.601218 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.604116 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-m9x72" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.604634 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.605497 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.606648 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.606639 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.782211 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d08550c-3489-4838-9aaf-49ecbcac005b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"2d08550c-3489-4838-9aaf-49ecbcac005b\") " pod="openstack/ovn-northd-0" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.782286 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2d08550c-3489-4838-9aaf-49ecbcac005b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"2d08550c-3489-4838-9aaf-49ecbcac005b\") " pod="openstack/ovn-northd-0" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.782305 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d08550c-3489-4838-9aaf-49ecbcac005b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"2d08550c-3489-4838-9aaf-49ecbcac005b\") " 
pod="openstack/ovn-northd-0" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.782365 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d08550c-3489-4838-9aaf-49ecbcac005b-scripts\") pod \"ovn-northd-0\" (UID: \"2d08550c-3489-4838-9aaf-49ecbcac005b\") " pod="openstack/ovn-northd-0" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.782396 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8lqg\" (UniqueName: \"kubernetes.io/projected/2d08550c-3489-4838-9aaf-49ecbcac005b-kube-api-access-f8lqg\") pod \"ovn-northd-0\" (UID: \"2d08550c-3489-4838-9aaf-49ecbcac005b\") " pod="openstack/ovn-northd-0" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.782445 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d08550c-3489-4838-9aaf-49ecbcac005b-config\") pod \"ovn-northd-0\" (UID: \"2d08550c-3489-4838-9aaf-49ecbcac005b\") " pod="openstack/ovn-northd-0" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.782665 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d08550c-3489-4838-9aaf-49ecbcac005b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"2d08550c-3489-4838-9aaf-49ecbcac005b\") " pod="openstack/ovn-northd-0" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.884710 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d08550c-3489-4838-9aaf-49ecbcac005b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"2d08550c-3489-4838-9aaf-49ecbcac005b\") " pod="openstack/ovn-northd-0" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.884805 4715 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2d08550c-3489-4838-9aaf-49ecbcac005b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"2d08550c-3489-4838-9aaf-49ecbcac005b\") " pod="openstack/ovn-northd-0" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.884842 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d08550c-3489-4838-9aaf-49ecbcac005b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"2d08550c-3489-4838-9aaf-49ecbcac005b\") " pod="openstack/ovn-northd-0" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.884876 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d08550c-3489-4838-9aaf-49ecbcac005b-scripts\") pod \"ovn-northd-0\" (UID: \"2d08550c-3489-4838-9aaf-49ecbcac005b\") " pod="openstack/ovn-northd-0" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.884909 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8lqg\" (UniqueName: \"kubernetes.io/projected/2d08550c-3489-4838-9aaf-49ecbcac005b-kube-api-access-f8lqg\") pod \"ovn-northd-0\" (UID: \"2d08550c-3489-4838-9aaf-49ecbcac005b\") " pod="openstack/ovn-northd-0" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.884942 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d08550c-3489-4838-9aaf-49ecbcac005b-config\") pod \"ovn-northd-0\" (UID: \"2d08550c-3489-4838-9aaf-49ecbcac005b\") " pod="openstack/ovn-northd-0" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.884974 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d08550c-3489-4838-9aaf-49ecbcac005b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: 
\"2d08550c-3489-4838-9aaf-49ecbcac005b\") " pod="openstack/ovn-northd-0" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.885936 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2d08550c-3489-4838-9aaf-49ecbcac005b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"2d08550c-3489-4838-9aaf-49ecbcac005b\") " pod="openstack/ovn-northd-0" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.886815 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d08550c-3489-4838-9aaf-49ecbcac005b-config\") pod \"ovn-northd-0\" (UID: \"2d08550c-3489-4838-9aaf-49ecbcac005b\") " pod="openstack/ovn-northd-0" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.887647 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d08550c-3489-4838-9aaf-49ecbcac005b-scripts\") pod \"ovn-northd-0\" (UID: \"2d08550c-3489-4838-9aaf-49ecbcac005b\") " pod="openstack/ovn-northd-0" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.891112 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d08550c-3489-4838-9aaf-49ecbcac005b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"2d08550c-3489-4838-9aaf-49ecbcac005b\") " pod="openstack/ovn-northd-0" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.891112 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d08550c-3489-4838-9aaf-49ecbcac005b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"2d08550c-3489-4838-9aaf-49ecbcac005b\") " pod="openstack/ovn-northd-0" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.891314 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-278vs"] Oct 09 08:02:04 crc 
kubenswrapper[4715]: I1009 08:02:04.895381 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d08550c-3489-4838-9aaf-49ecbcac005b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"2d08550c-3489-4838-9aaf-49ecbcac005b\") " pod="openstack/ovn-northd-0" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.915727 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8lqg\" (UniqueName: \"kubernetes.io/projected/2d08550c-3489-4838-9aaf-49ecbcac005b-kube-api-access-f8lqg\") pod \"ovn-northd-0\" (UID: \"2d08550c-3489-4838-9aaf-49ecbcac005b\") " pod="openstack/ovn-northd-0" Oct 09 08:02:04 crc kubenswrapper[4715]: I1009 08:02:04.932238 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 09 08:02:05 crc kubenswrapper[4715]: I1009 08:02:05.058109 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 09 08:02:05 crc kubenswrapper[4715]: I1009 08:02:05.316715 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-278vs" event={"ID":"6cfc3117-277f-4c29-b4ff-f5260af4012e","Type":"ContainerStarted","Data":"19207353cab7c53b3aaa1e01a3a3de0ff3a8753563d8bdce046cb8070e66fe47"} Oct 09 08:02:05 crc kubenswrapper[4715]: I1009 08:02:05.318490 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-ptkjm" event={"ID":"f16a1c8a-dd0e-4800-8b4f-8ae0dd86dbd2","Type":"ContainerStarted","Data":"ec895c288abcfd9cc6488c82e857058bbe252ebf2a13d1bc7377219eb3b80fd9"} Oct 09 08:02:05 crc kubenswrapper[4715]: I1009 08:02:05.318536 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-ptkjm" event={"ID":"f16a1c8a-dd0e-4800-8b4f-8ae0dd86dbd2","Type":"ContainerStarted","Data":"ed6fe014ea9e1479e29b1bdca9539d4830f09c9fee404657b0bda0846d861ddc"} Oct 09 08:02:05 crc 
kubenswrapper[4715]: I1009 08:02:05.320953 4715 generic.go:334] "Generic (PLEG): container finished" podID="1532aa52-3521-43d8-9f26-18027dbd6919" containerID="3f9b4f9d6df4ef4022081e3ea334b5b0a88e52baceeb50063e464583cb3f5a6b" exitCode=0 Oct 09 08:02:05 crc kubenswrapper[4715]: I1009 08:02:05.321587 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-kdw6l" event={"ID":"1532aa52-3521-43d8-9f26-18027dbd6919","Type":"ContainerDied","Data":"3f9b4f9d6df4ef4022081e3ea334b5b0a88e52baceeb50063e464583cb3f5a6b"} Oct 09 08:02:05 crc kubenswrapper[4715]: I1009 08:02:05.321620 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-kdw6l" event={"ID":"1532aa52-3521-43d8-9f26-18027dbd6919","Type":"ContainerStarted","Data":"d6dc53f9a37d0100db16690d914c4ef7339e4fb59beb561b56d4dde603d5fd53"} Oct 09 08:02:05 crc kubenswrapper[4715]: I1009 08:02:05.338414 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-ptkjm" podStartSLOduration=2.338391783 podStartE2EDuration="2.338391783s" podCreationTimestamp="2025-10-09 08:02:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:02:05.33245066 +0000 UTC m=+956.025254678" watchObservedRunningTime="2025-10-09 08:02:05.338391783 +0000 UTC m=+956.031195791" Oct 09 08:02:05 crc kubenswrapper[4715]: I1009 08:02:05.416303 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 09 08:02:05 crc kubenswrapper[4715]: W1009 08:02:05.421211 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d08550c_3489_4838_9aaf_49ecbcac005b.slice/crio-6f42aa5ee6bce09d651dc472caea6d03ee5a28cfe1bf2de80ac8227bff2b8bd2 WatchSource:0}: Error finding container 6f42aa5ee6bce09d651dc472caea6d03ee5a28cfe1bf2de80ac8227bff2b8bd2: 
Status 404 returned error can't find the container with id 6f42aa5ee6bce09d651dc472caea6d03ee5a28cfe1bf2de80ac8227bff2b8bd2 Oct 09 08:02:06 crc kubenswrapper[4715]: I1009 08:02:06.332968 4715 generic.go:334] "Generic (PLEG): container finished" podID="6cfc3117-277f-4c29-b4ff-f5260af4012e" containerID="2346e2213943d87604e9bbd58de6928b2d1b6ef8f11924cd2baca8a1205f2c16" exitCode=0 Oct 09 08:02:06 crc kubenswrapper[4715]: I1009 08:02:06.333069 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-278vs" event={"ID":"6cfc3117-277f-4c29-b4ff-f5260af4012e","Type":"ContainerDied","Data":"2346e2213943d87604e9bbd58de6928b2d1b6ef8f11924cd2baca8a1205f2c16"} Oct 09 08:02:06 crc kubenswrapper[4715]: I1009 08:02:06.335709 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"2d08550c-3489-4838-9aaf-49ecbcac005b","Type":"ContainerStarted","Data":"6f42aa5ee6bce09d651dc472caea6d03ee5a28cfe1bf2de80ac8227bff2b8bd2"} Oct 09 08:02:06 crc kubenswrapper[4715]: I1009 08:02:06.656671 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-kdw6l" Oct 09 08:02:06 crc kubenswrapper[4715]: I1009 08:02:06.821910 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1532aa52-3521-43d8-9f26-18027dbd6919-config\") pod \"1532aa52-3521-43d8-9f26-18027dbd6919\" (UID: \"1532aa52-3521-43d8-9f26-18027dbd6919\") " Oct 09 08:02:06 crc kubenswrapper[4715]: I1009 08:02:06.822377 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1532aa52-3521-43d8-9f26-18027dbd6919-dns-svc\") pod \"1532aa52-3521-43d8-9f26-18027dbd6919\" (UID: \"1532aa52-3521-43d8-9f26-18027dbd6919\") " Oct 09 08:02:06 crc kubenswrapper[4715]: I1009 08:02:06.822479 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1532aa52-3521-43d8-9f26-18027dbd6919-ovsdbserver-nb\") pod \"1532aa52-3521-43d8-9f26-18027dbd6919\" (UID: \"1532aa52-3521-43d8-9f26-18027dbd6919\") " Oct 09 08:02:06 crc kubenswrapper[4715]: I1009 08:02:06.822552 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7lhv\" (UniqueName: \"kubernetes.io/projected/1532aa52-3521-43d8-9f26-18027dbd6919-kube-api-access-h7lhv\") pod \"1532aa52-3521-43d8-9f26-18027dbd6919\" (UID: \"1532aa52-3521-43d8-9f26-18027dbd6919\") " Oct 09 08:02:06 crc kubenswrapper[4715]: I1009 08:02:06.826683 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1532aa52-3521-43d8-9f26-18027dbd6919-kube-api-access-h7lhv" (OuterVolumeSpecName: "kube-api-access-h7lhv") pod "1532aa52-3521-43d8-9f26-18027dbd6919" (UID: "1532aa52-3521-43d8-9f26-18027dbd6919"). InnerVolumeSpecName "kube-api-access-h7lhv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:02:06 crc kubenswrapper[4715]: I1009 08:02:06.843568 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1532aa52-3521-43d8-9f26-18027dbd6919-config" (OuterVolumeSpecName: "config") pod "1532aa52-3521-43d8-9f26-18027dbd6919" (UID: "1532aa52-3521-43d8-9f26-18027dbd6919"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:02:06 crc kubenswrapper[4715]: I1009 08:02:06.843906 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1532aa52-3521-43d8-9f26-18027dbd6919-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1532aa52-3521-43d8-9f26-18027dbd6919" (UID: "1532aa52-3521-43d8-9f26-18027dbd6919"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:02:06 crc kubenswrapper[4715]: I1009 08:02:06.850187 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1532aa52-3521-43d8-9f26-18027dbd6919-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1532aa52-3521-43d8-9f26-18027dbd6919" (UID: "1532aa52-3521-43d8-9f26-18027dbd6919"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:02:06 crc kubenswrapper[4715]: I1009 08:02:06.924960 4715 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1532aa52-3521-43d8-9f26-18027dbd6919-config\") on node \"crc\" DevicePath \"\"" Oct 09 08:02:06 crc kubenswrapper[4715]: I1009 08:02:06.924994 4715 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1532aa52-3521-43d8-9f26-18027dbd6919-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 08:02:06 crc kubenswrapper[4715]: I1009 08:02:06.925003 4715 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1532aa52-3521-43d8-9f26-18027dbd6919-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 08:02:06 crc kubenswrapper[4715]: I1009 08:02:06.925014 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7lhv\" (UniqueName: \"kubernetes.io/projected/1532aa52-3521-43d8-9f26-18027dbd6919-kube-api-access-h7lhv\") on node \"crc\" DevicePath \"\"" Oct 09 08:02:07 crc kubenswrapper[4715]: I1009 08:02:07.346819 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-kdw6l" event={"ID":"1532aa52-3521-43d8-9f26-18027dbd6919","Type":"ContainerDied","Data":"d6dc53f9a37d0100db16690d914c4ef7339e4fb59beb561b56d4dde603d5fd53"} Oct 09 08:02:07 crc kubenswrapper[4715]: I1009 08:02:07.346913 4715 scope.go:117] "RemoveContainer" containerID="3f9b4f9d6df4ef4022081e3ea334b5b0a88e52baceeb50063e464583cb3f5a6b" Oct 09 08:02:07 crc kubenswrapper[4715]: I1009 08:02:07.347104 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-kdw6l" Oct 09 08:02:07 crc kubenswrapper[4715]: I1009 08:02:07.422780 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-kdw6l"] Oct 09 08:02:07 crc kubenswrapper[4715]: I1009 08:02:07.432148 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-kdw6l"] Oct 09 08:02:08 crc kubenswrapper[4715]: I1009 08:02:08.150814 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1532aa52-3521-43d8-9f26-18027dbd6919" path="/var/lib/kubelet/pods/1532aa52-3521-43d8-9f26-18027dbd6919/volumes" Oct 09 08:02:08 crc kubenswrapper[4715]: I1009 08:02:08.356037 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-278vs" event={"ID":"6cfc3117-277f-4c29-b4ff-f5260af4012e","Type":"ContainerStarted","Data":"22964816d15c85b292d0e221cf4f9ffe2bad50b183c022ca4eddec5eb00ebbdb"} Oct 09 08:02:08 crc kubenswrapper[4715]: I1009 08:02:08.448066 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 09 08:02:08 crc kubenswrapper[4715]: I1009 08:02:08.448114 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 09 08:02:09 crc kubenswrapper[4715]: I1009 08:02:09.514896 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 09 08:02:09 crc kubenswrapper[4715]: I1009 08:02:09.514966 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 09 08:02:11 crc kubenswrapper[4715]: I1009 08:02:11.718075 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 09 08:02:11 crc kubenswrapper[4715]: I1009 08:02:11.908356 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-278vs"] Oct 
09 08:02:11 crc kubenswrapper[4715]: I1009 08:02:11.980586 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-bwkng"] Oct 09 08:02:11 crc kubenswrapper[4715]: E1009 08:02:11.981295 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1532aa52-3521-43d8-9f26-18027dbd6919" containerName="init" Oct 09 08:02:11 crc kubenswrapper[4715]: I1009 08:02:11.981321 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="1532aa52-3521-43d8-9f26-18027dbd6919" containerName="init" Oct 09 08:02:11 crc kubenswrapper[4715]: I1009 08:02:11.981546 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="1532aa52-3521-43d8-9f26-18027dbd6919" containerName="init" Oct 09 08:02:11 crc kubenswrapper[4715]: I1009 08:02:11.982636 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-bwkng" Oct 09 08:02:11 crc kubenswrapper[4715]: I1009 08:02:11.998497 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-bwkng"] Oct 09 08:02:12 crc kubenswrapper[4715]: I1009 08:02:12.117826 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b91a705b-50c1-40cd-ad0f-3a58b1eca640-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-bwkng\" (UID: \"b91a705b-50c1-40cd-ad0f-3a58b1eca640\") " pod="openstack/dnsmasq-dns-698758b865-bwkng" Oct 09 08:02:12 crc kubenswrapper[4715]: I1009 08:02:12.117874 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b91a705b-50c1-40cd-ad0f-3a58b1eca640-config\") pod \"dnsmasq-dns-698758b865-bwkng\" (UID: \"b91a705b-50c1-40cd-ad0f-3a58b1eca640\") " pod="openstack/dnsmasq-dns-698758b865-bwkng" Oct 09 08:02:12 crc kubenswrapper[4715]: I1009 08:02:12.117928 4715 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b91a705b-50c1-40cd-ad0f-3a58b1eca640-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-bwkng\" (UID: \"b91a705b-50c1-40cd-ad0f-3a58b1eca640\") " pod="openstack/dnsmasq-dns-698758b865-bwkng" Oct 09 08:02:12 crc kubenswrapper[4715]: I1009 08:02:12.118190 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw6z6\" (UniqueName: \"kubernetes.io/projected/b91a705b-50c1-40cd-ad0f-3a58b1eca640-kube-api-access-cw6z6\") pod \"dnsmasq-dns-698758b865-bwkng\" (UID: \"b91a705b-50c1-40cd-ad0f-3a58b1eca640\") " pod="openstack/dnsmasq-dns-698758b865-bwkng" Oct 09 08:02:12 crc kubenswrapper[4715]: I1009 08:02:12.118336 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b91a705b-50c1-40cd-ad0f-3a58b1eca640-dns-svc\") pod \"dnsmasq-dns-698758b865-bwkng\" (UID: \"b91a705b-50c1-40cd-ad0f-3a58b1eca640\") " pod="openstack/dnsmasq-dns-698758b865-bwkng" Oct 09 08:02:12 crc kubenswrapper[4715]: I1009 08:02:12.220374 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw6z6\" (UniqueName: \"kubernetes.io/projected/b91a705b-50c1-40cd-ad0f-3a58b1eca640-kube-api-access-cw6z6\") pod \"dnsmasq-dns-698758b865-bwkng\" (UID: \"b91a705b-50c1-40cd-ad0f-3a58b1eca640\") " pod="openstack/dnsmasq-dns-698758b865-bwkng" Oct 09 08:02:12 crc kubenswrapper[4715]: I1009 08:02:12.220502 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b91a705b-50c1-40cd-ad0f-3a58b1eca640-dns-svc\") pod \"dnsmasq-dns-698758b865-bwkng\" (UID: \"b91a705b-50c1-40cd-ad0f-3a58b1eca640\") " pod="openstack/dnsmasq-dns-698758b865-bwkng" Oct 09 08:02:12 crc kubenswrapper[4715]: I1009 08:02:12.220583 4715 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b91a705b-50c1-40cd-ad0f-3a58b1eca640-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-bwkng\" (UID: \"b91a705b-50c1-40cd-ad0f-3a58b1eca640\") " pod="openstack/dnsmasq-dns-698758b865-bwkng" Oct 09 08:02:12 crc kubenswrapper[4715]: I1009 08:02:12.220631 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b91a705b-50c1-40cd-ad0f-3a58b1eca640-config\") pod \"dnsmasq-dns-698758b865-bwkng\" (UID: \"b91a705b-50c1-40cd-ad0f-3a58b1eca640\") " pod="openstack/dnsmasq-dns-698758b865-bwkng" Oct 09 08:02:12 crc kubenswrapper[4715]: I1009 08:02:12.220661 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b91a705b-50c1-40cd-ad0f-3a58b1eca640-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-bwkng\" (UID: \"b91a705b-50c1-40cd-ad0f-3a58b1eca640\") " pod="openstack/dnsmasq-dns-698758b865-bwkng" Oct 09 08:02:12 crc kubenswrapper[4715]: I1009 08:02:12.222639 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b91a705b-50c1-40cd-ad0f-3a58b1eca640-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-bwkng\" (UID: \"b91a705b-50c1-40cd-ad0f-3a58b1eca640\") " pod="openstack/dnsmasq-dns-698758b865-bwkng" Oct 09 08:02:12 crc kubenswrapper[4715]: I1009 08:02:12.222698 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b91a705b-50c1-40cd-ad0f-3a58b1eca640-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-bwkng\" (UID: \"b91a705b-50c1-40cd-ad0f-3a58b1eca640\") " pod="openstack/dnsmasq-dns-698758b865-bwkng" Oct 09 08:02:12 crc kubenswrapper[4715]: I1009 08:02:12.222860 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/b91a705b-50c1-40cd-ad0f-3a58b1eca640-config\") pod \"dnsmasq-dns-698758b865-bwkng\" (UID: \"b91a705b-50c1-40cd-ad0f-3a58b1eca640\") " pod="openstack/dnsmasq-dns-698758b865-bwkng" Oct 09 08:02:12 crc kubenswrapper[4715]: I1009 08:02:12.222997 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b91a705b-50c1-40cd-ad0f-3a58b1eca640-dns-svc\") pod \"dnsmasq-dns-698758b865-bwkng\" (UID: \"b91a705b-50c1-40cd-ad0f-3a58b1eca640\") " pod="openstack/dnsmasq-dns-698758b865-bwkng" Oct 09 08:02:12 crc kubenswrapper[4715]: I1009 08:02:12.250762 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw6z6\" (UniqueName: \"kubernetes.io/projected/b91a705b-50c1-40cd-ad0f-3a58b1eca640-kube-api-access-cw6z6\") pod \"dnsmasq-dns-698758b865-bwkng\" (UID: \"b91a705b-50c1-40cd-ad0f-3a58b1eca640\") " pod="openstack/dnsmasq-dns-698758b865-bwkng" Oct 09 08:02:12 crc kubenswrapper[4715]: I1009 08:02:12.303400 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-bwkng" Oct 09 08:02:12 crc kubenswrapper[4715]: I1009 08:02:12.823053 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-bwkng"] Oct 09 08:02:12 crc kubenswrapper[4715]: W1009 08:02:12.845673 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb91a705b_50c1_40cd_ad0f_3a58b1eca640.slice/crio-5e4f88ee75b68f391a05edf55811ed64e03f5dafc566ad040065853b328853fb WatchSource:0}: Error finding container 5e4f88ee75b68f391a05edf55811ed64e03f5dafc566ad040065853b328853fb: Status 404 returned error can't find the container with id 5e4f88ee75b68f391a05edf55811ed64e03f5dafc566ad040065853b328853fb Oct 09 08:02:12 crc kubenswrapper[4715]: I1009 08:02:12.870364 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 09 08:02:12 crc kubenswrapper[4715]: I1009 08:02:12.890934 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 09 08:02:12 crc kubenswrapper[4715]: I1009 08:02:12.932361 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-8mhs9" Oct 09 08:02:12 crc kubenswrapper[4715]: I1009 08:02:12.933222 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 09 08:02:12 crc kubenswrapper[4715]: I1009 08:02:12.933635 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 09 08:02:12 crc kubenswrapper[4715]: I1009 08:02:12.940154 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.063306 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4d6a5f2b-d77d-41c9-8b7d-e2e62c157577-lock\") pod \"swift-storage-0\" (UID: \"4d6a5f2b-d77d-41c9-8b7d-e2e62c157577\") " pod="openstack/swift-storage-0" Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.063396 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"4d6a5f2b-d77d-41c9-8b7d-e2e62c157577\") " pod="openstack/swift-storage-0" Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.063995 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d6a5f2b-d77d-41c9-8b7d-e2e62c157577-etc-swift\") pod \"swift-storage-0\" (UID: \"4d6a5f2b-d77d-41c9-8b7d-e2e62c157577\") " pod="openstack/swift-storage-0" Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.064038 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: 
\"kubernetes.io/empty-dir/4d6a5f2b-d77d-41c9-8b7d-e2e62c157577-cache\") pod \"swift-storage-0\" (UID: \"4d6a5f2b-d77d-41c9-8b7d-e2e62c157577\") " pod="openstack/swift-storage-0" Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.064077 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w84hr\" (UniqueName: \"kubernetes.io/projected/4d6a5f2b-d77d-41c9-8b7d-e2e62c157577-kube-api-access-w84hr\") pod \"swift-storage-0\" (UID: \"4d6a5f2b-d77d-41c9-8b7d-e2e62c157577\") " pod="openstack/swift-storage-0" Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.078609 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.127517 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-gjkhs"] Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.128817 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-gjkhs" Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.136973 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.137009 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.137218 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.163622 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-gjkhs"] Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.165163 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4d6a5f2b-d77d-41c9-8b7d-e2e62c157577-lock\") pod \"swift-storage-0\" (UID: 
\"4d6a5f2b-d77d-41c9-8b7d-e2e62c157577\") " pod="openstack/swift-storage-0" Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.165231 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"4d6a5f2b-d77d-41c9-8b7d-e2e62c157577\") " pod="openstack/swift-storage-0" Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.165318 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d6a5f2b-d77d-41c9-8b7d-e2e62c157577-etc-swift\") pod \"swift-storage-0\" (UID: \"4d6a5f2b-d77d-41c9-8b7d-e2e62c157577\") " pod="openstack/swift-storage-0" Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.165356 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4d6a5f2b-d77d-41c9-8b7d-e2e62c157577-cache\") pod \"swift-storage-0\" (UID: \"4d6a5f2b-d77d-41c9-8b7d-e2e62c157577\") " pod="openstack/swift-storage-0" Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.165383 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w84hr\" (UniqueName: \"kubernetes.io/projected/4d6a5f2b-d77d-41c9-8b7d-e2e62c157577-kube-api-access-w84hr\") pod \"swift-storage-0\" (UID: \"4d6a5f2b-d77d-41c9-8b7d-e2e62c157577\") " pod="openstack/swift-storage-0" Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.166355 4715 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"4d6a5f2b-d77d-41c9-8b7d-e2e62c157577\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/swift-storage-0" Oct 09 08:02:13 crc kubenswrapper[4715]: E1009 08:02:13.166404 4715 projected.go:288] Couldn't get configMap 
openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 09 08:02:13 crc kubenswrapper[4715]: E1009 08:02:13.166439 4715 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 09 08:02:13 crc kubenswrapper[4715]: E1009 08:02:13.166498 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4d6a5f2b-d77d-41c9-8b7d-e2e62c157577-etc-swift podName:4d6a5f2b-d77d-41c9-8b7d-e2e62c157577 nodeName:}" failed. No retries permitted until 2025-10-09 08:02:13.666476572 +0000 UTC m=+964.359280660 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4d6a5f2b-d77d-41c9-8b7d-e2e62c157577-etc-swift") pod "swift-storage-0" (UID: "4d6a5f2b-d77d-41c9-8b7d-e2e62c157577") : configmap "swift-ring-files" not found Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.167091 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4d6a5f2b-d77d-41c9-8b7d-e2e62c157577-cache\") pod \"swift-storage-0\" (UID: \"4d6a5f2b-d77d-41c9-8b7d-e2e62c157577\") " pod="openstack/swift-storage-0" Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.169517 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.177234 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4d6a5f2b-d77d-41c9-8b7d-e2e62c157577-lock\") pod \"swift-storage-0\" (UID: \"4d6a5f2b-d77d-41c9-8b7d-e2e62c157577\") " pod="openstack/swift-storage-0" Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.185630 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w84hr\" (UniqueName: 
\"kubernetes.io/projected/4d6a5f2b-d77d-41c9-8b7d-e2e62c157577-kube-api-access-w84hr\") pod \"swift-storage-0\" (UID: \"4d6a5f2b-d77d-41c9-8b7d-e2e62c157577\") " pod="openstack/swift-storage-0" Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.191679 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"4d6a5f2b-d77d-41c9-8b7d-e2e62c157577\") " pod="openstack/swift-storage-0" Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.246928 4715 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="72f969cd-b504-4db1-832a-1e0c7f0a3b7b" containerName="galera" probeResult="failure" output=< Oct 09 08:02:13 crc kubenswrapper[4715]: wsrep_local_state_comment (Joined) differs from Synced Oct 09 08:02:13 crc kubenswrapper[4715]: > Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.266749 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kccqv\" (UniqueName: \"kubernetes.io/projected/377b8455-bb97-4be8-977a-191578be267c-kube-api-access-kccqv\") pod \"swift-ring-rebalance-gjkhs\" (UID: \"377b8455-bb97-4be8-977a-191578be267c\") " pod="openstack/swift-ring-rebalance-gjkhs" Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.266810 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/377b8455-bb97-4be8-977a-191578be267c-combined-ca-bundle\") pod \"swift-ring-rebalance-gjkhs\" (UID: \"377b8455-bb97-4be8-977a-191578be267c\") " pod="openstack/swift-ring-rebalance-gjkhs" Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.266831 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/377b8455-bb97-4be8-977a-191578be267c-etc-swift\") pod \"swift-ring-rebalance-gjkhs\" (UID: \"377b8455-bb97-4be8-977a-191578be267c\") " pod="openstack/swift-ring-rebalance-gjkhs" Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.266941 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/377b8455-bb97-4be8-977a-191578be267c-scripts\") pod \"swift-ring-rebalance-gjkhs\" (UID: \"377b8455-bb97-4be8-977a-191578be267c\") " pod="openstack/swift-ring-rebalance-gjkhs" Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.266970 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/377b8455-bb97-4be8-977a-191578be267c-dispersionconf\") pod \"swift-ring-rebalance-gjkhs\" (UID: \"377b8455-bb97-4be8-977a-191578be267c\") " pod="openstack/swift-ring-rebalance-gjkhs" Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.266995 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/377b8455-bb97-4be8-977a-191578be267c-ring-data-devices\") pod \"swift-ring-rebalance-gjkhs\" (UID: \"377b8455-bb97-4be8-977a-191578be267c\") " pod="openstack/swift-ring-rebalance-gjkhs" Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.267017 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/377b8455-bb97-4be8-977a-191578be267c-swiftconf\") pod \"swift-ring-rebalance-gjkhs\" (UID: \"377b8455-bb97-4be8-977a-191578be267c\") " pod="openstack/swift-ring-rebalance-gjkhs" Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.368588 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kccqv\" (UniqueName: 
\"kubernetes.io/projected/377b8455-bb97-4be8-977a-191578be267c-kube-api-access-kccqv\") pod \"swift-ring-rebalance-gjkhs\" (UID: \"377b8455-bb97-4be8-977a-191578be267c\") " pod="openstack/swift-ring-rebalance-gjkhs" Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.369003 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/377b8455-bb97-4be8-977a-191578be267c-combined-ca-bundle\") pod \"swift-ring-rebalance-gjkhs\" (UID: \"377b8455-bb97-4be8-977a-191578be267c\") " pod="openstack/swift-ring-rebalance-gjkhs" Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.369032 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/377b8455-bb97-4be8-977a-191578be267c-etc-swift\") pod \"swift-ring-rebalance-gjkhs\" (UID: \"377b8455-bb97-4be8-977a-191578be267c\") " pod="openstack/swift-ring-rebalance-gjkhs" Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.369147 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/377b8455-bb97-4be8-977a-191578be267c-scripts\") pod \"swift-ring-rebalance-gjkhs\" (UID: \"377b8455-bb97-4be8-977a-191578be267c\") " pod="openstack/swift-ring-rebalance-gjkhs" Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.369207 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/377b8455-bb97-4be8-977a-191578be267c-dispersionconf\") pod \"swift-ring-rebalance-gjkhs\" (UID: \"377b8455-bb97-4be8-977a-191578be267c\") " pod="openstack/swift-ring-rebalance-gjkhs" Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.369247 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/377b8455-bb97-4be8-977a-191578be267c-ring-data-devices\") 
pod \"swift-ring-rebalance-gjkhs\" (UID: \"377b8455-bb97-4be8-977a-191578be267c\") " pod="openstack/swift-ring-rebalance-gjkhs" Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.369277 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/377b8455-bb97-4be8-977a-191578be267c-swiftconf\") pod \"swift-ring-rebalance-gjkhs\" (UID: \"377b8455-bb97-4be8-977a-191578be267c\") " pod="openstack/swift-ring-rebalance-gjkhs" Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.370022 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/377b8455-bb97-4be8-977a-191578be267c-etc-swift\") pod \"swift-ring-rebalance-gjkhs\" (UID: \"377b8455-bb97-4be8-977a-191578be267c\") " pod="openstack/swift-ring-rebalance-gjkhs" Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.370391 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/377b8455-bb97-4be8-977a-191578be267c-scripts\") pod \"swift-ring-rebalance-gjkhs\" (UID: \"377b8455-bb97-4be8-977a-191578be267c\") " pod="openstack/swift-ring-rebalance-gjkhs" Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.370574 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/377b8455-bb97-4be8-977a-191578be267c-ring-data-devices\") pod \"swift-ring-rebalance-gjkhs\" (UID: \"377b8455-bb97-4be8-977a-191578be267c\") " pod="openstack/swift-ring-rebalance-gjkhs" Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.372636 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/377b8455-bb97-4be8-977a-191578be267c-combined-ca-bundle\") pod \"swift-ring-rebalance-gjkhs\" (UID: \"377b8455-bb97-4be8-977a-191578be267c\") " pod="openstack/swift-ring-rebalance-gjkhs" Oct 
09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.372805 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/377b8455-bb97-4be8-977a-191578be267c-dispersionconf\") pod \"swift-ring-rebalance-gjkhs\" (UID: \"377b8455-bb97-4be8-977a-191578be267c\") " pod="openstack/swift-ring-rebalance-gjkhs" Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.373059 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/377b8455-bb97-4be8-977a-191578be267c-swiftconf\") pod \"swift-ring-rebalance-gjkhs\" (UID: \"377b8455-bb97-4be8-977a-191578be267c\") " pod="openstack/swift-ring-rebalance-gjkhs" Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.388599 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kccqv\" (UniqueName: \"kubernetes.io/projected/377b8455-bb97-4be8-977a-191578be267c-kube-api-access-kccqv\") pod \"swift-ring-rebalance-gjkhs\" (UID: \"377b8455-bb97-4be8-977a-191578be267c\") " pod="openstack/swift-ring-rebalance-gjkhs" Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.416780 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-bwkng" event={"ID":"b91a705b-50c1-40cd-ad0f-3a58b1eca640","Type":"ContainerStarted","Data":"5e4f88ee75b68f391a05edf55811ed64e03f5dafc566ad040065853b328853fb"} Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.464019 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-gjkhs" Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.676072 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d6a5f2b-d77d-41c9-8b7d-e2e62c157577-etc-swift\") pod \"swift-storage-0\" (UID: \"4d6a5f2b-d77d-41c9-8b7d-e2e62c157577\") " pod="openstack/swift-storage-0" Oct 09 08:02:13 crc kubenswrapper[4715]: E1009 08:02:13.676970 4715 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 09 08:02:13 crc kubenswrapper[4715]: E1009 08:02:13.677004 4715 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 09 08:02:13 crc kubenswrapper[4715]: E1009 08:02:13.677071 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4d6a5f2b-d77d-41c9-8b7d-e2e62c157577-etc-swift podName:4d6a5f2b-d77d-41c9-8b7d-e2e62c157577 nodeName:}" failed. No retries permitted until 2025-10-09 08:02:14.677052722 +0000 UTC m=+965.369856750 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4d6a5f2b-d77d-41c9-8b7d-e2e62c157577-etc-swift") pod "swift-storage-0" (UID: "4d6a5f2b-d77d-41c9-8b7d-e2e62c157577") : configmap "swift-ring-files" not found Oct 09 08:02:13 crc kubenswrapper[4715]: I1009 08:02:13.962152 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-gjkhs"] Oct 09 08:02:14 crc kubenswrapper[4715]: I1009 08:02:14.427117 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-gjkhs" event={"ID":"377b8455-bb97-4be8-977a-191578be267c","Type":"ContainerStarted","Data":"a2734577f580fe7e33c1ca6cdd5979106361c69ad8470d9fb944e14d43feb51b"} Oct 09 08:02:14 crc kubenswrapper[4715]: I1009 08:02:14.429147 4715 generic.go:334] "Generic (PLEG): container finished" podID="b91a705b-50c1-40cd-ad0f-3a58b1eca640" containerID="01236f81305c25c2f5019aa77a69eebb4ba31cc67f11852d60b4ab81260d1b89" exitCode=0 Oct 09 08:02:14 crc kubenswrapper[4715]: I1009 08:02:14.429211 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-bwkng" event={"ID":"b91a705b-50c1-40cd-ad0f-3a58b1eca640","Type":"ContainerDied","Data":"01236f81305c25c2f5019aa77a69eebb4ba31cc67f11852d60b4ab81260d1b89"} Oct 09 08:02:14 crc kubenswrapper[4715]: I1009 08:02:14.429598 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-278vs" Oct 09 08:02:14 crc kubenswrapper[4715]: I1009 08:02:14.429714 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-278vs" podUID="6cfc3117-277f-4c29-b4ff-f5260af4012e" containerName="dnsmasq-dns" containerID="cri-o://22964816d15c85b292d0e221cf4f9ffe2bad50b183c022ca4eddec5eb00ebbdb" gracePeriod=10 Oct 09 08:02:14 crc kubenswrapper[4715]: I1009 08:02:14.433546 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-86db49b7ff-278vs" Oct 09 08:02:14 crc kubenswrapper[4715]: I1009 08:02:14.481038 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-278vs" podStartSLOduration=11.48101897 podStartE2EDuration="11.48101897s" podCreationTimestamp="2025-10-09 08:02:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:02:14.472495231 +0000 UTC m=+965.165299249" watchObservedRunningTime="2025-10-09 08:02:14.48101897 +0000 UTC m=+965.173822978" Oct 09 08:02:14 crc kubenswrapper[4715]: I1009 08:02:14.692829 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d6a5f2b-d77d-41c9-8b7d-e2e62c157577-etc-swift\") pod \"swift-storage-0\" (UID: \"4d6a5f2b-d77d-41c9-8b7d-e2e62c157577\") " pod="openstack/swift-storage-0" Oct 09 08:02:14 crc kubenswrapper[4715]: E1009 08:02:14.692901 4715 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 09 08:02:14 crc kubenswrapper[4715]: E1009 08:02:14.692936 4715 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 09 08:02:14 crc kubenswrapper[4715]: E1009 08:02:14.692997 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4d6a5f2b-d77d-41c9-8b7d-e2e62c157577-etc-swift podName:4d6a5f2b-d77d-41c9-8b7d-e2e62c157577 nodeName:}" failed. No retries permitted until 2025-10-09 08:02:16.692980252 +0000 UTC m=+967.385784260 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4d6a5f2b-d77d-41c9-8b7d-e2e62c157577-etc-swift") pod "swift-storage-0" (UID: "4d6a5f2b-d77d-41c9-8b7d-e2e62c157577") : configmap "swift-ring-files" not found Oct 09 08:02:15 crc kubenswrapper[4715]: I1009 08:02:15.437733 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-bwkng" event={"ID":"b91a705b-50c1-40cd-ad0f-3a58b1eca640","Type":"ContainerStarted","Data":"2abf2419f3d47ad5856818abfd9ccf780124d7b0ea3b31ee9ef1857135a2ea0c"} Oct 09 08:02:15 crc kubenswrapper[4715]: I1009 08:02:15.439649 4715 generic.go:334] "Generic (PLEG): container finished" podID="6cfc3117-277f-4c29-b4ff-f5260af4012e" containerID="22964816d15c85b292d0e221cf4f9ffe2bad50b183c022ca4eddec5eb00ebbdb" exitCode=0 Oct 09 08:02:15 crc kubenswrapper[4715]: I1009 08:02:15.439697 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-278vs" event={"ID":"6cfc3117-277f-4c29-b4ff-f5260af4012e","Type":"ContainerDied","Data":"22964816d15c85b292d0e221cf4f9ffe2bad50b183c022ca4eddec5eb00ebbdb"} Oct 09 08:02:16 crc kubenswrapper[4715]: I1009 08:02:16.014139 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-278vs" Oct 09 08:02:16 crc kubenswrapper[4715]: I1009 08:02:16.119043 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6cfc3117-277f-4c29-b4ff-f5260af4012e-ovsdbserver-sb\") pod \"6cfc3117-277f-4c29-b4ff-f5260af4012e\" (UID: \"6cfc3117-277f-4c29-b4ff-f5260af4012e\") " Oct 09 08:02:16 crc kubenswrapper[4715]: I1009 08:02:16.119138 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6cfc3117-277f-4c29-b4ff-f5260af4012e-ovsdbserver-nb\") pod \"6cfc3117-277f-4c29-b4ff-f5260af4012e\" (UID: \"6cfc3117-277f-4c29-b4ff-f5260af4012e\") " Oct 09 08:02:16 crc kubenswrapper[4715]: I1009 08:02:16.119193 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48jnp\" (UniqueName: \"kubernetes.io/projected/6cfc3117-277f-4c29-b4ff-f5260af4012e-kube-api-access-48jnp\") pod \"6cfc3117-277f-4c29-b4ff-f5260af4012e\" (UID: \"6cfc3117-277f-4c29-b4ff-f5260af4012e\") " Oct 09 08:02:16 crc kubenswrapper[4715]: I1009 08:02:16.119285 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6cfc3117-277f-4c29-b4ff-f5260af4012e-dns-svc\") pod \"6cfc3117-277f-4c29-b4ff-f5260af4012e\" (UID: \"6cfc3117-277f-4c29-b4ff-f5260af4012e\") " Oct 09 08:02:16 crc kubenswrapper[4715]: I1009 08:02:16.119326 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cfc3117-277f-4c29-b4ff-f5260af4012e-config\") pod \"6cfc3117-277f-4c29-b4ff-f5260af4012e\" (UID: \"6cfc3117-277f-4c29-b4ff-f5260af4012e\") " Oct 09 08:02:16 crc kubenswrapper[4715]: I1009 08:02:16.138536 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/6cfc3117-277f-4c29-b4ff-f5260af4012e-kube-api-access-48jnp" (OuterVolumeSpecName: "kube-api-access-48jnp") pod "6cfc3117-277f-4c29-b4ff-f5260af4012e" (UID: "6cfc3117-277f-4c29-b4ff-f5260af4012e"). InnerVolumeSpecName "kube-api-access-48jnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:02:16 crc kubenswrapper[4715]: I1009 08:02:16.161786 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cfc3117-277f-4c29-b4ff-f5260af4012e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6cfc3117-277f-4c29-b4ff-f5260af4012e" (UID: "6cfc3117-277f-4c29-b4ff-f5260af4012e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:02:16 crc kubenswrapper[4715]: I1009 08:02:16.166613 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cfc3117-277f-4c29-b4ff-f5260af4012e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6cfc3117-277f-4c29-b4ff-f5260af4012e" (UID: "6cfc3117-277f-4c29-b4ff-f5260af4012e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:02:16 crc kubenswrapper[4715]: I1009 08:02:16.168189 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cfc3117-277f-4c29-b4ff-f5260af4012e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6cfc3117-277f-4c29-b4ff-f5260af4012e" (UID: "6cfc3117-277f-4c29-b4ff-f5260af4012e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:02:16 crc kubenswrapper[4715]: I1009 08:02:16.189691 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cfc3117-277f-4c29-b4ff-f5260af4012e-config" (OuterVolumeSpecName: "config") pod "6cfc3117-277f-4c29-b4ff-f5260af4012e" (UID: "6cfc3117-277f-4c29-b4ff-f5260af4012e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:02:16 crc kubenswrapper[4715]: I1009 08:02:16.222058 4715 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6cfc3117-277f-4c29-b4ff-f5260af4012e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 08:02:16 crc kubenswrapper[4715]: I1009 08:02:16.222100 4715 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6cfc3117-277f-4c29-b4ff-f5260af4012e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 08:02:16 crc kubenswrapper[4715]: I1009 08:02:16.222121 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48jnp\" (UniqueName: \"kubernetes.io/projected/6cfc3117-277f-4c29-b4ff-f5260af4012e-kube-api-access-48jnp\") on node \"crc\" DevicePath \"\"" Oct 09 08:02:16 crc kubenswrapper[4715]: I1009 08:02:16.222141 4715 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6cfc3117-277f-4c29-b4ff-f5260af4012e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 08:02:16 crc kubenswrapper[4715]: I1009 08:02:16.222159 4715 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cfc3117-277f-4c29-b4ff-f5260af4012e-config\") on node \"crc\" DevicePath \"\"" Oct 09 08:02:16 crc kubenswrapper[4715]: I1009 08:02:16.451245 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-278vs" Oct 09 08:02:16 crc kubenswrapper[4715]: I1009 08:02:16.451504 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-278vs" event={"ID":"6cfc3117-277f-4c29-b4ff-f5260af4012e","Type":"ContainerDied","Data":"19207353cab7c53b3aaa1e01a3a3de0ff3a8753563d8bdce046cb8070e66fe47"} Oct 09 08:02:16 crc kubenswrapper[4715]: I1009 08:02:16.451585 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-bwkng" Oct 09 08:02:16 crc kubenswrapper[4715]: I1009 08:02:16.451628 4715 scope.go:117] "RemoveContainer" containerID="22964816d15c85b292d0e221cf4f9ffe2bad50b183c022ca4eddec5eb00ebbdb" Oct 09 08:02:16 crc kubenswrapper[4715]: I1009 08:02:16.480497 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-bwkng" podStartSLOduration=5.480469601 podStartE2EDuration="5.480469601s" podCreationTimestamp="2025-10-09 08:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:02:16.478588087 +0000 UTC m=+967.171392135" watchObservedRunningTime="2025-10-09 08:02:16.480469601 +0000 UTC m=+967.173273649" Oct 09 08:02:16 crc kubenswrapper[4715]: I1009 08:02:16.495604 4715 scope.go:117] "RemoveContainer" containerID="2346e2213943d87604e9bbd58de6928b2d1b6ef8f11924cd2baca8a1205f2c16" Oct 09 08:02:16 crc kubenswrapper[4715]: I1009 08:02:16.511920 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-278vs"] Oct 09 08:02:16 crc kubenswrapper[4715]: I1009 08:02:16.517480 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-278vs"] Oct 09 08:02:16 crc kubenswrapper[4715]: I1009 08:02:16.729969 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/4d6a5f2b-d77d-41c9-8b7d-e2e62c157577-etc-swift\") pod \"swift-storage-0\" (UID: \"4d6a5f2b-d77d-41c9-8b7d-e2e62c157577\") " pod="openstack/swift-storage-0"
Oct 09 08:02:16 crc kubenswrapper[4715]: E1009 08:02:16.730444 4715 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Oct 09 08:02:16 crc kubenswrapper[4715]: E1009 08:02:16.730466 4715 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Oct 09 08:02:16 crc kubenswrapper[4715]: E1009 08:02:16.730524 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4d6a5f2b-d77d-41c9-8b7d-e2e62c157577-etc-swift podName:4d6a5f2b-d77d-41c9-8b7d-e2e62c157577 nodeName:}" failed. No retries permitted until 2025-10-09 08:02:20.730504694 +0000 UTC m=+971.423308702 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4d6a5f2b-d77d-41c9-8b7d-e2e62c157577-etc-swift") pod "swift-storage-0" (UID: "4d6a5f2b-d77d-41c9-8b7d-e2e62c157577") : configmap "swift-ring-files" not found
Oct 09 08:02:16 crc kubenswrapper[4715]: I1009 08:02:16.753945 4715 patch_prober.go:28] interesting pod/machine-config-daemon-k7vwx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 09 08:02:16 crc kubenswrapper[4715]: I1009 08:02:16.754017 4715 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 09 08:02:18 crc kubenswrapper[4715]: I1009 08:02:18.146995 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cfc3117-277f-4c29-b4ff-f5260af4012e" path="/var/lib/kubelet/pods/6cfc3117-277f-4c29-b4ff-f5260af4012e/volumes"
Oct 09 08:02:18 crc kubenswrapper[4715]: I1009 08:02:18.380627 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Oct 09 08:02:18 crc kubenswrapper[4715]: I1009 08:02:18.440950 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Oct 09 08:02:19 crc kubenswrapper[4715]: I1009 08:02:19.490055 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"2d08550c-3489-4838-9aaf-49ecbcac005b","Type":"ContainerStarted","Data":"3007be80e8742c47d944d18ab3df19ad2d6a3c89db23ad387d0e0454210be597"}
Oct 09 08:02:19 crc kubenswrapper[4715]: I1009 08:02:19.490955 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Oct 09 08:02:19 crc kubenswrapper[4715]: I1009 08:02:19.490969 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"2d08550c-3489-4838-9aaf-49ecbcac005b","Type":"ContainerStarted","Data":"e709de374f7e2c4b92d6262b1a56972a5a2f1cc62ecc364434f9d55c9050f204"}
Oct 09 08:02:19 crc kubenswrapper[4715]: I1009 08:02:19.508973 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.563196275 podStartE2EDuration="15.508949786s" podCreationTimestamp="2025-10-09 08:02:04 +0000 UTC" firstStartedPulling="2025-10-09 08:02:05.423543737 +0000 UTC m=+956.116347745" lastFinishedPulling="2025-10-09 08:02:18.369297248 +0000 UTC m=+969.062101256" observedRunningTime="2025-10-09 08:02:19.507685899 +0000 UTC m=+970.200489907" watchObservedRunningTime="2025-10-09 08:02:19.508949786 +0000 UTC m=+970.201753784"
Oct 09 08:02:19 crc kubenswrapper[4715]: I1009 08:02:19.560668 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Oct 09 08:02:19 crc kubenswrapper[4715]: I1009 08:02:19.707725 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-cqpnw"]
Oct 09 08:02:19 crc kubenswrapper[4715]: E1009 08:02:19.708084 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cfc3117-277f-4c29-b4ff-f5260af4012e" containerName="init"
Oct 09 08:02:19 crc kubenswrapper[4715]: I1009 08:02:19.708106 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cfc3117-277f-4c29-b4ff-f5260af4012e" containerName="init"
Oct 09 08:02:19 crc kubenswrapper[4715]: E1009 08:02:19.708131 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cfc3117-277f-4c29-b4ff-f5260af4012e" containerName="dnsmasq-dns"
Oct 09 08:02:19 crc kubenswrapper[4715]: I1009 08:02:19.708138 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cfc3117-277f-4c29-b4ff-f5260af4012e" containerName="dnsmasq-dns"
Oct 09 08:02:19 crc kubenswrapper[4715]: I1009 08:02:19.708293 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cfc3117-277f-4c29-b4ff-f5260af4012e" containerName="dnsmasq-dns"
Oct 09 08:02:19 crc kubenswrapper[4715]: I1009 08:02:19.708878 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-cqpnw"
Oct 09 08:02:19 crc kubenswrapper[4715]: I1009 08:02:19.715001 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-cqpnw"]
Oct 09 08:02:19 crc kubenswrapper[4715]: I1009 08:02:19.898410 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmn4b\" (UniqueName: \"kubernetes.io/projected/b4186809-cc3c-4cfd-a6f2-4888990a3251-kube-api-access-rmn4b\") pod \"keystone-db-create-cqpnw\" (UID: \"b4186809-cc3c-4cfd-a6f2-4888990a3251\") " pod="openstack/keystone-db-create-cqpnw"
Oct 09 08:02:19 crc kubenswrapper[4715]: I1009 08:02:19.967042 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-rxhd5"]
Oct 09 08:02:19 crc kubenswrapper[4715]: I1009 08:02:19.968487 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rxhd5"
Oct 09 08:02:19 crc kubenswrapper[4715]: I1009 08:02:19.973337 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-rxhd5"]
Oct 09 08:02:19 crc kubenswrapper[4715]: I1009 08:02:19.999783 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmn4b\" (UniqueName: \"kubernetes.io/projected/b4186809-cc3c-4cfd-a6f2-4888990a3251-kube-api-access-rmn4b\") pod \"keystone-db-create-cqpnw\" (UID: \"b4186809-cc3c-4cfd-a6f2-4888990a3251\") " pod="openstack/keystone-db-create-cqpnw"
Oct 09 08:02:20 crc kubenswrapper[4715]: I1009 08:02:20.021103 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmn4b\" (UniqueName: \"kubernetes.io/projected/b4186809-cc3c-4cfd-a6f2-4888990a3251-kube-api-access-rmn4b\") pod \"keystone-db-create-cqpnw\" (UID: \"b4186809-cc3c-4cfd-a6f2-4888990a3251\") " pod="openstack/keystone-db-create-cqpnw"
Oct 09 08:02:20 crc kubenswrapper[4715]: I1009 08:02:20.027926 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-cqpnw"
Oct 09 08:02:20 crc kubenswrapper[4715]: I1009 08:02:20.101580 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcq4d\" (UniqueName: \"kubernetes.io/projected/b82dbc30-e8e0-4256-8af6-6536cf1c07f5-kube-api-access-rcq4d\") pod \"placement-db-create-rxhd5\" (UID: \"b82dbc30-e8e0-4256-8af6-6536cf1c07f5\") " pod="openstack/placement-db-create-rxhd5"
Oct 09 08:02:20 crc kubenswrapper[4715]: I1009 08:02:20.202648 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcq4d\" (UniqueName: \"kubernetes.io/projected/b82dbc30-e8e0-4256-8af6-6536cf1c07f5-kube-api-access-rcq4d\") pod \"placement-db-create-rxhd5\" (UID: \"b82dbc30-e8e0-4256-8af6-6536cf1c07f5\") " pod="openstack/placement-db-create-rxhd5"
Oct 09 08:02:20 crc kubenswrapper[4715]: I1009 08:02:20.219651 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcq4d\" (UniqueName: \"kubernetes.io/projected/b82dbc30-e8e0-4256-8af6-6536cf1c07f5-kube-api-access-rcq4d\") pod \"placement-db-create-rxhd5\" (UID: \"b82dbc30-e8e0-4256-8af6-6536cf1c07f5\") " pod="openstack/placement-db-create-rxhd5"
Oct 09 08:02:20 crc kubenswrapper[4715]: I1009 08:02:20.287732 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rxhd5"
Oct 09 08:02:20 crc kubenswrapper[4715]: I1009 08:02:20.816611 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d6a5f2b-d77d-41c9-8b7d-e2e62c157577-etc-swift\") pod \"swift-storage-0\" (UID: \"4d6a5f2b-d77d-41c9-8b7d-e2e62c157577\") " pod="openstack/swift-storage-0"
Oct 09 08:02:20 crc kubenswrapper[4715]: E1009 08:02:20.816872 4715 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Oct 09 08:02:20 crc kubenswrapper[4715]: E1009 08:02:20.816904 4715 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Oct 09 08:02:20 crc kubenswrapper[4715]: E1009 08:02:20.816976 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4d6a5f2b-d77d-41c9-8b7d-e2e62c157577-etc-swift podName:4d6a5f2b-d77d-41c9-8b7d-e2e62c157577 nodeName:}" failed. No retries permitted until 2025-10-09 08:02:28.816952632 +0000 UTC m=+979.509756640 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4d6a5f2b-d77d-41c9-8b7d-e2e62c157577-etc-swift") pod "swift-storage-0" (UID: "4d6a5f2b-d77d-41c9-8b7d-e2e62c157577") : configmap "swift-ring-files" not found
Oct 09 08:02:21 crc kubenswrapper[4715]: I1009 08:02:21.265059 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-rxhd5"]
Oct 09 08:02:21 crc kubenswrapper[4715]: W1009 08:02:21.266116 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb82dbc30_e8e0_4256_8af6_6536cf1c07f5.slice/crio-2cd09107e88fbe4ec624bab548d8c34009844e5530ace31e94cea0d65b667a52 WatchSource:0}: Error finding container 2cd09107e88fbe4ec624bab548d8c34009844e5530ace31e94cea0d65b667a52: Status 404 returned error can't find the container with id 2cd09107e88fbe4ec624bab548d8c34009844e5530ace31e94cea0d65b667a52
Oct 09 08:02:21 crc kubenswrapper[4715]: I1009 08:02:21.364062 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-cqpnw"]
Oct 09 08:02:21 crc kubenswrapper[4715]: W1009 08:02:21.367366 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4186809_cc3c_4cfd_a6f2_4888990a3251.slice/crio-d5fe1b7e34ba17014d28cbee25d2be5464c80a861934008e15f1ba9b2bac6e18 WatchSource:0}: Error finding container d5fe1b7e34ba17014d28cbee25d2be5464c80a861934008e15f1ba9b2bac6e18: Status 404 returned error can't find the container with id d5fe1b7e34ba17014d28cbee25d2be5464c80a861934008e15f1ba9b2bac6e18
Oct 09 08:02:21 crc kubenswrapper[4715]: I1009 08:02:21.509202 4715 generic.go:334] "Generic (PLEG): container finished" podID="b82dbc30-e8e0-4256-8af6-6536cf1c07f5" containerID="71f46bcdfeacfda0ed8880092c38bea27ab966f54738066f19ec59c10b218197" exitCode=0
Oct 09 08:02:21 crc kubenswrapper[4715]: I1009 08:02:21.509560 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rxhd5" event={"ID":"b82dbc30-e8e0-4256-8af6-6536cf1c07f5","Type":"ContainerDied","Data":"71f46bcdfeacfda0ed8880092c38bea27ab966f54738066f19ec59c10b218197"}
Oct 09 08:02:21 crc kubenswrapper[4715]: I1009 08:02:21.509690 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rxhd5" event={"ID":"b82dbc30-e8e0-4256-8af6-6536cf1c07f5","Type":"ContainerStarted","Data":"2cd09107e88fbe4ec624bab548d8c34009844e5530ace31e94cea0d65b667a52"}
Oct 09 08:02:21 crc kubenswrapper[4715]: I1009 08:02:21.517230 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-gjkhs" event={"ID":"377b8455-bb97-4be8-977a-191578be267c","Type":"ContainerStarted","Data":"79d3e13c7375e0409cf91182a2b9370200e9c211a794313ed4d5b2c47c4b5c69"}
Oct 09 08:02:21 crc kubenswrapper[4715]: I1009 08:02:21.524288 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-cqpnw" event={"ID":"b4186809-cc3c-4cfd-a6f2-4888990a3251","Type":"ContainerStarted","Data":"66a2e41bcbdbff0126fb609b0fcbb414414310e6985549a5b39618c7b777cf81"}
Oct 09 08:02:21 crc kubenswrapper[4715]: I1009 08:02:21.524357 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-cqpnw" event={"ID":"b4186809-cc3c-4cfd-a6f2-4888990a3251","Type":"ContainerStarted","Data":"d5fe1b7e34ba17014d28cbee25d2be5464c80a861934008e15f1ba9b2bac6e18"}
Oct 09 08:02:21 crc kubenswrapper[4715]: I1009 08:02:21.550537 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-gjkhs" podStartSLOduration=1.600671079 podStartE2EDuration="8.550514406s" podCreationTimestamp="2025-10-09 08:02:13 +0000 UTC" firstStartedPulling="2025-10-09 08:02:13.972643403 +0000 UTC m=+964.665447411" lastFinishedPulling="2025-10-09 08:02:20.92248673 +0000 UTC m=+971.615290738" observedRunningTime="2025-10-09 08:02:21.543359397 +0000 UTC m=+972.236163405" watchObservedRunningTime="2025-10-09 08:02:21.550514406 +0000 UTC m=+972.243318414"
Oct 09 08:02:21 crc kubenswrapper[4715]: I1009 08:02:21.558588 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-cqpnw" podStartSLOduration=2.558568561 podStartE2EDuration="2.558568561s" podCreationTimestamp="2025-10-09 08:02:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:02:21.557703606 +0000 UTC m=+972.250507614" watchObservedRunningTime="2025-10-09 08:02:21.558568561 +0000 UTC m=+972.251372559"
Oct 09 08:02:22 crc kubenswrapper[4715]: I1009 08:02:22.304710 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-bwkng"
Oct 09 08:02:22 crc kubenswrapper[4715]: I1009 08:02:22.367955 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rcs9s"]
Oct 09 08:02:22 crc kubenswrapper[4715]: I1009 08:02:22.368584 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-rcs9s" podUID="5c8600a7-d97c-4baa-9aec-1e3762af0e69" containerName="dnsmasq-dns" containerID="cri-o://6a23038758906d162356892a84a69088c8648643b34167e4039038ec52f25b6b" gracePeriod=10
Oct 09 08:02:22 crc kubenswrapper[4715]: I1009 08:02:22.535751 4715 generic.go:334] "Generic (PLEG): container finished" podID="5c8600a7-d97c-4baa-9aec-1e3762af0e69" containerID="6a23038758906d162356892a84a69088c8648643b34167e4039038ec52f25b6b" exitCode=0
Oct 09 08:02:22 crc kubenswrapper[4715]: I1009 08:02:22.535845 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-rcs9s" event={"ID":"5c8600a7-d97c-4baa-9aec-1e3762af0e69","Type":"ContainerDied","Data":"6a23038758906d162356892a84a69088c8648643b34167e4039038ec52f25b6b"}
Oct 09 08:02:22 crc kubenswrapper[4715]: I1009 08:02:22.538048 4715 generic.go:334] "Generic (PLEG): container finished" podID="b4186809-cc3c-4cfd-a6f2-4888990a3251" containerID="66a2e41bcbdbff0126fb609b0fcbb414414310e6985549a5b39618c7b777cf81" exitCode=0
Oct 09 08:02:22 crc kubenswrapper[4715]: I1009 08:02:22.538157 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-cqpnw" event={"ID":"b4186809-cc3c-4cfd-a6f2-4888990a3251","Type":"ContainerDied","Data":"66a2e41bcbdbff0126fb609b0fcbb414414310e6985549a5b39618c7b777cf81"}
Oct 09 08:02:22 crc kubenswrapper[4715]: I1009 08:02:22.865049 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-rcs9s"
Oct 09 08:02:22 crc kubenswrapper[4715]: I1009 08:02:22.961261 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c8600a7-d97c-4baa-9aec-1e3762af0e69-config\") pod \"5c8600a7-d97c-4baa-9aec-1e3762af0e69\" (UID: \"5c8600a7-d97c-4baa-9aec-1e3762af0e69\") "
Oct 09 08:02:22 crc kubenswrapper[4715]: I1009 08:02:22.961357 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msvll\" (UniqueName: \"kubernetes.io/projected/5c8600a7-d97c-4baa-9aec-1e3762af0e69-kube-api-access-msvll\") pod \"5c8600a7-d97c-4baa-9aec-1e3762af0e69\" (UID: \"5c8600a7-d97c-4baa-9aec-1e3762af0e69\") "
Oct 09 08:02:22 crc kubenswrapper[4715]: I1009 08:02:22.961496 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c8600a7-d97c-4baa-9aec-1e3762af0e69-dns-svc\") pod \"5c8600a7-d97c-4baa-9aec-1e3762af0e69\" (UID: \"5c8600a7-d97c-4baa-9aec-1e3762af0e69\") "
Oct 09 08:02:22 crc kubenswrapper[4715]: I1009 08:02:22.962065 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rxhd5"
Oct 09 08:02:22 crc kubenswrapper[4715]: I1009 08:02:22.967091 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c8600a7-d97c-4baa-9aec-1e3762af0e69-kube-api-access-msvll" (OuterVolumeSpecName: "kube-api-access-msvll") pod "5c8600a7-d97c-4baa-9aec-1e3762af0e69" (UID: "5c8600a7-d97c-4baa-9aec-1e3762af0e69"). InnerVolumeSpecName "kube-api-access-msvll". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 08:02:23 crc kubenswrapper[4715]: I1009 08:02:23.009839 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c8600a7-d97c-4baa-9aec-1e3762af0e69-config" (OuterVolumeSpecName: "config") pod "5c8600a7-d97c-4baa-9aec-1e3762af0e69" (UID: "5c8600a7-d97c-4baa-9aec-1e3762af0e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 08:02:23 crc kubenswrapper[4715]: I1009 08:02:23.011016 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c8600a7-d97c-4baa-9aec-1e3762af0e69-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5c8600a7-d97c-4baa-9aec-1e3762af0e69" (UID: "5c8600a7-d97c-4baa-9aec-1e3762af0e69"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 08:02:23 crc kubenswrapper[4715]: I1009 08:02:23.065860 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcq4d\" (UniqueName: \"kubernetes.io/projected/b82dbc30-e8e0-4256-8af6-6536cf1c07f5-kube-api-access-rcq4d\") pod \"b82dbc30-e8e0-4256-8af6-6536cf1c07f5\" (UID: \"b82dbc30-e8e0-4256-8af6-6536cf1c07f5\") "
Oct 09 08:02:23 crc kubenswrapper[4715]: I1009 08:02:23.066587 4715 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c8600a7-d97c-4baa-9aec-1e3762af0e69-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 09 08:02:23 crc kubenswrapper[4715]: I1009 08:02:23.066619 4715 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c8600a7-d97c-4baa-9aec-1e3762af0e69-config\") on node \"crc\" DevicePath \"\""
Oct 09 08:02:23 crc kubenswrapper[4715]: I1009 08:02:23.066690 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msvll\" (UniqueName: \"kubernetes.io/projected/5c8600a7-d97c-4baa-9aec-1e3762af0e69-kube-api-access-msvll\") on node \"crc\" DevicePath \"\""
Oct 09 08:02:23 crc kubenswrapper[4715]: I1009 08:02:23.071193 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b82dbc30-e8e0-4256-8af6-6536cf1c07f5-kube-api-access-rcq4d" (OuterVolumeSpecName: "kube-api-access-rcq4d") pod "b82dbc30-e8e0-4256-8af6-6536cf1c07f5" (UID: "b82dbc30-e8e0-4256-8af6-6536cf1c07f5"). InnerVolumeSpecName "kube-api-access-rcq4d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 08:02:23 crc kubenswrapper[4715]: I1009 08:02:23.169235 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcq4d\" (UniqueName: \"kubernetes.io/projected/b82dbc30-e8e0-4256-8af6-6536cf1c07f5-kube-api-access-rcq4d\") on node \"crc\" DevicePath \"\""
Oct 09 08:02:23 crc kubenswrapper[4715]: I1009 08:02:23.558194 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-rcs9s" event={"ID":"5c8600a7-d97c-4baa-9aec-1e3762af0e69","Type":"ContainerDied","Data":"99a7b0c5116c3d6b1e2a94dd43561db276080844baddfe3175323fafdf04dc7f"}
Oct 09 08:02:23 crc kubenswrapper[4715]: I1009 08:02:23.558616 4715 scope.go:117] "RemoveContainer" containerID="6a23038758906d162356892a84a69088c8648643b34167e4039038ec52f25b6b"
Oct 09 08:02:23 crc kubenswrapper[4715]: I1009 08:02:23.558253 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-rcs9s"
Oct 09 08:02:23 crc kubenswrapper[4715]: I1009 08:02:23.561650 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rxhd5"
Oct 09 08:02:23 crc kubenswrapper[4715]: I1009 08:02:23.561820 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rxhd5" event={"ID":"b82dbc30-e8e0-4256-8af6-6536cf1c07f5","Type":"ContainerDied","Data":"2cd09107e88fbe4ec624bab548d8c34009844e5530ace31e94cea0d65b667a52"}
Oct 09 08:02:23 crc kubenswrapper[4715]: I1009 08:02:23.561876 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cd09107e88fbe4ec624bab548d8c34009844e5530ace31e94cea0d65b667a52"
Oct 09 08:02:23 crc kubenswrapper[4715]: I1009 08:02:23.600862 4715 scope.go:117] "RemoveContainer" containerID="85d7225914fd7282cda1683bb3a64aab666706db7d7f3dd28c9da8d576346693"
Oct 09 08:02:23 crc kubenswrapper[4715]: I1009 08:02:23.616473 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rcs9s"]
Oct 09 08:02:23 crc kubenswrapper[4715]: I1009 08:02:23.624382 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rcs9s"]
Oct 09 08:02:24 crc kubenswrapper[4715]: I1009 08:02:24.034549 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-cqpnw"
Oct 09 08:02:24 crc kubenswrapper[4715]: I1009 08:02:24.149199 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c8600a7-d97c-4baa-9aec-1e3762af0e69" path="/var/lib/kubelet/pods/5c8600a7-d97c-4baa-9aec-1e3762af0e69/volumes"
Oct 09 08:02:24 crc kubenswrapper[4715]: I1009 08:02:24.190574 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmn4b\" (UniqueName: \"kubernetes.io/projected/b4186809-cc3c-4cfd-a6f2-4888990a3251-kube-api-access-rmn4b\") pod \"b4186809-cc3c-4cfd-a6f2-4888990a3251\" (UID: \"b4186809-cc3c-4cfd-a6f2-4888990a3251\") "
Oct 09 08:02:24 crc kubenswrapper[4715]: I1009 08:02:24.203582 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4186809-cc3c-4cfd-a6f2-4888990a3251-kube-api-access-rmn4b" (OuterVolumeSpecName: "kube-api-access-rmn4b") pod "b4186809-cc3c-4cfd-a6f2-4888990a3251" (UID: "b4186809-cc3c-4cfd-a6f2-4888990a3251"). InnerVolumeSpecName "kube-api-access-rmn4b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 08:02:24 crc kubenswrapper[4715]: I1009 08:02:24.292709 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmn4b\" (UniqueName: \"kubernetes.io/projected/b4186809-cc3c-4cfd-a6f2-4888990a3251-kube-api-access-rmn4b\") on node \"crc\" DevicePath \"\""
Oct 09 08:02:24 crc kubenswrapper[4715]: I1009 08:02:24.572392 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-cqpnw" event={"ID":"b4186809-cc3c-4cfd-a6f2-4888990a3251","Type":"ContainerDied","Data":"d5fe1b7e34ba17014d28cbee25d2be5464c80a861934008e15f1ba9b2bac6e18"}
Oct 09 08:02:24 crc kubenswrapper[4715]: I1009 08:02:24.572452 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5fe1b7e34ba17014d28cbee25d2be5464c80a861934008e15f1ba9b2bac6e18"
Oct 09 08:02:24 crc kubenswrapper[4715]: I1009 08:02:24.572513 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-cqpnw"
Oct 09 08:02:25 crc kubenswrapper[4715]: I1009 08:02:25.220087 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-5gv9h"]
Oct 09 08:02:25 crc kubenswrapper[4715]: E1009 08:02:25.220496 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b82dbc30-e8e0-4256-8af6-6536cf1c07f5" containerName="mariadb-database-create"
Oct 09 08:02:25 crc kubenswrapper[4715]: I1009 08:02:25.220520 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="b82dbc30-e8e0-4256-8af6-6536cf1c07f5" containerName="mariadb-database-create"
Oct 09 08:02:25 crc kubenswrapper[4715]: E1009 08:02:25.220532 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c8600a7-d97c-4baa-9aec-1e3762af0e69" containerName="dnsmasq-dns"
Oct 09 08:02:25 crc kubenswrapper[4715]: I1009 08:02:25.220539 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c8600a7-d97c-4baa-9aec-1e3762af0e69" containerName="dnsmasq-dns"
Oct 09 08:02:25 crc kubenswrapper[4715]: E1009 08:02:25.220558 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c8600a7-d97c-4baa-9aec-1e3762af0e69" containerName="init"
Oct 09 08:02:25 crc kubenswrapper[4715]: I1009 08:02:25.220567 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c8600a7-d97c-4baa-9aec-1e3762af0e69" containerName="init"
Oct 09 08:02:25 crc kubenswrapper[4715]: E1009 08:02:25.220586 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4186809-cc3c-4cfd-a6f2-4888990a3251" containerName="mariadb-database-create"
Oct 09 08:02:25 crc kubenswrapper[4715]: I1009 08:02:25.220593 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4186809-cc3c-4cfd-a6f2-4888990a3251" containerName="mariadb-database-create"
Oct 09 08:02:25 crc kubenswrapper[4715]: I1009 08:02:25.220787 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c8600a7-d97c-4baa-9aec-1e3762af0e69" containerName="dnsmasq-dns"
Oct 09 08:02:25 crc kubenswrapper[4715]: I1009 08:02:25.220803 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="b82dbc30-e8e0-4256-8af6-6536cf1c07f5" containerName="mariadb-database-create"
Oct 09 08:02:25 crc kubenswrapper[4715]: I1009 08:02:25.220810 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4186809-cc3c-4cfd-a6f2-4888990a3251" containerName="mariadb-database-create"
Oct 09 08:02:25 crc kubenswrapper[4715]: I1009 08:02:25.221452 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-5gv9h"
Oct 09 08:02:25 crc kubenswrapper[4715]: I1009 08:02:25.242044 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-5gv9h"]
Oct 09 08:02:25 crc kubenswrapper[4715]: I1009 08:02:25.412960 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg6tl\" (UniqueName: \"kubernetes.io/projected/79e51441-6fc1-4841-849a-d17051e1769e-kube-api-access-xg6tl\") pod \"glance-db-create-5gv9h\" (UID: \"79e51441-6fc1-4841-849a-d17051e1769e\") " pod="openstack/glance-db-create-5gv9h"
Oct 09 08:02:25 crc kubenswrapper[4715]: I1009 08:02:25.514283 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg6tl\" (UniqueName: \"kubernetes.io/projected/79e51441-6fc1-4841-849a-d17051e1769e-kube-api-access-xg6tl\") pod \"glance-db-create-5gv9h\" (UID: \"79e51441-6fc1-4841-849a-d17051e1769e\") " pod="openstack/glance-db-create-5gv9h"
Oct 09 08:02:25 crc kubenswrapper[4715]: I1009 08:02:25.539345 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg6tl\" (UniqueName: \"kubernetes.io/projected/79e51441-6fc1-4841-849a-d17051e1769e-kube-api-access-xg6tl\") pod \"glance-db-create-5gv9h\" (UID: \"79e51441-6fc1-4841-849a-d17051e1769e\") " pod="openstack/glance-db-create-5gv9h"
Oct 09 08:02:25 crc kubenswrapper[4715]: I1009 08:02:25.543160 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-5gv9h"
Oct 09 08:02:25 crc kubenswrapper[4715]: W1009 08:02:25.990010 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79e51441_6fc1_4841_849a_d17051e1769e.slice/crio-8f1d82938138aa31c5e05a4a2eff338aa5d1acd21690bdf0b152ebeaab1f2c45 WatchSource:0}: Error finding container 8f1d82938138aa31c5e05a4a2eff338aa5d1acd21690bdf0b152ebeaab1f2c45: Status 404 returned error can't find the container with id 8f1d82938138aa31c5e05a4a2eff338aa5d1acd21690bdf0b152ebeaab1f2c45
Oct 09 08:02:25 crc kubenswrapper[4715]: I1009 08:02:25.999587 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-5gv9h"]
Oct 09 08:02:26 crc kubenswrapper[4715]: I1009 08:02:26.589995 4715 generic.go:334] "Generic (PLEG): container finished" podID="79e51441-6fc1-4841-849a-d17051e1769e" containerID="a077b2411b96192943e7fd4320338b09432b45529a2a7c45615d709a98ed5295" exitCode=0
Oct 09 08:02:26 crc kubenswrapper[4715]: I1009 08:02:26.590048 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-5gv9h" event={"ID":"79e51441-6fc1-4841-849a-d17051e1769e","Type":"ContainerDied","Data":"a077b2411b96192943e7fd4320338b09432b45529a2a7c45615d709a98ed5295"}
Oct 09 08:02:26 crc kubenswrapper[4715]: I1009 08:02:26.590079 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-5gv9h" event={"ID":"79e51441-6fc1-4841-849a-d17051e1769e","Type":"ContainerStarted","Data":"8f1d82938138aa31c5e05a4a2eff338aa5d1acd21690bdf0b152ebeaab1f2c45"}
Oct 09 08:02:27 crc kubenswrapper[4715]: I1009 08:02:27.980874 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-5gv9h"
Oct 09 08:02:28 crc kubenswrapper[4715]: I1009 08:02:28.158596 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xg6tl\" (UniqueName: \"kubernetes.io/projected/79e51441-6fc1-4841-849a-d17051e1769e-kube-api-access-xg6tl\") pod \"79e51441-6fc1-4841-849a-d17051e1769e\" (UID: \"79e51441-6fc1-4841-849a-d17051e1769e\") "
Oct 09 08:02:28 crc kubenswrapper[4715]: I1009 08:02:28.166415 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79e51441-6fc1-4841-849a-d17051e1769e-kube-api-access-xg6tl" (OuterVolumeSpecName: "kube-api-access-xg6tl") pod "79e51441-6fc1-4841-849a-d17051e1769e" (UID: "79e51441-6fc1-4841-849a-d17051e1769e"). InnerVolumeSpecName "kube-api-access-xg6tl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 08:02:28 crc kubenswrapper[4715]: I1009 08:02:28.261339 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xg6tl\" (UniqueName: \"kubernetes.io/projected/79e51441-6fc1-4841-849a-d17051e1769e-kube-api-access-xg6tl\") on node \"crc\" DevicePath \"\""
Oct 09 08:02:28 crc kubenswrapper[4715]: I1009 08:02:28.607449 4715 generic.go:334] "Generic (PLEG): container finished" podID="377b8455-bb97-4be8-977a-191578be267c" containerID="79d3e13c7375e0409cf91182a2b9370200e9c211a794313ed4d5b2c47c4b5c69" exitCode=0
Oct 09 08:02:28 crc kubenswrapper[4715]: I1009 08:02:28.607563 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-gjkhs" event={"ID":"377b8455-bb97-4be8-977a-191578be267c","Type":"ContainerDied","Data":"79d3e13c7375e0409cf91182a2b9370200e9c211a794313ed4d5b2c47c4b5c69"}
Oct 09 08:02:28 crc kubenswrapper[4715]: I1009 08:02:28.609837 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-5gv9h" event={"ID":"79e51441-6fc1-4841-849a-d17051e1769e","Type":"ContainerDied","Data":"8f1d82938138aa31c5e05a4a2eff338aa5d1acd21690bdf0b152ebeaab1f2c45"}
Oct 09 08:02:28 crc kubenswrapper[4715]: I1009 08:02:28.609864 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f1d82938138aa31c5e05a4a2eff338aa5d1acd21690bdf0b152ebeaab1f2c45"
Oct 09 08:02:28 crc kubenswrapper[4715]: I1009 08:02:28.609917 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-5gv9h"
Oct 09 08:02:28 crc kubenswrapper[4715]: I1009 08:02:28.870710 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d6a5f2b-d77d-41c9-8b7d-e2e62c157577-etc-swift\") pod \"swift-storage-0\" (UID: \"4d6a5f2b-d77d-41c9-8b7d-e2e62c157577\") " pod="openstack/swift-storage-0"
Oct 09 08:02:28 crc kubenswrapper[4715]: I1009 08:02:28.878628 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d6a5f2b-d77d-41c9-8b7d-e2e62c157577-etc-swift\") pod \"swift-storage-0\" (UID: \"4d6a5f2b-d77d-41c9-8b7d-e2e62c157577\") " pod="openstack/swift-storage-0"
Oct 09 08:02:28 crc kubenswrapper[4715]: I1009 08:02:28.980494 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Oct 09 08:02:29 crc kubenswrapper[4715]: I1009 08:02:29.512316 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Oct 09 08:02:29 crc kubenswrapper[4715]: I1009 08:02:29.617840 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d6a5f2b-d77d-41c9-8b7d-e2e62c157577","Type":"ContainerStarted","Data":"1826e444d03a2d8fc0cd8bc597ddf794ca2efe36ef59ca98e106ee1a083df0c7"}
Oct 09 08:02:29 crc kubenswrapper[4715]: I1009 08:02:29.736816 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-4ed7-account-create-s26dw"]
Oct 09 08:02:29 crc kubenswrapper[4715]: E1009 08:02:29.737391 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79e51441-6fc1-4841-849a-d17051e1769e" containerName="mariadb-database-create"
Oct 09 08:02:29 crc kubenswrapper[4715]: I1009 08:02:29.737410 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="79e51441-6fc1-4841-849a-d17051e1769e" containerName="mariadb-database-create"
Oct 09 08:02:29 crc kubenswrapper[4715]: I1009 08:02:29.737700 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="79e51441-6fc1-4841-849a-d17051e1769e" containerName="mariadb-database-create"
Oct 09 08:02:29 crc kubenswrapper[4715]: I1009 08:02:29.738318 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4ed7-account-create-s26dw"
Oct 09 08:02:29 crc kubenswrapper[4715]: I1009 08:02:29.740840 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Oct 09 08:02:29 crc kubenswrapper[4715]: I1009 08:02:29.743706 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4ed7-account-create-s26dw"]
Oct 09 08:02:29 crc kubenswrapper[4715]: I1009 08:02:29.887934 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxj6f\" (UniqueName: \"kubernetes.io/projected/6da55d5b-0f99-4b96-9e08-628c1961d8e8-kube-api-access-jxj6f\") pod \"keystone-4ed7-account-create-s26dw\" (UID: \"6da55d5b-0f99-4b96-9e08-628c1961d8e8\") " pod="openstack/keystone-4ed7-account-create-s26dw"
Oct 09 08:02:29 crc kubenswrapper[4715]: I1009 08:02:29.909994 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-gjkhs"
Oct 09 08:02:29 crc kubenswrapper[4715]: I1009 08:02:29.990068 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxj6f\" (UniqueName: \"kubernetes.io/projected/6da55d5b-0f99-4b96-9e08-628c1961d8e8-kube-api-access-jxj6f\") pod \"keystone-4ed7-account-create-s26dw\" (UID: \"6da55d5b-0f99-4b96-9e08-628c1961d8e8\") " pod="openstack/keystone-4ed7-account-create-s26dw"
Oct 09 08:02:29 crc kubenswrapper[4715]: I1009 08:02:29.993462 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Oct 09 08:02:30 crc kubenswrapper[4715]: I1009 08:02:30.009822 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxj6f\" (UniqueName: \"kubernetes.io/projected/6da55d5b-0f99-4b96-9e08-628c1961d8e8-kube-api-access-jxj6f\") pod \"keystone-4ed7-account-create-s26dw\" (UID: \"6da55d5b-0f99-4b96-9e08-628c1961d8e8\") " pod="openstack/keystone-4ed7-account-create-s26dw"
Oct 09 08:02:30 crc kubenswrapper[4715]: I1009 08:02:30.059656 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4ed7-account-create-s26dw"
Oct 09 08:02:30 crc kubenswrapper[4715]: I1009 08:02:30.091577 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/377b8455-bb97-4be8-977a-191578be267c-combined-ca-bundle\") pod \"377b8455-bb97-4be8-977a-191578be267c\" (UID: \"377b8455-bb97-4be8-977a-191578be267c\") "
Oct 09 08:02:30 crc kubenswrapper[4715]: I1009 08:02:30.091633 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/377b8455-bb97-4be8-977a-191578be267c-ring-data-devices\") pod \"377b8455-bb97-4be8-977a-191578be267c\" (UID: \"377b8455-bb97-4be8-977a-191578be267c\") "
Oct 09 08:02:30 crc kubenswrapper[4715]: I1009 08:02:30.091730 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/377b8455-bb97-4be8-977a-191578be267c-dispersionconf\") pod \"377b8455-bb97-4be8-977a-191578be267c\" (UID: \"377b8455-bb97-4be8-977a-191578be267c\") "
Oct 09 08:02:30 crc kubenswrapper[4715]: I1009 08:02:30.091768 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/377b8455-bb97-4be8-977a-191578be267c-etc-swift\") pod \"377b8455-bb97-4be8-977a-191578be267c\" (UID: \"377b8455-bb97-4be8-977a-191578be267c\") "
Oct 09 08:02:30 crc kubenswrapper[4715]: I1009 08:02:30.091788 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/377b8455-bb97-4be8-977a-191578be267c-scripts\") pod \"377b8455-bb97-4be8-977a-191578be267c\" (UID: \"377b8455-bb97-4be8-977a-191578be267c\") "
Oct
09 08:02:30 crc kubenswrapper[4715]: I1009 08:02:30.091845 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kccqv\" (UniqueName: \"kubernetes.io/projected/377b8455-bb97-4be8-977a-191578be267c-kube-api-access-kccqv\") pod \"377b8455-bb97-4be8-977a-191578be267c\" (UID: \"377b8455-bb97-4be8-977a-191578be267c\") " Oct 09 08:02:30 crc kubenswrapper[4715]: I1009 08:02:30.091920 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/377b8455-bb97-4be8-977a-191578be267c-swiftconf\") pod \"377b8455-bb97-4be8-977a-191578be267c\" (UID: \"377b8455-bb97-4be8-977a-191578be267c\") " Oct 09 08:02:30 crc kubenswrapper[4715]: I1009 08:02:30.099067 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/377b8455-bb97-4be8-977a-191578be267c-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "377b8455-bb97-4be8-977a-191578be267c" (UID: "377b8455-bb97-4be8-977a-191578be267c"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:02:30 crc kubenswrapper[4715]: I1009 08:02:30.102811 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/377b8455-bb97-4be8-977a-191578be267c-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "377b8455-bb97-4be8-977a-191578be267c" (UID: "377b8455-bb97-4be8-977a-191578be267c"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:02:30 crc kubenswrapper[4715]: I1009 08:02:30.103295 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/377b8455-bb97-4be8-977a-191578be267c-kube-api-access-kccqv" (OuterVolumeSpecName: "kube-api-access-kccqv") pod "377b8455-bb97-4be8-977a-191578be267c" (UID: "377b8455-bb97-4be8-977a-191578be267c"). InnerVolumeSpecName "kube-api-access-kccqv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:02:30 crc kubenswrapper[4715]: I1009 08:02:30.133667 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/377b8455-bb97-4be8-977a-191578be267c-scripts" (OuterVolumeSpecName: "scripts") pod "377b8455-bb97-4be8-977a-191578be267c" (UID: "377b8455-bb97-4be8-977a-191578be267c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:02:30 crc kubenswrapper[4715]: I1009 08:02:30.141057 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/377b8455-bb97-4be8-977a-191578be267c-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "377b8455-bb97-4be8-977a-191578be267c" (UID: "377b8455-bb97-4be8-977a-191578be267c"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:02:30 crc kubenswrapper[4715]: I1009 08:02:30.143698 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/377b8455-bb97-4be8-977a-191578be267c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "377b8455-bb97-4be8-977a-191578be267c" (UID: "377b8455-bb97-4be8-977a-191578be267c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:02:30 crc kubenswrapper[4715]: I1009 08:02:30.174585 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/377b8455-bb97-4be8-977a-191578be267c-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "377b8455-bb97-4be8-977a-191578be267c" (UID: "377b8455-bb97-4be8-977a-191578be267c"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:02:30 crc kubenswrapper[4715]: I1009 08:02:30.190969 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5432-account-create-9gr67"] Oct 09 08:02:30 crc kubenswrapper[4715]: E1009 08:02:30.192231 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="377b8455-bb97-4be8-977a-191578be267c" containerName="swift-ring-rebalance" Oct 09 08:02:30 crc kubenswrapper[4715]: I1009 08:02:30.192284 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="377b8455-bb97-4be8-977a-191578be267c" containerName="swift-ring-rebalance" Oct 09 08:02:30 crc kubenswrapper[4715]: I1009 08:02:30.193660 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="377b8455-bb97-4be8-977a-191578be267c" containerName="swift-ring-rebalance" Oct 09 08:02:30 crc kubenswrapper[4715]: I1009 08:02:30.195237 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5432-account-create-9gr67"] Oct 09 08:02:30 crc kubenswrapper[4715]: I1009 08:02:30.199949 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5432-account-create-9gr67" Oct 09 08:02:30 crc kubenswrapper[4715]: I1009 08:02:30.203031 4715 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/377b8455-bb97-4be8-977a-191578be267c-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 09 08:02:30 crc kubenswrapper[4715]: I1009 08:02:30.203059 4715 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/377b8455-bb97-4be8-977a-191578be267c-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 09 08:02:30 crc kubenswrapper[4715]: I1009 08:02:30.203071 4715 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/377b8455-bb97-4be8-977a-191578be267c-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 08:02:30 crc kubenswrapper[4715]: I1009 08:02:30.203086 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kccqv\" (UniqueName: \"kubernetes.io/projected/377b8455-bb97-4be8-977a-191578be267c-kube-api-access-kccqv\") on node \"crc\" DevicePath \"\"" Oct 09 08:02:30 crc kubenswrapper[4715]: I1009 08:02:30.203117 4715 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/377b8455-bb97-4be8-977a-191578be267c-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 09 08:02:30 crc kubenswrapper[4715]: I1009 08:02:30.203126 4715 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/377b8455-bb97-4be8-977a-191578be267c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:02:30 crc kubenswrapper[4715]: I1009 08:02:30.203135 4715 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/377b8455-bb97-4be8-977a-191578be267c-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 09 08:02:30 crc kubenswrapper[4715]: I1009 
08:02:30.212279 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 09 08:02:30 crc kubenswrapper[4715]: I1009 08:02:30.305321 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfghv\" (UniqueName: \"kubernetes.io/projected/f0c78197-391b-4ebc-bc19-5bd09c64f99c-kube-api-access-xfghv\") pod \"placement-5432-account-create-9gr67\" (UID: \"f0c78197-391b-4ebc-bc19-5bd09c64f99c\") " pod="openstack/placement-5432-account-create-9gr67" Oct 09 08:02:30 crc kubenswrapper[4715]: I1009 08:02:30.408173 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfghv\" (UniqueName: \"kubernetes.io/projected/f0c78197-391b-4ebc-bc19-5bd09c64f99c-kube-api-access-xfghv\") pod \"placement-5432-account-create-9gr67\" (UID: \"f0c78197-391b-4ebc-bc19-5bd09c64f99c\") " pod="openstack/placement-5432-account-create-9gr67" Oct 09 08:02:30 crc kubenswrapper[4715]: I1009 08:02:30.427521 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfghv\" (UniqueName: \"kubernetes.io/projected/f0c78197-391b-4ebc-bc19-5bd09c64f99c-kube-api-access-xfghv\") pod \"placement-5432-account-create-9gr67\" (UID: \"f0c78197-391b-4ebc-bc19-5bd09c64f99c\") " pod="openstack/placement-5432-account-create-9gr67" Oct 09 08:02:30 crc kubenswrapper[4715]: I1009 08:02:30.544311 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5432-account-create-9gr67" Oct 09 08:02:30 crc kubenswrapper[4715]: I1009 08:02:30.552228 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4ed7-account-create-s26dw"] Oct 09 08:02:30 crc kubenswrapper[4715]: W1009 08:02:30.565286 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6da55d5b_0f99_4b96_9e08_628c1961d8e8.slice/crio-f32d95a1ea583f56e59754cf3ab7025d694e09cbbdfab33ef2a55adfc29a9a46 WatchSource:0}: Error finding container f32d95a1ea583f56e59754cf3ab7025d694e09cbbdfab33ef2a55adfc29a9a46: Status 404 returned error can't find the container with id f32d95a1ea583f56e59754cf3ab7025d694e09cbbdfab33ef2a55adfc29a9a46 Oct 09 08:02:30 crc kubenswrapper[4715]: I1009 08:02:30.626795 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-gjkhs" event={"ID":"377b8455-bb97-4be8-977a-191578be267c","Type":"ContainerDied","Data":"a2734577f580fe7e33c1ca6cdd5979106361c69ad8470d9fb944e14d43feb51b"} Oct 09 08:02:30 crc kubenswrapper[4715]: I1009 08:02:30.626844 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2734577f580fe7e33c1ca6cdd5979106361c69ad8470d9fb944e14d43feb51b" Oct 09 08:02:30 crc kubenswrapper[4715]: I1009 08:02:30.626923 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-gjkhs" Oct 09 08:02:30 crc kubenswrapper[4715]: I1009 08:02:30.644814 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4ed7-account-create-s26dw" event={"ID":"6da55d5b-0f99-4b96-9e08-628c1961d8e8","Type":"ContainerStarted","Data":"f32d95a1ea583f56e59754cf3ab7025d694e09cbbdfab33ef2a55adfc29a9a46"} Oct 09 08:02:30 crc kubenswrapper[4715]: I1009 08:02:30.653222 4715 generic.go:334] "Generic (PLEG): container finished" podID="1673772c-a772-4ad8-85c3-f68268965d4b" containerID="75a982e1d44680c2b96f722a9924ebc5c281f5c8f11940251e6f50af37404454" exitCode=0 Oct 09 08:02:30 crc kubenswrapper[4715]: I1009 08:02:30.653262 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1673772c-a772-4ad8-85c3-f68268965d4b","Type":"ContainerDied","Data":"75a982e1d44680c2b96f722a9924ebc5c281f5c8f11940251e6f50af37404454"} Oct 09 08:02:30 crc kubenswrapper[4715]: I1009 08:02:30.794554 4715 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-xfr2w" podUID="2a1f06ac-c4c8-4884-bb1a-360fbaf03adf" containerName="ovn-controller" probeResult="failure" output=< Oct 09 08:02:30 crc kubenswrapper[4715]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 09 08:02:30 crc kubenswrapper[4715]: > Oct 09 08:02:30 crc kubenswrapper[4715]: I1009 08:02:30.838320 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-gckmt" Oct 09 08:02:31 crc kubenswrapper[4715]: I1009 08:02:31.053022 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5432-account-create-9gr67"] Oct 09 08:02:31 crc kubenswrapper[4715]: W1009 08:02:31.096702 4715 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0c78197_391b_4ebc_bc19_5bd09c64f99c.slice/crio-b724e662c6dd903a8fe3769a7817fe71a19287040f217da3602a7ef850940c89 WatchSource:0}: Error finding container b724e662c6dd903a8fe3769a7817fe71a19287040f217da3602a7ef850940c89: Status 404 returned error can't find the container with id b724e662c6dd903a8fe3769a7817fe71a19287040f217da3602a7ef850940c89 Oct 09 08:02:31 crc kubenswrapper[4715]: I1009 08:02:31.662908 4715 generic.go:334] "Generic (PLEG): container finished" podID="a4714af0-14ef-4513-ac5e-dbf4aa99079b" containerID="1bd128cd2c654c89d405e27fdb17e74880b73ef2fcd7b473522b9790ccdfeb29" exitCode=0 Oct 09 08:02:31 crc kubenswrapper[4715]: I1009 08:02:31.662982 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a4714af0-14ef-4513-ac5e-dbf4aa99079b","Type":"ContainerDied","Data":"1bd128cd2c654c89d405e27fdb17e74880b73ef2fcd7b473522b9790ccdfeb29"} Oct 09 08:02:31 crc kubenswrapper[4715]: I1009 08:02:31.664379 4715 generic.go:334] "Generic (PLEG): container finished" podID="6da55d5b-0f99-4b96-9e08-628c1961d8e8" containerID="8efb562820210438a8461b6a8c0cbd66afea476d7ead5c767d4176201b05d855" exitCode=0 Oct 09 08:02:31 crc kubenswrapper[4715]: I1009 08:02:31.664452 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4ed7-account-create-s26dw" event={"ID":"6da55d5b-0f99-4b96-9e08-628c1961d8e8","Type":"ContainerDied","Data":"8efb562820210438a8461b6a8c0cbd66afea476d7ead5c767d4176201b05d855"} Oct 09 08:02:31 crc kubenswrapper[4715]: I1009 08:02:31.666885 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1673772c-a772-4ad8-85c3-f68268965d4b","Type":"ContainerStarted","Data":"96111493a6fbd5d69aad7e7bb948021003031322bc366f71e9e97e9d2a638bcc"} Oct 09 08:02:31 crc kubenswrapper[4715]: I1009 08:02:31.667404 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/rabbitmq-server-0" Oct 09 08:02:31 crc kubenswrapper[4715]: I1009 08:02:31.669303 4715 generic.go:334] "Generic (PLEG): container finished" podID="f0c78197-391b-4ebc-bc19-5bd09c64f99c" containerID="4dd6256ff800cfcb991ebadea1a87783fdf26a641665a212aee1f0caa3ae84a4" exitCode=0 Oct 09 08:02:31 crc kubenswrapper[4715]: I1009 08:02:31.669389 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5432-account-create-9gr67" event={"ID":"f0c78197-391b-4ebc-bc19-5bd09c64f99c","Type":"ContainerDied","Data":"4dd6256ff800cfcb991ebadea1a87783fdf26a641665a212aee1f0caa3ae84a4"} Oct 09 08:02:31 crc kubenswrapper[4715]: I1009 08:02:31.669448 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5432-account-create-9gr67" event={"ID":"f0c78197-391b-4ebc-bc19-5bd09c64f99c","Type":"ContainerStarted","Data":"b724e662c6dd903a8fe3769a7817fe71a19287040f217da3602a7ef850940c89"} Oct 09 08:02:31 crc kubenswrapper[4715]: I1009 08:02:31.673109 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d6a5f2b-d77d-41c9-8b7d-e2e62c157577","Type":"ContainerStarted","Data":"322ade1fc244f268a9ee6cccf58353e03a99c41ae7ef07bb93a7434266a11dcc"} Oct 09 08:02:31 crc kubenswrapper[4715]: I1009 08:02:31.760087 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=49.026358731 podStartE2EDuration="56.76006664s" podCreationTimestamp="2025-10-09 08:01:35 +0000 UTC" firstStartedPulling="2025-10-09 08:01:49.049584811 +0000 UTC m=+939.742388829" lastFinishedPulling="2025-10-09 08:01:56.78329273 +0000 UTC m=+947.476096738" observedRunningTime="2025-10-09 08:02:31.752991554 +0000 UTC m=+982.445795562" watchObservedRunningTime="2025-10-09 08:02:31.76006664 +0000 UTC m=+982.452870648" Oct 09 08:02:32 crc kubenswrapper[4715]: I1009 08:02:32.682926 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a4714af0-14ef-4513-ac5e-dbf4aa99079b","Type":"ContainerStarted","Data":"a63af1c17453b501a884b7fa445b704d3f526af04c984c5025876fa24104160f"} Oct 09 08:02:32 crc kubenswrapper[4715]: I1009 08:02:32.683485 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:02:32 crc kubenswrapper[4715]: I1009 08:02:32.686276 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d6a5f2b-d77d-41c9-8b7d-e2e62c157577","Type":"ContainerStarted","Data":"912a0328f3b953830599483a614be1bfbaf98a983731a71276298ff9bf2c4c23"} Oct 09 08:02:32 crc kubenswrapper[4715]: I1009 08:02:32.686454 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d6a5f2b-d77d-41c9-8b7d-e2e62c157577","Type":"ContainerStarted","Data":"d67c5e3978ed81c880db2938d30aa1d6dafe8fe974c6e829dcbe74625515e365"} Oct 09 08:02:32 crc kubenswrapper[4715]: I1009 08:02:32.686590 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d6a5f2b-d77d-41c9-8b7d-e2e62c157577","Type":"ContainerStarted","Data":"31c8fdcf967b73b465695d31961dc0c1b5ab6a943fa34db7b663a88fb6576483"} Oct 09 08:02:32 crc kubenswrapper[4715]: I1009 08:02:32.716594 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=50.555987293 podStartE2EDuration="57.716575896s" podCreationTimestamp="2025-10-09 08:01:35 +0000 UTC" firstStartedPulling="2025-10-09 08:01:49.164772911 +0000 UTC m=+939.857576919" lastFinishedPulling="2025-10-09 08:01:56.325361514 +0000 UTC m=+947.018165522" observedRunningTime="2025-10-09 08:02:32.716269917 +0000 UTC m=+983.409073945" watchObservedRunningTime="2025-10-09 08:02:32.716575896 +0000 UTC m=+983.409379904" Oct 09 08:02:32 crc kubenswrapper[4715]: I1009 08:02:32.980987 4715 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/keystone-4ed7-account-create-s26dw" Oct 09 08:02:33 crc kubenswrapper[4715]: I1009 08:02:33.051965 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxj6f\" (UniqueName: \"kubernetes.io/projected/6da55d5b-0f99-4b96-9e08-628c1961d8e8-kube-api-access-jxj6f\") pod \"6da55d5b-0f99-4b96-9e08-628c1961d8e8\" (UID: \"6da55d5b-0f99-4b96-9e08-628c1961d8e8\") " Oct 09 08:02:33 crc kubenswrapper[4715]: I1009 08:02:33.059826 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6da55d5b-0f99-4b96-9e08-628c1961d8e8-kube-api-access-jxj6f" (OuterVolumeSpecName: "kube-api-access-jxj6f") pod "6da55d5b-0f99-4b96-9e08-628c1961d8e8" (UID: "6da55d5b-0f99-4b96-9e08-628c1961d8e8"). InnerVolumeSpecName "kube-api-access-jxj6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:02:33 crc kubenswrapper[4715]: I1009 08:02:33.112543 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5432-account-create-9gr67" Oct 09 08:02:33 crc kubenswrapper[4715]: I1009 08:02:33.155000 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxj6f\" (UniqueName: \"kubernetes.io/projected/6da55d5b-0f99-4b96-9e08-628c1961d8e8-kube-api-access-jxj6f\") on node \"crc\" DevicePath \"\"" Oct 09 08:02:33 crc kubenswrapper[4715]: I1009 08:02:33.256028 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfghv\" (UniqueName: \"kubernetes.io/projected/f0c78197-391b-4ebc-bc19-5bd09c64f99c-kube-api-access-xfghv\") pod \"f0c78197-391b-4ebc-bc19-5bd09c64f99c\" (UID: \"f0c78197-391b-4ebc-bc19-5bd09c64f99c\") " Oct 09 08:02:33 crc kubenswrapper[4715]: I1009 08:02:33.261793 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0c78197-391b-4ebc-bc19-5bd09c64f99c-kube-api-access-xfghv" (OuterVolumeSpecName: "kube-api-access-xfghv") pod "f0c78197-391b-4ebc-bc19-5bd09c64f99c" (UID: "f0c78197-391b-4ebc-bc19-5bd09c64f99c"). InnerVolumeSpecName "kube-api-access-xfghv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:02:33 crc kubenswrapper[4715]: I1009 08:02:33.358803 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfghv\" (UniqueName: \"kubernetes.io/projected/f0c78197-391b-4ebc-bc19-5bd09c64f99c-kube-api-access-xfghv\") on node \"crc\" DevicePath \"\"" Oct 09 08:02:33 crc kubenswrapper[4715]: I1009 08:02:33.702574 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4ed7-account-create-s26dw" event={"ID":"6da55d5b-0f99-4b96-9e08-628c1961d8e8","Type":"ContainerDied","Data":"f32d95a1ea583f56e59754cf3ab7025d694e09cbbdfab33ef2a55adfc29a9a46"} Oct 09 08:02:33 crc kubenswrapper[4715]: I1009 08:02:33.702936 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f32d95a1ea583f56e59754cf3ab7025d694e09cbbdfab33ef2a55adfc29a9a46" Oct 09 08:02:33 crc kubenswrapper[4715]: I1009 08:02:33.702998 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4ed7-account-create-s26dw" Oct 09 08:02:33 crc kubenswrapper[4715]: I1009 08:02:33.704986 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5432-account-create-9gr67" event={"ID":"f0c78197-391b-4ebc-bc19-5bd09c64f99c","Type":"ContainerDied","Data":"b724e662c6dd903a8fe3769a7817fe71a19287040f217da3602a7ef850940c89"} Oct 09 08:02:33 crc kubenswrapper[4715]: I1009 08:02:33.705021 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b724e662c6dd903a8fe3769a7817fe71a19287040f217da3602a7ef850940c89" Oct 09 08:02:33 crc kubenswrapper[4715]: I1009 08:02:33.705068 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5432-account-create-9gr67" Oct 09 08:02:33 crc kubenswrapper[4715]: I1009 08:02:33.722597 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d6a5f2b-d77d-41c9-8b7d-e2e62c157577","Type":"ContainerStarted","Data":"a4077564ab4e482aca6b7ae8945497da2913693c82c6bd917a35990373d1ff4b"} Oct 09 08:02:33 crc kubenswrapper[4715]: E1009 08:02:33.796684 4715 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6da55d5b_0f99_4b96_9e08_628c1961d8e8.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0c78197_391b_4ebc_bc19_5bd09c64f99c.slice\": RecentStats: unable to find data in memory cache]" Oct 09 08:02:34 crc kubenswrapper[4715]: I1009 08:02:34.734716 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d6a5f2b-d77d-41c9-8b7d-e2e62c157577","Type":"ContainerStarted","Data":"7105b01d01d5ce9c74fad249af26f69a615847c4cd029fb86ea03a67029c1842"} Oct 09 08:02:34 crc kubenswrapper[4715]: I1009 08:02:34.735093 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d6a5f2b-d77d-41c9-8b7d-e2e62c157577","Type":"ContainerStarted","Data":"cec6648b8909dc8c2b01177008a24e6a872858cd477485777a4a7c02e7431fd6"} Oct 09 08:02:34 crc kubenswrapper[4715]: I1009 08:02:34.735109 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d6a5f2b-d77d-41c9-8b7d-e2e62c157577","Type":"ContainerStarted","Data":"03d2f0f2946393c0640a92154e56a095898fac694ba26e9191504c01b5388584"} Oct 09 08:02:35 crc kubenswrapper[4715]: I1009 08:02:35.257036 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-52c5-account-create-c6qsf"] Oct 09 08:02:35 crc kubenswrapper[4715]: 
E1009 08:02:35.257365 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6da55d5b-0f99-4b96-9e08-628c1961d8e8" containerName="mariadb-account-create" Oct 09 08:02:35 crc kubenswrapper[4715]: I1009 08:02:35.257379 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="6da55d5b-0f99-4b96-9e08-628c1961d8e8" containerName="mariadb-account-create" Oct 09 08:02:35 crc kubenswrapper[4715]: E1009 08:02:35.257397 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0c78197-391b-4ebc-bc19-5bd09c64f99c" containerName="mariadb-account-create" Oct 09 08:02:35 crc kubenswrapper[4715]: I1009 08:02:35.257404 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0c78197-391b-4ebc-bc19-5bd09c64f99c" containerName="mariadb-account-create" Oct 09 08:02:35 crc kubenswrapper[4715]: I1009 08:02:35.257613 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="6da55d5b-0f99-4b96-9e08-628c1961d8e8" containerName="mariadb-account-create" Oct 09 08:02:35 crc kubenswrapper[4715]: I1009 08:02:35.257651 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0c78197-391b-4ebc-bc19-5bd09c64f99c" containerName="mariadb-account-create" Oct 09 08:02:35 crc kubenswrapper[4715]: I1009 08:02:35.258125 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-52c5-account-create-c6qsf" Oct 09 08:02:35 crc kubenswrapper[4715]: I1009 08:02:35.264899 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 09 08:02:35 crc kubenswrapper[4715]: I1009 08:02:35.272627 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-52c5-account-create-c6qsf"] Oct 09 08:02:35 crc kubenswrapper[4715]: I1009 08:02:35.394706 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjt96\" (UniqueName: \"kubernetes.io/projected/3c98c9b5-1e23-4f9c-927f-b9b33da85410-kube-api-access-qjt96\") pod \"glance-52c5-account-create-c6qsf\" (UID: \"3c98c9b5-1e23-4f9c-927f-b9b33da85410\") " pod="openstack/glance-52c5-account-create-c6qsf" Oct 09 08:02:35 crc kubenswrapper[4715]: I1009 08:02:35.495987 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjt96\" (UniqueName: \"kubernetes.io/projected/3c98c9b5-1e23-4f9c-927f-b9b33da85410-kube-api-access-qjt96\") pod \"glance-52c5-account-create-c6qsf\" (UID: \"3c98c9b5-1e23-4f9c-927f-b9b33da85410\") " pod="openstack/glance-52c5-account-create-c6qsf" Oct 09 08:02:35 crc kubenswrapper[4715]: I1009 08:02:35.518646 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjt96\" (UniqueName: \"kubernetes.io/projected/3c98c9b5-1e23-4f9c-927f-b9b33da85410-kube-api-access-qjt96\") pod \"glance-52c5-account-create-c6qsf\" (UID: \"3c98c9b5-1e23-4f9c-927f-b9b33da85410\") " pod="openstack/glance-52c5-account-create-c6qsf" Oct 09 08:02:35 crc kubenswrapper[4715]: I1009 08:02:35.577236 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-52c5-account-create-c6qsf" Oct 09 08:02:35 crc kubenswrapper[4715]: I1009 08:02:35.797987 4715 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-xfr2w" podUID="2a1f06ac-c4c8-4884-bb1a-360fbaf03adf" containerName="ovn-controller" probeResult="failure" output=< Oct 09 08:02:35 crc kubenswrapper[4715]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 09 08:02:35 crc kubenswrapper[4715]: > Oct 09 08:02:35 crc kubenswrapper[4715]: I1009 08:02:35.843029 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-gckmt" Oct 09 08:02:36 crc kubenswrapper[4715]: I1009 08:02:36.052334 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-xfr2w-config-p4stt"] Oct 09 08:02:36 crc kubenswrapper[4715]: I1009 08:02:36.053875 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xfr2w-config-p4stt" Oct 09 08:02:36 crc kubenswrapper[4715]: I1009 08:02:36.056132 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 09 08:02:36 crc kubenswrapper[4715]: I1009 08:02:36.071224 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-xfr2w-config-p4stt"] Oct 09 08:02:36 crc kubenswrapper[4715]: I1009 08:02:36.102019 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-52c5-account-create-c6qsf"] Oct 09 08:02:36 crc kubenswrapper[4715]: I1009 08:02:36.214813 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/13f2d9ac-679a-40ea-a013-bdeefbf34e72-additional-scripts\") pod \"ovn-controller-xfr2w-config-p4stt\" (UID: \"13f2d9ac-679a-40ea-a013-bdeefbf34e72\") " pod="openstack/ovn-controller-xfr2w-config-p4stt" Oct 09 08:02:36 
crc kubenswrapper[4715]: I1009 08:02:36.214989 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5h48\" (UniqueName: \"kubernetes.io/projected/13f2d9ac-679a-40ea-a013-bdeefbf34e72-kube-api-access-d5h48\") pod \"ovn-controller-xfr2w-config-p4stt\" (UID: \"13f2d9ac-679a-40ea-a013-bdeefbf34e72\") " pod="openstack/ovn-controller-xfr2w-config-p4stt" Oct 09 08:02:36 crc kubenswrapper[4715]: I1009 08:02:36.215131 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/13f2d9ac-679a-40ea-a013-bdeefbf34e72-var-log-ovn\") pod \"ovn-controller-xfr2w-config-p4stt\" (UID: \"13f2d9ac-679a-40ea-a013-bdeefbf34e72\") " pod="openstack/ovn-controller-xfr2w-config-p4stt" Oct 09 08:02:36 crc kubenswrapper[4715]: I1009 08:02:36.215211 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13f2d9ac-679a-40ea-a013-bdeefbf34e72-scripts\") pod \"ovn-controller-xfr2w-config-p4stt\" (UID: \"13f2d9ac-679a-40ea-a013-bdeefbf34e72\") " pod="openstack/ovn-controller-xfr2w-config-p4stt" Oct 09 08:02:36 crc kubenswrapper[4715]: I1009 08:02:36.216245 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/13f2d9ac-679a-40ea-a013-bdeefbf34e72-var-run\") pod \"ovn-controller-xfr2w-config-p4stt\" (UID: \"13f2d9ac-679a-40ea-a013-bdeefbf34e72\") " pod="openstack/ovn-controller-xfr2w-config-p4stt" Oct 09 08:02:36 crc kubenswrapper[4715]: I1009 08:02:36.216320 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/13f2d9ac-679a-40ea-a013-bdeefbf34e72-var-run-ovn\") pod \"ovn-controller-xfr2w-config-p4stt\" (UID: \"13f2d9ac-679a-40ea-a013-bdeefbf34e72\") " 
pod="openstack/ovn-controller-xfr2w-config-p4stt" Oct 09 08:02:36 crc kubenswrapper[4715]: I1009 08:02:36.318310 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/13f2d9ac-679a-40ea-a013-bdeefbf34e72-additional-scripts\") pod \"ovn-controller-xfr2w-config-p4stt\" (UID: \"13f2d9ac-679a-40ea-a013-bdeefbf34e72\") " pod="openstack/ovn-controller-xfr2w-config-p4stt" Oct 09 08:02:36 crc kubenswrapper[4715]: I1009 08:02:36.318361 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5h48\" (UniqueName: \"kubernetes.io/projected/13f2d9ac-679a-40ea-a013-bdeefbf34e72-kube-api-access-d5h48\") pod \"ovn-controller-xfr2w-config-p4stt\" (UID: \"13f2d9ac-679a-40ea-a013-bdeefbf34e72\") " pod="openstack/ovn-controller-xfr2w-config-p4stt" Oct 09 08:02:36 crc kubenswrapper[4715]: I1009 08:02:36.318415 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/13f2d9ac-679a-40ea-a013-bdeefbf34e72-var-log-ovn\") pod \"ovn-controller-xfr2w-config-p4stt\" (UID: \"13f2d9ac-679a-40ea-a013-bdeefbf34e72\") " pod="openstack/ovn-controller-xfr2w-config-p4stt" Oct 09 08:02:36 crc kubenswrapper[4715]: I1009 08:02:36.318456 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13f2d9ac-679a-40ea-a013-bdeefbf34e72-scripts\") pod \"ovn-controller-xfr2w-config-p4stt\" (UID: \"13f2d9ac-679a-40ea-a013-bdeefbf34e72\") " pod="openstack/ovn-controller-xfr2w-config-p4stt" Oct 09 08:02:36 crc kubenswrapper[4715]: I1009 08:02:36.318491 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/13f2d9ac-679a-40ea-a013-bdeefbf34e72-var-run\") pod \"ovn-controller-xfr2w-config-p4stt\" (UID: \"13f2d9ac-679a-40ea-a013-bdeefbf34e72\") " 
pod="openstack/ovn-controller-xfr2w-config-p4stt" Oct 09 08:02:36 crc kubenswrapper[4715]: I1009 08:02:36.318514 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/13f2d9ac-679a-40ea-a013-bdeefbf34e72-var-run-ovn\") pod \"ovn-controller-xfr2w-config-p4stt\" (UID: \"13f2d9ac-679a-40ea-a013-bdeefbf34e72\") " pod="openstack/ovn-controller-xfr2w-config-p4stt" Oct 09 08:02:36 crc kubenswrapper[4715]: I1009 08:02:36.318770 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/13f2d9ac-679a-40ea-a013-bdeefbf34e72-var-run-ovn\") pod \"ovn-controller-xfr2w-config-p4stt\" (UID: \"13f2d9ac-679a-40ea-a013-bdeefbf34e72\") " pod="openstack/ovn-controller-xfr2w-config-p4stt" Oct 09 08:02:36 crc kubenswrapper[4715]: I1009 08:02:36.319004 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/13f2d9ac-679a-40ea-a013-bdeefbf34e72-additional-scripts\") pod \"ovn-controller-xfr2w-config-p4stt\" (UID: \"13f2d9ac-679a-40ea-a013-bdeefbf34e72\") " pod="openstack/ovn-controller-xfr2w-config-p4stt" Oct 09 08:02:36 crc kubenswrapper[4715]: I1009 08:02:36.319168 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/13f2d9ac-679a-40ea-a013-bdeefbf34e72-var-run\") pod \"ovn-controller-xfr2w-config-p4stt\" (UID: \"13f2d9ac-679a-40ea-a013-bdeefbf34e72\") " pod="openstack/ovn-controller-xfr2w-config-p4stt" Oct 09 08:02:36 crc kubenswrapper[4715]: I1009 08:02:36.319215 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/13f2d9ac-679a-40ea-a013-bdeefbf34e72-var-log-ovn\") pod \"ovn-controller-xfr2w-config-p4stt\" (UID: \"13f2d9ac-679a-40ea-a013-bdeefbf34e72\") " pod="openstack/ovn-controller-xfr2w-config-p4stt" Oct 09 
08:02:36 crc kubenswrapper[4715]: I1009 08:02:36.320880 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13f2d9ac-679a-40ea-a013-bdeefbf34e72-scripts\") pod \"ovn-controller-xfr2w-config-p4stt\" (UID: \"13f2d9ac-679a-40ea-a013-bdeefbf34e72\") " pod="openstack/ovn-controller-xfr2w-config-p4stt" Oct 09 08:02:36 crc kubenswrapper[4715]: W1009 08:02:36.324378 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c98c9b5_1e23_4f9c_927f_b9b33da85410.slice/crio-b675ccae0881eafaf35dd1eead5099119f7b7f40204d3b443c53cafe03592fa5 WatchSource:0}: Error finding container b675ccae0881eafaf35dd1eead5099119f7b7f40204d3b443c53cafe03592fa5: Status 404 returned error can't find the container with id b675ccae0881eafaf35dd1eead5099119f7b7f40204d3b443c53cafe03592fa5 Oct 09 08:02:36 crc kubenswrapper[4715]: I1009 08:02:36.336462 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5h48\" (UniqueName: \"kubernetes.io/projected/13f2d9ac-679a-40ea-a013-bdeefbf34e72-kube-api-access-d5h48\") pod \"ovn-controller-xfr2w-config-p4stt\" (UID: \"13f2d9ac-679a-40ea-a013-bdeefbf34e72\") " pod="openstack/ovn-controller-xfr2w-config-p4stt" Oct 09 08:02:36 crc kubenswrapper[4715]: I1009 08:02:36.383237 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-xfr2w-config-p4stt" Oct 09 08:02:36 crc kubenswrapper[4715]: I1009 08:02:36.756376 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d6a5f2b-d77d-41c9-8b7d-e2e62c157577","Type":"ContainerStarted","Data":"4cc2549832a0a863ed661a9223e4c43a7066e93dcd15935c41e184fef901d5b1"} Oct 09 08:02:36 crc kubenswrapper[4715]: I1009 08:02:36.758552 4715 generic.go:334] "Generic (PLEG): container finished" podID="3c98c9b5-1e23-4f9c-927f-b9b33da85410" containerID="ec0d3a8d53ca533d5e6d71c32953b95716c0d242d989bc1a226499e1438bca8b" exitCode=0 Oct 09 08:02:36 crc kubenswrapper[4715]: I1009 08:02:36.758605 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-52c5-account-create-c6qsf" event={"ID":"3c98c9b5-1e23-4f9c-927f-b9b33da85410","Type":"ContainerDied","Data":"ec0d3a8d53ca533d5e6d71c32953b95716c0d242d989bc1a226499e1438bca8b"} Oct 09 08:02:36 crc kubenswrapper[4715]: I1009 08:02:36.758635 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-52c5-account-create-c6qsf" event={"ID":"3c98c9b5-1e23-4f9c-927f-b9b33da85410","Type":"ContainerStarted","Data":"b675ccae0881eafaf35dd1eead5099119f7b7f40204d3b443c53cafe03592fa5"} Oct 09 08:02:36 crc kubenswrapper[4715]: I1009 08:02:36.940800 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-xfr2w-config-p4stt"] Oct 09 08:02:37 crc kubenswrapper[4715]: I1009 08:02:37.773412 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xfr2w-config-p4stt" event={"ID":"13f2d9ac-679a-40ea-a013-bdeefbf34e72","Type":"ContainerStarted","Data":"d0fb21496c5e66aeecbf6a5984fed7d8dab239ddd4fbcf0ff397c5d9c9acb9e6"} Oct 09 08:02:37 crc kubenswrapper[4715]: I1009 08:02:37.773956 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xfr2w-config-p4stt" 
event={"ID":"13f2d9ac-679a-40ea-a013-bdeefbf34e72","Type":"ContainerStarted","Data":"186dc413fcbce5e5016b2490c8275a435cca97a255346aad8f31d463f5939862"} Oct 09 08:02:37 crc kubenswrapper[4715]: I1009 08:02:37.786624 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d6a5f2b-d77d-41c9-8b7d-e2e62c157577","Type":"ContainerStarted","Data":"996ac05bbdd37929e004455489f920d9a593c69c74b6c3628287f791d37fd31b"} Oct 09 08:02:37 crc kubenswrapper[4715]: I1009 08:02:37.786803 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d6a5f2b-d77d-41c9-8b7d-e2e62c157577","Type":"ContainerStarted","Data":"afc1b2b12265a4010aa081d9100dfbe5722d50a7397375ce0d71423a6031635f"} Oct 09 08:02:37 crc kubenswrapper[4715]: I1009 08:02:37.786892 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d6a5f2b-d77d-41c9-8b7d-e2e62c157577","Type":"ContainerStarted","Data":"5ff6c0b60fbbbc9f30b36f0502a852e9fe657e3fc5f86096b0c2bf148af9881d"} Oct 09 08:02:37 crc kubenswrapper[4715]: I1009 08:02:37.787044 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d6a5f2b-d77d-41c9-8b7d-e2e62c157577","Type":"ContainerStarted","Data":"34ce859ec1879f90ca785b7950f0d3e998263681a65aa129aec5e01f5b107152"} Oct 09 08:02:37 crc kubenswrapper[4715]: I1009 08:02:37.787124 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d6a5f2b-d77d-41c9-8b7d-e2e62c157577","Type":"ContainerStarted","Data":"8eb78bd1ffa2bc7d3dc8f8a782823e662a2e574a6111dabb11585a83a492d8c9"} Oct 09 08:02:37 crc kubenswrapper[4715]: I1009 08:02:37.787198 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d6a5f2b-d77d-41c9-8b7d-e2e62c157577","Type":"ContainerStarted","Data":"8b07080a5512ed95bf20718c34480b1b91332d695f952b6fe22c8a0620846781"} Oct 09 08:02:37 crc 
kubenswrapper[4715]: I1009 08:02:37.807669 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-xfr2w-config-p4stt" podStartSLOduration=1.807650744 podStartE2EDuration="1.807650744s" podCreationTimestamp="2025-10-09 08:02:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:02:37.805770149 +0000 UTC m=+988.498574177" watchObservedRunningTime="2025-10-09 08:02:37.807650744 +0000 UTC m=+988.500454762" Oct 09 08:02:37 crc kubenswrapper[4715]: I1009 08:02:37.855935 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.005765192 podStartE2EDuration="26.855912412s" podCreationTimestamp="2025-10-09 08:02:11 +0000 UTC" firstStartedPulling="2025-10-09 08:02:29.518094235 +0000 UTC m=+980.210898243" lastFinishedPulling="2025-10-09 08:02:36.368241465 +0000 UTC m=+987.061045463" observedRunningTime="2025-10-09 08:02:37.8520836 +0000 UTC m=+988.544887618" watchObservedRunningTime="2025-10-09 08:02:37.855912412 +0000 UTC m=+988.548716420" Oct 09 08:02:38 crc kubenswrapper[4715]: I1009 08:02:38.176278 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-vgpch"] Oct 09 08:02:38 crc kubenswrapper[4715]: I1009 08:02:38.177497 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-vgpch"] Oct 09 08:02:38 crc kubenswrapper[4715]: I1009 08:02:38.177579 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-vgpch" Oct 09 08:02:38 crc kubenswrapper[4715]: I1009 08:02:38.180592 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 09 08:02:38 crc kubenswrapper[4715]: I1009 08:02:38.181152 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-52c5-account-create-c6qsf" Oct 09 08:02:38 crc kubenswrapper[4715]: I1009 08:02:38.269268 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjt96\" (UniqueName: \"kubernetes.io/projected/3c98c9b5-1e23-4f9c-927f-b9b33da85410-kube-api-access-qjt96\") pod \"3c98c9b5-1e23-4f9c-927f-b9b33da85410\" (UID: \"3c98c9b5-1e23-4f9c-927f-b9b33da85410\") " Oct 09 08:02:38 crc kubenswrapper[4715]: I1009 08:02:38.269869 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0-config\") pod \"dnsmasq-dns-77585f5f8c-vgpch\" (UID: \"5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0\") " pod="openstack/dnsmasq-dns-77585f5f8c-vgpch" Oct 09 08:02:38 crc kubenswrapper[4715]: I1009 08:02:38.269950 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-vgpch\" (UID: \"5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0\") " pod="openstack/dnsmasq-dns-77585f5f8c-vgpch" Oct 09 08:02:38 crc kubenswrapper[4715]: I1009 08:02:38.269983 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-vgpch\" (UID: \"5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0\") " pod="openstack/dnsmasq-dns-77585f5f8c-vgpch" Oct 09 08:02:38 crc kubenswrapper[4715]: I1009 08:02:38.270014 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-vgpch\" (UID: 
\"5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0\") " pod="openstack/dnsmasq-dns-77585f5f8c-vgpch" Oct 09 08:02:38 crc kubenswrapper[4715]: I1009 08:02:38.270072 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ggh9\" (UniqueName: \"kubernetes.io/projected/5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0-kube-api-access-2ggh9\") pod \"dnsmasq-dns-77585f5f8c-vgpch\" (UID: \"5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0\") " pod="openstack/dnsmasq-dns-77585f5f8c-vgpch" Oct 09 08:02:38 crc kubenswrapper[4715]: I1009 08:02:38.270117 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-vgpch\" (UID: \"5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0\") " pod="openstack/dnsmasq-dns-77585f5f8c-vgpch" Oct 09 08:02:38 crc kubenswrapper[4715]: I1009 08:02:38.286305 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c98c9b5-1e23-4f9c-927f-b9b33da85410-kube-api-access-qjt96" (OuterVolumeSpecName: "kube-api-access-qjt96") pod "3c98c9b5-1e23-4f9c-927f-b9b33da85410" (UID: "3c98c9b5-1e23-4f9c-927f-b9b33da85410"). InnerVolumeSpecName "kube-api-access-qjt96". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:02:38 crc kubenswrapper[4715]: I1009 08:02:38.371354 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-vgpch\" (UID: \"5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0\") " pod="openstack/dnsmasq-dns-77585f5f8c-vgpch" Oct 09 08:02:38 crc kubenswrapper[4715]: I1009 08:02:38.371434 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-vgpch\" (UID: \"5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0\") " pod="openstack/dnsmasq-dns-77585f5f8c-vgpch" Oct 09 08:02:38 crc kubenswrapper[4715]: I1009 08:02:38.371493 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ggh9\" (UniqueName: \"kubernetes.io/projected/5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0-kube-api-access-2ggh9\") pod \"dnsmasq-dns-77585f5f8c-vgpch\" (UID: \"5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0\") " pod="openstack/dnsmasq-dns-77585f5f8c-vgpch" Oct 09 08:02:38 crc kubenswrapper[4715]: I1009 08:02:38.371539 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-vgpch\" (UID: \"5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0\") " pod="openstack/dnsmasq-dns-77585f5f8c-vgpch" Oct 09 08:02:38 crc kubenswrapper[4715]: I1009 08:02:38.371583 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0-config\") pod \"dnsmasq-dns-77585f5f8c-vgpch\" (UID: \"5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0\") " pod="openstack/dnsmasq-dns-77585f5f8c-vgpch" Oct 
09 08:02:38 crc kubenswrapper[4715]: I1009 08:02:38.371664 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-vgpch\" (UID: \"5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0\") " pod="openstack/dnsmasq-dns-77585f5f8c-vgpch" Oct 09 08:02:38 crc kubenswrapper[4715]: I1009 08:02:38.371851 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjt96\" (UniqueName: \"kubernetes.io/projected/3c98c9b5-1e23-4f9c-927f-b9b33da85410-kube-api-access-qjt96\") on node \"crc\" DevicePath \"\"" Oct 09 08:02:38 crc kubenswrapper[4715]: I1009 08:02:38.372748 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-vgpch\" (UID: \"5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0\") " pod="openstack/dnsmasq-dns-77585f5f8c-vgpch" Oct 09 08:02:38 crc kubenswrapper[4715]: I1009 08:02:38.372777 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-vgpch\" (UID: \"5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0\") " pod="openstack/dnsmasq-dns-77585f5f8c-vgpch" Oct 09 08:02:38 crc kubenswrapper[4715]: I1009 08:02:38.372817 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-vgpch\" (UID: \"5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0\") " pod="openstack/dnsmasq-dns-77585f5f8c-vgpch" Oct 09 08:02:38 crc kubenswrapper[4715]: I1009 08:02:38.372880 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-vgpch\" (UID: \"5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0\") " pod="openstack/dnsmasq-dns-77585f5f8c-vgpch" Oct 09 08:02:38 crc kubenswrapper[4715]: I1009 08:02:38.372880 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0-config\") pod \"dnsmasq-dns-77585f5f8c-vgpch\" (UID: \"5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0\") " pod="openstack/dnsmasq-dns-77585f5f8c-vgpch" Oct 09 08:02:38 crc kubenswrapper[4715]: I1009 08:02:38.389291 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ggh9\" (UniqueName: \"kubernetes.io/projected/5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0-kube-api-access-2ggh9\") pod \"dnsmasq-dns-77585f5f8c-vgpch\" (UID: \"5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0\") " pod="openstack/dnsmasq-dns-77585f5f8c-vgpch" Oct 09 08:02:38 crc kubenswrapper[4715]: I1009 08:02:38.508816 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-vgpch" Oct 09 08:02:38 crc kubenswrapper[4715]: I1009 08:02:38.796591 4715 generic.go:334] "Generic (PLEG): container finished" podID="13f2d9ac-679a-40ea-a013-bdeefbf34e72" containerID="d0fb21496c5e66aeecbf6a5984fed7d8dab239ddd4fbcf0ff397c5d9c9acb9e6" exitCode=0 Oct 09 08:02:38 crc kubenswrapper[4715]: I1009 08:02:38.796695 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xfr2w-config-p4stt" event={"ID":"13f2d9ac-679a-40ea-a013-bdeefbf34e72","Type":"ContainerDied","Data":"d0fb21496c5e66aeecbf6a5984fed7d8dab239ddd4fbcf0ff397c5d9c9acb9e6"} Oct 09 08:02:38 crc kubenswrapper[4715]: I1009 08:02:38.799011 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-52c5-account-create-c6qsf" event={"ID":"3c98c9b5-1e23-4f9c-927f-b9b33da85410","Type":"ContainerDied","Data":"b675ccae0881eafaf35dd1eead5099119f7b7f40204d3b443c53cafe03592fa5"} Oct 09 08:02:38 crc kubenswrapper[4715]: I1009 08:02:38.799042 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b675ccae0881eafaf35dd1eead5099119f7b7f40204d3b443c53cafe03592fa5" Oct 09 08:02:38 crc kubenswrapper[4715]: I1009 08:02:38.799046 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-52c5-account-create-c6qsf" Oct 09 08:02:38 crc kubenswrapper[4715]: I1009 08:02:38.943881 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-vgpch"] Oct 09 08:02:39 crc kubenswrapper[4715]: I1009 08:02:39.808779 4715 generic.go:334] "Generic (PLEG): container finished" podID="5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0" containerID="d4b8a4b383afc3fe244881002dee2297aa56aa3f7f53da26f351a431f4e1a452" exitCode=0 Oct 09 08:02:39 crc kubenswrapper[4715]: I1009 08:02:39.808870 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-vgpch" event={"ID":"5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0","Type":"ContainerDied","Data":"d4b8a4b383afc3fe244881002dee2297aa56aa3f7f53da26f351a431f4e1a452"} Oct 09 08:02:39 crc kubenswrapper[4715]: I1009 08:02:39.808972 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-vgpch" event={"ID":"5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0","Type":"ContainerStarted","Data":"ee0a1678cdc411dee04a7c7b19005c464585333d12e9d74a2604de2d7cc7b626"} Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.175084 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-xfr2w-config-p4stt" Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.301892 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/13f2d9ac-679a-40ea-a013-bdeefbf34e72-var-log-ovn\") pod \"13f2d9ac-679a-40ea-a013-bdeefbf34e72\" (UID: \"13f2d9ac-679a-40ea-a013-bdeefbf34e72\") " Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.301954 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5h48\" (UniqueName: \"kubernetes.io/projected/13f2d9ac-679a-40ea-a013-bdeefbf34e72-kube-api-access-d5h48\") pod \"13f2d9ac-679a-40ea-a013-bdeefbf34e72\" (UID: \"13f2d9ac-679a-40ea-a013-bdeefbf34e72\") " Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.302048 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/13f2d9ac-679a-40ea-a013-bdeefbf34e72-var-run\") pod \"13f2d9ac-679a-40ea-a013-bdeefbf34e72\" (UID: \"13f2d9ac-679a-40ea-a013-bdeefbf34e72\") " Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.302051 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13f2d9ac-679a-40ea-a013-bdeefbf34e72-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "13f2d9ac-679a-40ea-a013-bdeefbf34e72" (UID: "13f2d9ac-679a-40ea-a013-bdeefbf34e72"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.302117 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/13f2d9ac-679a-40ea-a013-bdeefbf34e72-additional-scripts\") pod \"13f2d9ac-679a-40ea-a013-bdeefbf34e72\" (UID: \"13f2d9ac-679a-40ea-a013-bdeefbf34e72\") " Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.302182 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/13f2d9ac-679a-40ea-a013-bdeefbf34e72-var-run-ovn\") pod \"13f2d9ac-679a-40ea-a013-bdeefbf34e72\" (UID: \"13f2d9ac-679a-40ea-a013-bdeefbf34e72\") " Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.302208 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13f2d9ac-679a-40ea-a013-bdeefbf34e72-scripts\") pod \"13f2d9ac-679a-40ea-a013-bdeefbf34e72\" (UID: \"13f2d9ac-679a-40ea-a013-bdeefbf34e72\") " Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.302224 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13f2d9ac-679a-40ea-a013-bdeefbf34e72-var-run" (OuterVolumeSpecName: "var-run") pod "13f2d9ac-679a-40ea-a013-bdeefbf34e72" (UID: "13f2d9ac-679a-40ea-a013-bdeefbf34e72"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.302280 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13f2d9ac-679a-40ea-a013-bdeefbf34e72-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "13f2d9ac-679a-40ea-a013-bdeefbf34e72" (UID: "13f2d9ac-679a-40ea-a013-bdeefbf34e72"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.302562 4715 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/13f2d9ac-679a-40ea-a013-bdeefbf34e72-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.302573 4715 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/13f2d9ac-679a-40ea-a013-bdeefbf34e72-var-run\") on node \"crc\" DevicePath \"\"" Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.302582 4715 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/13f2d9ac-679a-40ea-a013-bdeefbf34e72-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.302929 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13f2d9ac-679a-40ea-a013-bdeefbf34e72-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "13f2d9ac-679a-40ea-a013-bdeefbf34e72" (UID: "13f2d9ac-679a-40ea-a013-bdeefbf34e72"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.303168 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13f2d9ac-679a-40ea-a013-bdeefbf34e72-scripts" (OuterVolumeSpecName: "scripts") pod "13f2d9ac-679a-40ea-a013-bdeefbf34e72" (UID: "13f2d9ac-679a-40ea-a013-bdeefbf34e72"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.313451 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13f2d9ac-679a-40ea-a013-bdeefbf34e72-kube-api-access-d5h48" (OuterVolumeSpecName: "kube-api-access-d5h48") pod "13f2d9ac-679a-40ea-a013-bdeefbf34e72" (UID: "13f2d9ac-679a-40ea-a013-bdeefbf34e72"). InnerVolumeSpecName "kube-api-access-d5h48". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.404071 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5h48\" (UniqueName: \"kubernetes.io/projected/13f2d9ac-679a-40ea-a013-bdeefbf34e72-kube-api-access-d5h48\") on node \"crc\" DevicePath \"\"" Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.404123 4715 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/13f2d9ac-679a-40ea-a013-bdeefbf34e72-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.404136 4715 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13f2d9ac-679a-40ea-a013-bdeefbf34e72-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.472822 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-wn48b"] Oct 09 08:02:40 crc kubenswrapper[4715]: E1009 08:02:40.473279 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c98c9b5-1e23-4f9c-927f-b9b33da85410" containerName="mariadb-account-create" Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.473310 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c98c9b5-1e23-4f9c-927f-b9b33da85410" containerName="mariadb-account-create" Oct 09 08:02:40 crc kubenswrapper[4715]: E1009 08:02:40.473329 4715 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="13f2d9ac-679a-40ea-a013-bdeefbf34e72" containerName="ovn-config" Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.473338 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="13f2d9ac-679a-40ea-a013-bdeefbf34e72" containerName="ovn-config" Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.473591 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="13f2d9ac-679a-40ea-a013-bdeefbf34e72" containerName="ovn-config" Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.473617 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c98c9b5-1e23-4f9c-927f-b9b33da85410" containerName="mariadb-account-create" Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.474289 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-wn48b" Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.476749 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-sgxzl" Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.479446 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.484866 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-wn48b"] Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.606786 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3ad22838-08c7-4400-b4b0-9cb6d6df6653-db-sync-config-data\") pod \"glance-db-sync-wn48b\" (UID: \"3ad22838-08c7-4400-b4b0-9cb6d6df6653\") " pod="openstack/glance-db-sync-wn48b" Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.606863 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3ad22838-08c7-4400-b4b0-9cb6d6df6653-config-data\") pod \"glance-db-sync-wn48b\" (UID: \"3ad22838-08c7-4400-b4b0-9cb6d6df6653\") " pod="openstack/glance-db-sync-wn48b" Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.607052 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ad22838-08c7-4400-b4b0-9cb6d6df6653-combined-ca-bundle\") pod \"glance-db-sync-wn48b\" (UID: \"3ad22838-08c7-4400-b4b0-9cb6d6df6653\") " pod="openstack/glance-db-sync-wn48b" Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.607089 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6grjv\" (UniqueName: \"kubernetes.io/projected/3ad22838-08c7-4400-b4b0-9cb6d6df6653-kube-api-access-6grjv\") pod \"glance-db-sync-wn48b\" (UID: \"3ad22838-08c7-4400-b4b0-9cb6d6df6653\") " pod="openstack/glance-db-sync-wn48b" Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.708572 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3ad22838-08c7-4400-b4b0-9cb6d6df6653-db-sync-config-data\") pod \"glance-db-sync-wn48b\" (UID: \"3ad22838-08c7-4400-b4b0-9cb6d6df6653\") " pod="openstack/glance-db-sync-wn48b" Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.708647 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ad22838-08c7-4400-b4b0-9cb6d6df6653-config-data\") pod \"glance-db-sync-wn48b\" (UID: \"3ad22838-08c7-4400-b4b0-9cb6d6df6653\") " pod="openstack/glance-db-sync-wn48b" Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.708688 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ad22838-08c7-4400-b4b0-9cb6d6df6653-combined-ca-bundle\") 
pod \"glance-db-sync-wn48b\" (UID: \"3ad22838-08c7-4400-b4b0-9cb6d6df6653\") " pod="openstack/glance-db-sync-wn48b" Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.708713 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6grjv\" (UniqueName: \"kubernetes.io/projected/3ad22838-08c7-4400-b4b0-9cb6d6df6653-kube-api-access-6grjv\") pod \"glance-db-sync-wn48b\" (UID: \"3ad22838-08c7-4400-b4b0-9cb6d6df6653\") " pod="openstack/glance-db-sync-wn48b" Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.712770 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ad22838-08c7-4400-b4b0-9cb6d6df6653-config-data\") pod \"glance-db-sync-wn48b\" (UID: \"3ad22838-08c7-4400-b4b0-9cb6d6df6653\") " pod="openstack/glance-db-sync-wn48b" Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.715660 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ad22838-08c7-4400-b4b0-9cb6d6df6653-combined-ca-bundle\") pod \"glance-db-sync-wn48b\" (UID: \"3ad22838-08c7-4400-b4b0-9cb6d6df6653\") " pod="openstack/glance-db-sync-wn48b" Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.732208 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3ad22838-08c7-4400-b4b0-9cb6d6df6653-db-sync-config-data\") pod \"glance-db-sync-wn48b\" (UID: \"3ad22838-08c7-4400-b4b0-9cb6d6df6653\") " pod="openstack/glance-db-sync-wn48b" Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.739375 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6grjv\" (UniqueName: \"kubernetes.io/projected/3ad22838-08c7-4400-b4b0-9cb6d6df6653-kube-api-access-6grjv\") pod \"glance-db-sync-wn48b\" (UID: \"3ad22838-08c7-4400-b4b0-9cb6d6df6653\") " pod="openstack/glance-db-sync-wn48b" Oct 09 08:02:40 crc 
kubenswrapper[4715]: I1009 08:02:40.792947 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-wn48b" Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.792960 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-xfr2w" Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.861583 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-xfr2w-config-p4stt" Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.861705 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-xfr2w-config-p4stt" event={"ID":"13f2d9ac-679a-40ea-a013-bdeefbf34e72","Type":"ContainerDied","Data":"186dc413fcbce5e5016b2490c8275a435cca97a255346aad8f31d463f5939862"} Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.861764 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="186dc413fcbce5e5016b2490c8275a435cca97a255346aad8f31d463f5939862" Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.863723 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-vgpch" event={"ID":"5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0","Type":"ContainerStarted","Data":"4be01246678b2d91fb4ffa87373d054e72bc804c1823d277247e4d3cd3b75ecb"} Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.864673 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77585f5f8c-vgpch" Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.890013 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77585f5f8c-vgpch" podStartSLOduration=2.889997229 podStartE2EDuration="2.889997229s" podCreationTimestamp="2025-10-09 08:02:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 
08:02:40.887538537 +0000 UTC m=+991.580342555" watchObservedRunningTime="2025-10-09 08:02:40.889997229 +0000 UTC m=+991.582801237" Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.957483 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-xfr2w-config-p4stt"] Oct 09 08:02:40 crc kubenswrapper[4715]: I1009 08:02:40.971949 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-xfr2w-config-p4stt"] Oct 09 08:02:41 crc kubenswrapper[4715]: I1009 08:02:41.447015 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-wn48b"] Oct 09 08:02:41 crc kubenswrapper[4715]: I1009 08:02:41.874998 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wn48b" event={"ID":"3ad22838-08c7-4400-b4b0-9cb6d6df6653","Type":"ContainerStarted","Data":"00d55c8246fb3be94d5afe8cc861473fb49c0fa2e704705191a941724fe6c2be"} Oct 09 08:02:42 crc kubenswrapper[4715]: I1009 08:02:42.147537 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13f2d9ac-679a-40ea-a013-bdeefbf34e72" path="/var/lib/kubelet/pods/13f2d9ac-679a-40ea-a013-bdeefbf34e72/volumes" Oct 09 08:02:46 crc kubenswrapper[4715]: I1009 08:02:46.688808 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 09 08:02:46 crc kubenswrapper[4715]: I1009 08:02:46.753750 4715 patch_prober.go:28] interesting pod/machine-config-daemon-k7vwx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 08:02:46 crc kubenswrapper[4715]: I1009 08:02:46.753811 4715 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 08:02:46 crc kubenswrapper[4715]: I1009 08:02:46.975956 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-nzm2b"] Oct 09 08:02:46 crc kubenswrapper[4715]: I1009 08:02:46.983457 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-nzm2b" Oct 09 08:02:46 crc kubenswrapper[4715]: I1009 08:02:46.991230 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-nzm2b"] Oct 09 08:02:47 crc kubenswrapper[4715]: I1009 08:02:47.014928 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j59jc\" (UniqueName: \"kubernetes.io/projected/aac330e9-1d1a-4fc8-acb9-03b85148eb00-kube-api-access-j59jc\") pod \"cinder-db-create-nzm2b\" (UID: \"aac330e9-1d1a-4fc8-acb9-03b85148eb00\") " pod="openstack/cinder-db-create-nzm2b" Oct 09 08:02:47 crc kubenswrapper[4715]: I1009 08:02:47.015240 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:02:47 crc kubenswrapper[4715]: I1009 08:02:47.116622 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j59jc\" (UniqueName: \"kubernetes.io/projected/aac330e9-1d1a-4fc8-acb9-03b85148eb00-kube-api-access-j59jc\") pod \"cinder-db-create-nzm2b\" (UID: \"aac330e9-1d1a-4fc8-acb9-03b85148eb00\") " pod="openstack/cinder-db-create-nzm2b" Oct 09 08:02:47 crc kubenswrapper[4715]: I1009 08:02:47.121447 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-p7kb6"] Oct 09 08:02:47 crc kubenswrapper[4715]: I1009 08:02:47.123136 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-p7kb6" Oct 09 08:02:47 crc kubenswrapper[4715]: I1009 08:02:47.158803 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-p7kb6"] Oct 09 08:02:47 crc kubenswrapper[4715]: I1009 08:02:47.176600 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j59jc\" (UniqueName: \"kubernetes.io/projected/aac330e9-1d1a-4fc8-acb9-03b85148eb00-kube-api-access-j59jc\") pod \"cinder-db-create-nzm2b\" (UID: \"aac330e9-1d1a-4fc8-acb9-03b85148eb00\") " pod="openstack/cinder-db-create-nzm2b" Oct 09 08:02:47 crc kubenswrapper[4715]: I1009 08:02:47.217749 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rnc4\" (UniqueName: \"kubernetes.io/projected/7930b9ef-e4b6-4cb6-a269-d10a9194abfa-kube-api-access-9rnc4\") pod \"barbican-db-create-p7kb6\" (UID: \"7930b9ef-e4b6-4cb6-a269-d10a9194abfa\") " pod="openstack/barbican-db-create-p7kb6" Oct 09 08:02:47 crc kubenswrapper[4715]: I1009 08:02:47.314843 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-nzm2b" Oct 09 08:02:47 crc kubenswrapper[4715]: I1009 08:02:47.322845 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rnc4\" (UniqueName: \"kubernetes.io/projected/7930b9ef-e4b6-4cb6-a269-d10a9194abfa-kube-api-access-9rnc4\") pod \"barbican-db-create-p7kb6\" (UID: \"7930b9ef-e4b6-4cb6-a269-d10a9194abfa\") " pod="openstack/barbican-db-create-p7kb6" Oct 09 08:02:47 crc kubenswrapper[4715]: I1009 08:02:47.404704 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rnc4\" (UniqueName: \"kubernetes.io/projected/7930b9ef-e4b6-4cb6-a269-d10a9194abfa-kube-api-access-9rnc4\") pod \"barbican-db-create-p7kb6\" (UID: \"7930b9ef-e4b6-4cb6-a269-d10a9194abfa\") " pod="openstack/barbican-db-create-p7kb6" Oct 09 08:02:47 crc kubenswrapper[4715]: I1009 08:02:47.448008 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-p7kb6" Oct 09 08:02:47 crc kubenswrapper[4715]: I1009 08:02:47.497788 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-nkdxc"] Oct 09 08:02:47 crc kubenswrapper[4715]: I1009 08:02:47.498980 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-nkdxc" Oct 09 08:02:47 crc kubenswrapper[4715]: I1009 08:02:47.503817 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-ht6l7" Oct 09 08:02:47 crc kubenswrapper[4715]: I1009 08:02:47.504025 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 09 08:02:47 crc kubenswrapper[4715]: I1009 08:02:47.504161 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 09 08:02:47 crc kubenswrapper[4715]: I1009 08:02:47.504302 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 09 08:02:47 crc kubenswrapper[4715]: I1009 08:02:47.515618 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-gjhcq"] Oct 09 08:02:47 crc kubenswrapper[4715]: I1009 08:02:47.516797 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-gjhcq" Oct 09 08:02:47 crc kubenswrapper[4715]: I1009 08:02:47.523823 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-nkdxc"] Oct 09 08:02:47 crc kubenswrapper[4715]: I1009 08:02:47.528690 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac15af92-271f-424c-bf54-e42f7771bf99-config-data\") pod \"keystone-db-sync-nkdxc\" (UID: \"ac15af92-271f-424c-bf54-e42f7771bf99\") " pod="openstack/keystone-db-sync-nkdxc" Oct 09 08:02:47 crc kubenswrapper[4715]: I1009 08:02:47.528728 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac15af92-271f-424c-bf54-e42f7771bf99-combined-ca-bundle\") pod \"keystone-db-sync-nkdxc\" (UID: \"ac15af92-271f-424c-bf54-e42f7771bf99\") " 
pod="openstack/keystone-db-sync-nkdxc" Oct 09 08:02:47 crc kubenswrapper[4715]: I1009 08:02:47.528761 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk6g9\" (UniqueName: \"kubernetes.io/projected/ac15af92-271f-424c-bf54-e42f7771bf99-kube-api-access-hk6g9\") pod \"keystone-db-sync-nkdxc\" (UID: \"ac15af92-271f-424c-bf54-e42f7771bf99\") " pod="openstack/keystone-db-sync-nkdxc" Oct 09 08:02:47 crc kubenswrapper[4715]: I1009 08:02:47.528780 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6cr2\" (UniqueName: \"kubernetes.io/projected/44824bfc-2e5a-435c-82f5-7a0b29dca4c3-kube-api-access-s6cr2\") pod \"neutron-db-create-gjhcq\" (UID: \"44824bfc-2e5a-435c-82f5-7a0b29dca4c3\") " pod="openstack/neutron-db-create-gjhcq" Oct 09 08:02:47 crc kubenswrapper[4715]: I1009 08:02:47.534563 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-gjhcq"] Oct 09 08:02:47 crc kubenswrapper[4715]: I1009 08:02:47.630441 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac15af92-271f-424c-bf54-e42f7771bf99-config-data\") pod \"keystone-db-sync-nkdxc\" (UID: \"ac15af92-271f-424c-bf54-e42f7771bf99\") " pod="openstack/keystone-db-sync-nkdxc" Oct 09 08:02:47 crc kubenswrapper[4715]: I1009 08:02:47.630503 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac15af92-271f-424c-bf54-e42f7771bf99-combined-ca-bundle\") pod \"keystone-db-sync-nkdxc\" (UID: \"ac15af92-271f-424c-bf54-e42f7771bf99\") " pod="openstack/keystone-db-sync-nkdxc" Oct 09 08:02:47 crc kubenswrapper[4715]: I1009 08:02:47.630533 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk6g9\" (UniqueName: 
\"kubernetes.io/projected/ac15af92-271f-424c-bf54-e42f7771bf99-kube-api-access-hk6g9\") pod \"keystone-db-sync-nkdxc\" (UID: \"ac15af92-271f-424c-bf54-e42f7771bf99\") " pod="openstack/keystone-db-sync-nkdxc" Oct 09 08:02:47 crc kubenswrapper[4715]: I1009 08:02:47.630550 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6cr2\" (UniqueName: \"kubernetes.io/projected/44824bfc-2e5a-435c-82f5-7a0b29dca4c3-kube-api-access-s6cr2\") pod \"neutron-db-create-gjhcq\" (UID: \"44824bfc-2e5a-435c-82f5-7a0b29dca4c3\") " pod="openstack/neutron-db-create-gjhcq" Oct 09 08:02:47 crc kubenswrapper[4715]: I1009 08:02:47.645615 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac15af92-271f-424c-bf54-e42f7771bf99-combined-ca-bundle\") pod \"keystone-db-sync-nkdxc\" (UID: \"ac15af92-271f-424c-bf54-e42f7771bf99\") " pod="openstack/keystone-db-sync-nkdxc" Oct 09 08:02:47 crc kubenswrapper[4715]: I1009 08:02:47.649191 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk6g9\" (UniqueName: \"kubernetes.io/projected/ac15af92-271f-424c-bf54-e42f7771bf99-kube-api-access-hk6g9\") pod \"keystone-db-sync-nkdxc\" (UID: \"ac15af92-271f-424c-bf54-e42f7771bf99\") " pod="openstack/keystone-db-sync-nkdxc" Oct 09 08:02:47 crc kubenswrapper[4715]: I1009 08:02:47.650092 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac15af92-271f-424c-bf54-e42f7771bf99-config-data\") pod \"keystone-db-sync-nkdxc\" (UID: \"ac15af92-271f-424c-bf54-e42f7771bf99\") " pod="openstack/keystone-db-sync-nkdxc" Oct 09 08:02:47 crc kubenswrapper[4715]: I1009 08:02:47.650218 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6cr2\" (UniqueName: \"kubernetes.io/projected/44824bfc-2e5a-435c-82f5-7a0b29dca4c3-kube-api-access-s6cr2\") pod 
\"neutron-db-create-gjhcq\" (UID: \"44824bfc-2e5a-435c-82f5-7a0b29dca4c3\") " pod="openstack/neutron-db-create-gjhcq" Oct 09 08:02:47 crc kubenswrapper[4715]: I1009 08:02:47.842083 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-nkdxc" Oct 09 08:02:47 crc kubenswrapper[4715]: I1009 08:02:47.853784 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-gjhcq" Oct 09 08:02:48 crc kubenswrapper[4715]: I1009 08:02:48.510631 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77585f5f8c-vgpch" Oct 09 08:02:48 crc kubenswrapper[4715]: I1009 08:02:48.581506 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-bwkng"] Oct 09 08:02:48 crc kubenswrapper[4715]: I1009 08:02:48.581724 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-bwkng" podUID="b91a705b-50c1-40cd-ad0f-3a58b1eca640" containerName="dnsmasq-dns" containerID="cri-o://2abf2419f3d47ad5856818abfd9ccf780124d7b0ea3b31ee9ef1857135a2ea0c" gracePeriod=10 Oct 09 08:02:48 crc kubenswrapper[4715]: I1009 08:02:48.938917 4715 generic.go:334] "Generic (PLEG): container finished" podID="b91a705b-50c1-40cd-ad0f-3a58b1eca640" containerID="2abf2419f3d47ad5856818abfd9ccf780124d7b0ea3b31ee9ef1857135a2ea0c" exitCode=0 Oct 09 08:02:48 crc kubenswrapper[4715]: I1009 08:02:48.938971 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-bwkng" event={"ID":"b91a705b-50c1-40cd-ad0f-3a58b1eca640","Type":"ContainerDied","Data":"2abf2419f3d47ad5856818abfd9ccf780124d7b0ea3b31ee9ef1857135a2ea0c"} Oct 09 08:02:52 crc kubenswrapper[4715]: I1009 08:02:52.304481 4715 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-bwkng" podUID="b91a705b-50c1-40cd-ad0f-3a58b1eca640" containerName="dnsmasq-dns" 
probeResult="failure" output="dial tcp 10.217.0.115:5353: connect: connection refused" Oct 09 08:02:53 crc kubenswrapper[4715]: I1009 08:02:53.800914 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-bwkng" Oct 09 08:02:53 crc kubenswrapper[4715]: I1009 08:02:53.938117 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cw6z6\" (UniqueName: \"kubernetes.io/projected/b91a705b-50c1-40cd-ad0f-3a58b1eca640-kube-api-access-cw6z6\") pod \"b91a705b-50c1-40cd-ad0f-3a58b1eca640\" (UID: \"b91a705b-50c1-40cd-ad0f-3a58b1eca640\") " Oct 09 08:02:53 crc kubenswrapper[4715]: I1009 08:02:53.938222 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b91a705b-50c1-40cd-ad0f-3a58b1eca640-dns-svc\") pod \"b91a705b-50c1-40cd-ad0f-3a58b1eca640\" (UID: \"b91a705b-50c1-40cd-ad0f-3a58b1eca640\") " Oct 09 08:02:53 crc kubenswrapper[4715]: I1009 08:02:53.938284 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b91a705b-50c1-40cd-ad0f-3a58b1eca640-ovsdbserver-nb\") pod \"b91a705b-50c1-40cd-ad0f-3a58b1eca640\" (UID: \"b91a705b-50c1-40cd-ad0f-3a58b1eca640\") " Oct 09 08:02:53 crc kubenswrapper[4715]: I1009 08:02:53.938321 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b91a705b-50c1-40cd-ad0f-3a58b1eca640-config\") pod \"b91a705b-50c1-40cd-ad0f-3a58b1eca640\" (UID: \"b91a705b-50c1-40cd-ad0f-3a58b1eca640\") " Oct 09 08:02:53 crc kubenswrapper[4715]: I1009 08:02:53.938896 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b91a705b-50c1-40cd-ad0f-3a58b1eca640-ovsdbserver-sb\") pod \"b91a705b-50c1-40cd-ad0f-3a58b1eca640\" (UID: 
\"b91a705b-50c1-40cd-ad0f-3a58b1eca640\") " Oct 09 08:02:53 crc kubenswrapper[4715]: I1009 08:02:53.943653 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b91a705b-50c1-40cd-ad0f-3a58b1eca640-kube-api-access-cw6z6" (OuterVolumeSpecName: "kube-api-access-cw6z6") pod "b91a705b-50c1-40cd-ad0f-3a58b1eca640" (UID: "b91a705b-50c1-40cd-ad0f-3a58b1eca640"). InnerVolumeSpecName "kube-api-access-cw6z6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:02:53 crc kubenswrapper[4715]: I1009 08:02:53.988604 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-bwkng" event={"ID":"b91a705b-50c1-40cd-ad0f-3a58b1eca640","Type":"ContainerDied","Data":"5e4f88ee75b68f391a05edf55811ed64e03f5dafc566ad040065853b328853fb"} Oct 09 08:02:53 crc kubenswrapper[4715]: I1009 08:02:53.988660 4715 scope.go:117] "RemoveContainer" containerID="2abf2419f3d47ad5856818abfd9ccf780124d7b0ea3b31ee9ef1857135a2ea0c" Oct 09 08:02:53 crc kubenswrapper[4715]: I1009 08:02:53.988694 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-bwkng" Oct 09 08:02:53 crc kubenswrapper[4715]: I1009 08:02:53.991016 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b91a705b-50c1-40cd-ad0f-3a58b1eca640-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b91a705b-50c1-40cd-ad0f-3a58b1eca640" (UID: "b91a705b-50c1-40cd-ad0f-3a58b1eca640"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:02:53 crc kubenswrapper[4715]: I1009 08:02:53.993325 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b91a705b-50c1-40cd-ad0f-3a58b1eca640-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b91a705b-50c1-40cd-ad0f-3a58b1eca640" (UID: "b91a705b-50c1-40cd-ad0f-3a58b1eca640"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:02:53 crc kubenswrapper[4715]: I1009 08:02:53.993954 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b91a705b-50c1-40cd-ad0f-3a58b1eca640-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b91a705b-50c1-40cd-ad0f-3a58b1eca640" (UID: "b91a705b-50c1-40cd-ad0f-3a58b1eca640"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:02:53 crc kubenswrapper[4715]: I1009 08:02:53.995221 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b91a705b-50c1-40cd-ad0f-3a58b1eca640-config" (OuterVolumeSpecName: "config") pod "b91a705b-50c1-40cd-ad0f-3a58b1eca640" (UID: "b91a705b-50c1-40cd-ad0f-3a58b1eca640"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:02:54 crc kubenswrapper[4715]: I1009 08:02:54.009870 4715 scope.go:117] "RemoveContainer" containerID="01236f81305c25c2f5019aa77a69eebb4ba31cc67f11852d60b4ab81260d1b89" Oct 09 08:02:54 crc kubenswrapper[4715]: I1009 08:02:54.040396 4715 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b91a705b-50c1-40cd-ad0f-3a58b1eca640-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 08:02:54 crc kubenswrapper[4715]: I1009 08:02:54.040443 4715 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b91a705b-50c1-40cd-ad0f-3a58b1eca640-config\") on node \"crc\" DevicePath \"\"" Oct 09 08:02:54 crc kubenswrapper[4715]: I1009 08:02:54.040457 4715 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b91a705b-50c1-40cd-ad0f-3a58b1eca640-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 08:02:54 crc kubenswrapper[4715]: I1009 08:02:54.040469 4715 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-cw6z6\" (UniqueName: \"kubernetes.io/projected/b91a705b-50c1-40cd-ad0f-3a58b1eca640-kube-api-access-cw6z6\") on node \"crc\" DevicePath \"\"" Oct 09 08:02:54 crc kubenswrapper[4715]: I1009 08:02:54.040482 4715 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b91a705b-50c1-40cd-ad0f-3a58b1eca640-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 08:02:54 crc kubenswrapper[4715]: I1009 08:02:54.068942 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-nkdxc"] Oct 09 08:02:54 crc kubenswrapper[4715]: I1009 08:02:54.079556 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-gjhcq"] Oct 09 08:02:54 crc kubenswrapper[4715]: W1009 08:02:54.086838 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44824bfc_2e5a_435c_82f5_7a0b29dca4c3.slice/crio-13b4133bf1e3f3f5d5ba18843626a38fb204375a86d276b95bed2d36ad4e99c5 WatchSource:0}: Error finding container 13b4133bf1e3f3f5d5ba18843626a38fb204375a86d276b95bed2d36ad4e99c5: Status 404 returned error can't find the container with id 13b4133bf1e3f3f5d5ba18843626a38fb204375a86d276b95bed2d36ad4e99c5 Oct 09 08:02:54 crc kubenswrapper[4715]: I1009 08:02:54.158384 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-p7kb6"] Oct 09 08:02:54 crc kubenswrapper[4715]: I1009 08:02:54.212599 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-nzm2b"] Oct 09 08:02:54 crc kubenswrapper[4715]: W1009 08:02:54.217037 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaac330e9_1d1a_4fc8_acb9_03b85148eb00.slice/crio-9076d519ae571d8522dbe38a5443be99e3fa30e604991b53d4788f7cd8596484 WatchSource:0}: Error finding container 
9076d519ae571d8522dbe38a5443be99e3fa30e604991b53d4788f7cd8596484: Status 404 returned error can't find the container with id 9076d519ae571d8522dbe38a5443be99e3fa30e604991b53d4788f7cd8596484 Oct 09 08:02:54 crc kubenswrapper[4715]: E1009 08:02:54.256326 4715 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb91a705b_50c1_40cd_ad0f_3a58b1eca640.slice\": RecentStats: unable to find data in memory cache]" Oct 09 08:02:54 crc kubenswrapper[4715]: I1009 08:02:54.319043 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-bwkng"] Oct 09 08:02:54 crc kubenswrapper[4715]: I1009 08:02:54.324683 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-bwkng"] Oct 09 08:02:55 crc kubenswrapper[4715]: I1009 08:02:55.014070 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wn48b" event={"ID":"3ad22838-08c7-4400-b4b0-9cb6d6df6653","Type":"ContainerStarted","Data":"94122c7ed6b6a46d7853a8e268fa84c9314242446de34cd0a7b378533ebf86f9"} Oct 09 08:02:55 crc kubenswrapper[4715]: I1009 08:02:55.016226 4715 generic.go:334] "Generic (PLEG): container finished" podID="aac330e9-1d1a-4fc8-acb9-03b85148eb00" containerID="5362293e53bcae54507d375fe4f884bf60e37eea4b24605f70f5c7963d927220" exitCode=0 Oct 09 08:02:55 crc kubenswrapper[4715]: I1009 08:02:55.016284 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-nzm2b" event={"ID":"aac330e9-1d1a-4fc8-acb9-03b85148eb00","Type":"ContainerDied","Data":"5362293e53bcae54507d375fe4f884bf60e37eea4b24605f70f5c7963d927220"} Oct 09 08:02:55 crc kubenswrapper[4715]: I1009 08:02:55.016304 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-nzm2b" 
event={"ID":"aac330e9-1d1a-4fc8-acb9-03b85148eb00","Type":"ContainerStarted","Data":"9076d519ae571d8522dbe38a5443be99e3fa30e604991b53d4788f7cd8596484"} Oct 09 08:02:55 crc kubenswrapper[4715]: I1009 08:02:55.017816 4715 generic.go:334] "Generic (PLEG): container finished" podID="7930b9ef-e4b6-4cb6-a269-d10a9194abfa" containerID="52d9f7a6990ca3073074b27997407797407e64523142ff946c7a0e14163f10e2" exitCode=0 Oct 09 08:02:55 crc kubenswrapper[4715]: I1009 08:02:55.017876 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-p7kb6" event={"ID":"7930b9ef-e4b6-4cb6-a269-d10a9194abfa","Type":"ContainerDied","Data":"52d9f7a6990ca3073074b27997407797407e64523142ff946c7a0e14163f10e2"} Oct 09 08:02:55 crc kubenswrapper[4715]: I1009 08:02:55.023265 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-p7kb6" event={"ID":"7930b9ef-e4b6-4cb6-a269-d10a9194abfa","Type":"ContainerStarted","Data":"3d1f5f0ab0fccdf82b159df8da175a755a948c4e1f1c455f97ad17f9cb0c17db"} Oct 09 08:02:55 crc kubenswrapper[4715]: I1009 08:02:55.026397 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-nkdxc" event={"ID":"ac15af92-271f-424c-bf54-e42f7771bf99","Type":"ContainerStarted","Data":"28a0fb1e232a0acc4a67a4ba1c9941cebeee0630893d5598ac99c5e84606fd52"} Oct 09 08:02:55 crc kubenswrapper[4715]: I1009 08:02:55.027878 4715 generic.go:334] "Generic (PLEG): container finished" podID="44824bfc-2e5a-435c-82f5-7a0b29dca4c3" containerID="e013b45b1aac53ddebccc894959f24e90b5344201a8a01b38545a2e92e4e90cc" exitCode=0 Oct 09 08:02:55 crc kubenswrapper[4715]: I1009 08:02:55.027910 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-gjhcq" event={"ID":"44824bfc-2e5a-435c-82f5-7a0b29dca4c3","Type":"ContainerDied","Data":"e013b45b1aac53ddebccc894959f24e90b5344201a8a01b38545a2e92e4e90cc"} Oct 09 08:02:55 crc kubenswrapper[4715]: I1009 08:02:55.027928 4715 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/neutron-db-create-gjhcq" event={"ID":"44824bfc-2e5a-435c-82f5-7a0b29dca4c3","Type":"ContainerStarted","Data":"13b4133bf1e3f3f5d5ba18843626a38fb204375a86d276b95bed2d36ad4e99c5"} Oct 09 08:02:55 crc kubenswrapper[4715]: I1009 08:02:55.041684 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-wn48b" podStartSLOduration=2.874530546 podStartE2EDuration="15.041665712s" podCreationTimestamp="2025-10-09 08:02:40 +0000 UTC" firstStartedPulling="2025-10-09 08:02:41.447552299 +0000 UTC m=+992.140356307" lastFinishedPulling="2025-10-09 08:02:53.614687465 +0000 UTC m=+1004.307491473" observedRunningTime="2025-10-09 08:02:55.034844863 +0000 UTC m=+1005.727648871" watchObservedRunningTime="2025-10-09 08:02:55.041665712 +0000 UTC m=+1005.734469720" Oct 09 08:02:56 crc kubenswrapper[4715]: I1009 08:02:56.147540 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b91a705b-50c1-40cd-ad0f-3a58b1eca640" path="/var/lib/kubelet/pods/b91a705b-50c1-40cd-ad0f-3a58b1eca640/volumes" Oct 09 08:02:58 crc kubenswrapper[4715]: I1009 08:02:58.628641 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-p7kb6" Oct 09 08:02:58 crc kubenswrapper[4715]: I1009 08:02:58.653026 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-nzm2b" Oct 09 08:02:58 crc kubenswrapper[4715]: I1009 08:02:58.672647 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-gjhcq" Oct 09 08:02:58 crc kubenswrapper[4715]: I1009 08:02:58.824815 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j59jc\" (UniqueName: \"kubernetes.io/projected/aac330e9-1d1a-4fc8-acb9-03b85148eb00-kube-api-access-j59jc\") pod \"aac330e9-1d1a-4fc8-acb9-03b85148eb00\" (UID: \"aac330e9-1d1a-4fc8-acb9-03b85148eb00\") " Oct 09 08:02:58 crc kubenswrapper[4715]: I1009 08:02:58.824953 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rnc4\" (UniqueName: \"kubernetes.io/projected/7930b9ef-e4b6-4cb6-a269-d10a9194abfa-kube-api-access-9rnc4\") pod \"7930b9ef-e4b6-4cb6-a269-d10a9194abfa\" (UID: \"7930b9ef-e4b6-4cb6-a269-d10a9194abfa\") " Oct 09 08:02:58 crc kubenswrapper[4715]: I1009 08:02:58.824989 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6cr2\" (UniqueName: \"kubernetes.io/projected/44824bfc-2e5a-435c-82f5-7a0b29dca4c3-kube-api-access-s6cr2\") pod \"44824bfc-2e5a-435c-82f5-7a0b29dca4c3\" (UID: \"44824bfc-2e5a-435c-82f5-7a0b29dca4c3\") " Oct 09 08:02:58 crc kubenswrapper[4715]: I1009 08:02:58.830981 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44824bfc-2e5a-435c-82f5-7a0b29dca4c3-kube-api-access-s6cr2" (OuterVolumeSpecName: "kube-api-access-s6cr2") pod "44824bfc-2e5a-435c-82f5-7a0b29dca4c3" (UID: "44824bfc-2e5a-435c-82f5-7a0b29dca4c3"). InnerVolumeSpecName "kube-api-access-s6cr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:02:58 crc kubenswrapper[4715]: I1009 08:02:58.831074 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aac330e9-1d1a-4fc8-acb9-03b85148eb00-kube-api-access-j59jc" (OuterVolumeSpecName: "kube-api-access-j59jc") pod "aac330e9-1d1a-4fc8-acb9-03b85148eb00" (UID: "aac330e9-1d1a-4fc8-acb9-03b85148eb00"). 
InnerVolumeSpecName "kube-api-access-j59jc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:02:58 crc kubenswrapper[4715]: I1009 08:02:58.831077 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7930b9ef-e4b6-4cb6-a269-d10a9194abfa-kube-api-access-9rnc4" (OuterVolumeSpecName: "kube-api-access-9rnc4") pod "7930b9ef-e4b6-4cb6-a269-d10a9194abfa" (UID: "7930b9ef-e4b6-4cb6-a269-d10a9194abfa"). InnerVolumeSpecName "kube-api-access-9rnc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:02:58 crc kubenswrapper[4715]: I1009 08:02:58.927333 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j59jc\" (UniqueName: \"kubernetes.io/projected/aac330e9-1d1a-4fc8-acb9-03b85148eb00-kube-api-access-j59jc\") on node \"crc\" DevicePath \"\"" Oct 09 08:02:58 crc kubenswrapper[4715]: I1009 08:02:58.927385 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rnc4\" (UniqueName: \"kubernetes.io/projected/7930b9ef-e4b6-4cb6-a269-d10a9194abfa-kube-api-access-9rnc4\") on node \"crc\" DevicePath \"\"" Oct 09 08:02:58 crc kubenswrapper[4715]: I1009 08:02:58.927410 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6cr2\" (UniqueName: \"kubernetes.io/projected/44824bfc-2e5a-435c-82f5-7a0b29dca4c3-kube-api-access-s6cr2\") on node \"crc\" DevicePath \"\"" Oct 09 08:02:59 crc kubenswrapper[4715]: I1009 08:02:59.064762 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-p7kb6" event={"ID":"7930b9ef-e4b6-4cb6-a269-d10a9194abfa","Type":"ContainerDied","Data":"3d1f5f0ab0fccdf82b159df8da175a755a948c4e1f1c455f97ad17f9cb0c17db"} Oct 09 08:02:59 crc kubenswrapper[4715]: I1009 08:02:59.064840 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d1f5f0ab0fccdf82b159df8da175a755a948c4e1f1c455f97ad17f9cb0c17db" Oct 09 08:02:59 crc kubenswrapper[4715]: I1009 08:02:59.064947 
4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-p7kb6" Oct 09 08:02:59 crc kubenswrapper[4715]: I1009 08:02:59.078359 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-nkdxc" event={"ID":"ac15af92-271f-424c-bf54-e42f7771bf99","Type":"ContainerStarted","Data":"6db4654dff7741d9067045ddca947ca3695837dda24e2659c14c16467c313775"} Oct 09 08:02:59 crc kubenswrapper[4715]: I1009 08:02:59.100913 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-gjhcq" event={"ID":"44824bfc-2e5a-435c-82f5-7a0b29dca4c3","Type":"ContainerDied","Data":"13b4133bf1e3f3f5d5ba18843626a38fb204375a86d276b95bed2d36ad4e99c5"} Oct 09 08:02:59 crc kubenswrapper[4715]: I1009 08:02:59.101137 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13b4133bf1e3f3f5d5ba18843626a38fb204375a86d276b95bed2d36ad4e99c5" Oct 09 08:02:59 crc kubenswrapper[4715]: I1009 08:02:59.101034 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-gjhcq" Oct 09 08:02:59 crc kubenswrapper[4715]: I1009 08:02:59.104816 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-nzm2b" event={"ID":"aac330e9-1d1a-4fc8-acb9-03b85148eb00","Type":"ContainerDied","Data":"9076d519ae571d8522dbe38a5443be99e3fa30e604991b53d4788f7cd8596484"} Oct 09 08:02:59 crc kubenswrapper[4715]: I1009 08:02:59.104876 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9076d519ae571d8522dbe38a5443be99e3fa30e604991b53d4788f7cd8596484" Oct 09 08:02:59 crc kubenswrapper[4715]: I1009 08:02:59.104974 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-nzm2b" Oct 09 08:02:59 crc kubenswrapper[4715]: I1009 08:02:59.115584 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-nkdxc" podStartSLOduration=7.701942534 podStartE2EDuration="12.115563324s" podCreationTimestamp="2025-10-09 08:02:47 +0000 UTC" firstStartedPulling="2025-10-09 08:02:54.081778407 +0000 UTC m=+1004.774582415" lastFinishedPulling="2025-10-09 08:02:58.495399197 +0000 UTC m=+1009.188203205" observedRunningTime="2025-10-09 08:02:59.110547398 +0000 UTC m=+1009.803351426" watchObservedRunningTime="2025-10-09 08:02:59.115563324 +0000 UTC m=+1009.808367332" Oct 09 08:03:03 crc kubenswrapper[4715]: I1009 08:03:03.137531 4715 generic.go:334] "Generic (PLEG): container finished" podID="3ad22838-08c7-4400-b4b0-9cb6d6df6653" containerID="94122c7ed6b6a46d7853a8e268fa84c9314242446de34cd0a7b378533ebf86f9" exitCode=0 Oct 09 08:03:03 crc kubenswrapper[4715]: I1009 08:03:03.137614 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wn48b" event={"ID":"3ad22838-08c7-4400-b4b0-9cb6d6df6653","Type":"ContainerDied","Data":"94122c7ed6b6a46d7853a8e268fa84c9314242446de34cd0a7b378533ebf86f9"} Oct 09 08:03:04 crc kubenswrapper[4715]: I1009 08:03:04.147073 4715 generic.go:334] "Generic (PLEG): container finished" podID="ac15af92-271f-424c-bf54-e42f7771bf99" containerID="6db4654dff7741d9067045ddca947ca3695837dda24e2659c14c16467c313775" exitCode=0 Oct 09 08:03:04 crc kubenswrapper[4715]: I1009 08:03:04.147092 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-nkdxc" event={"ID":"ac15af92-271f-424c-bf54-e42f7771bf99","Type":"ContainerDied","Data":"6db4654dff7741d9067045ddca947ca3695837dda24e2659c14c16467c313775"} Oct 09 08:03:04 crc kubenswrapper[4715]: I1009 08:03:04.510327 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-wn48b" Oct 09 08:03:04 crc kubenswrapper[4715]: I1009 08:03:04.618144 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6grjv\" (UniqueName: \"kubernetes.io/projected/3ad22838-08c7-4400-b4b0-9cb6d6df6653-kube-api-access-6grjv\") pod \"3ad22838-08c7-4400-b4b0-9cb6d6df6653\" (UID: \"3ad22838-08c7-4400-b4b0-9cb6d6df6653\") " Oct 09 08:03:04 crc kubenswrapper[4715]: I1009 08:03:04.618192 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ad22838-08c7-4400-b4b0-9cb6d6df6653-config-data\") pod \"3ad22838-08c7-4400-b4b0-9cb6d6df6653\" (UID: \"3ad22838-08c7-4400-b4b0-9cb6d6df6653\") " Oct 09 08:03:04 crc kubenswrapper[4715]: I1009 08:03:04.618251 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ad22838-08c7-4400-b4b0-9cb6d6df6653-combined-ca-bundle\") pod \"3ad22838-08c7-4400-b4b0-9cb6d6df6653\" (UID: \"3ad22838-08c7-4400-b4b0-9cb6d6df6653\") " Oct 09 08:03:04 crc kubenswrapper[4715]: I1009 08:03:04.618358 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3ad22838-08c7-4400-b4b0-9cb6d6df6653-db-sync-config-data\") pod \"3ad22838-08c7-4400-b4b0-9cb6d6df6653\" (UID: \"3ad22838-08c7-4400-b4b0-9cb6d6df6653\") " Oct 09 08:03:04 crc kubenswrapper[4715]: I1009 08:03:04.624463 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ad22838-08c7-4400-b4b0-9cb6d6df6653-kube-api-access-6grjv" (OuterVolumeSpecName: "kube-api-access-6grjv") pod "3ad22838-08c7-4400-b4b0-9cb6d6df6653" (UID: "3ad22838-08c7-4400-b4b0-9cb6d6df6653"). InnerVolumeSpecName "kube-api-access-6grjv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:03:04 crc kubenswrapper[4715]: I1009 08:03:04.624528 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ad22838-08c7-4400-b4b0-9cb6d6df6653-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3ad22838-08c7-4400-b4b0-9cb6d6df6653" (UID: "3ad22838-08c7-4400-b4b0-9cb6d6df6653"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:03:04 crc kubenswrapper[4715]: I1009 08:03:04.646978 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ad22838-08c7-4400-b4b0-9cb6d6df6653-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ad22838-08c7-4400-b4b0-9cb6d6df6653" (UID: "3ad22838-08c7-4400-b4b0-9cb6d6df6653"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:03:04 crc kubenswrapper[4715]: I1009 08:03:04.663944 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ad22838-08c7-4400-b4b0-9cb6d6df6653-config-data" (OuterVolumeSpecName: "config-data") pod "3ad22838-08c7-4400-b4b0-9cb6d6df6653" (UID: "3ad22838-08c7-4400-b4b0-9cb6d6df6653"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:03:04 crc kubenswrapper[4715]: I1009 08:03:04.719625 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6grjv\" (UniqueName: \"kubernetes.io/projected/3ad22838-08c7-4400-b4b0-9cb6d6df6653-kube-api-access-6grjv\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:04 crc kubenswrapper[4715]: I1009 08:03:04.719657 4715 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ad22838-08c7-4400-b4b0-9cb6d6df6653-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:04 crc kubenswrapper[4715]: I1009 08:03:04.719667 4715 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ad22838-08c7-4400-b4b0-9cb6d6df6653-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:04 crc kubenswrapper[4715]: I1009 08:03:04.719674 4715 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3ad22838-08c7-4400-b4b0-9cb6d6df6653-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.157129 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wn48b" event={"ID":"3ad22838-08c7-4400-b4b0-9cb6d6df6653","Type":"ContainerDied","Data":"00d55c8246fb3be94d5afe8cc861473fb49c0fa2e704705191a941724fe6c2be"} Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.157469 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00d55c8246fb3be94d5afe8cc861473fb49c0fa2e704705191a941724fe6c2be" Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.157229 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-wn48b" Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.422673 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-nkdxc" Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.513844 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-hj7hk"] Oct 09 08:03:05 crc kubenswrapper[4715]: E1009 08:03:05.514194 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ad22838-08c7-4400-b4b0-9cb6d6df6653" containerName="glance-db-sync" Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.514206 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ad22838-08c7-4400-b4b0-9cb6d6df6653" containerName="glance-db-sync" Oct 09 08:03:05 crc kubenswrapper[4715]: E1009 08:03:05.514227 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44824bfc-2e5a-435c-82f5-7a0b29dca4c3" containerName="mariadb-database-create" Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.514235 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="44824bfc-2e5a-435c-82f5-7a0b29dca4c3" containerName="mariadb-database-create" Oct 09 08:03:05 crc kubenswrapper[4715]: E1009 08:03:05.514256 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b91a705b-50c1-40cd-ad0f-3a58b1eca640" containerName="dnsmasq-dns" Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.515392 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="b91a705b-50c1-40cd-ad0f-3a58b1eca640" containerName="dnsmasq-dns" Oct 09 08:03:05 crc kubenswrapper[4715]: E1009 08:03:05.515433 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7930b9ef-e4b6-4cb6-a269-d10a9194abfa" containerName="mariadb-database-create" Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.515442 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="7930b9ef-e4b6-4cb6-a269-d10a9194abfa" containerName="mariadb-database-create" Oct 09 08:03:05 crc kubenswrapper[4715]: E1009 08:03:05.515460 4715 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="aac330e9-1d1a-4fc8-acb9-03b85148eb00" containerName="mariadb-database-create" Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.515468 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="aac330e9-1d1a-4fc8-acb9-03b85148eb00" containerName="mariadb-database-create" Oct 09 08:03:05 crc kubenswrapper[4715]: E1009 08:03:05.515477 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b91a705b-50c1-40cd-ad0f-3a58b1eca640" containerName="init" Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.515485 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="b91a705b-50c1-40cd-ad0f-3a58b1eca640" containerName="init" Oct 09 08:03:05 crc kubenswrapper[4715]: E1009 08:03:05.515504 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac15af92-271f-424c-bf54-e42f7771bf99" containerName="keystone-db-sync" Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.515510 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac15af92-271f-424c-bf54-e42f7771bf99" containerName="keystone-db-sync" Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.515697 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="b91a705b-50c1-40cd-ad0f-3a58b1eca640" containerName="dnsmasq-dns" Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.515713 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="aac330e9-1d1a-4fc8-acb9-03b85148eb00" containerName="mariadb-database-create" Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.515725 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ad22838-08c7-4400-b4b0-9cb6d6df6653" containerName="glance-db-sync" Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.515736 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac15af92-271f-424c-bf54-e42f7771bf99" containerName="keystone-db-sync" Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.515750 4715 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="44824bfc-2e5a-435c-82f5-7a0b29dca4c3" containerName="mariadb-database-create" Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.515761 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="7930b9ef-e4b6-4cb6-a269-d10a9194abfa" containerName="mariadb-database-create" Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.516749 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-hj7hk" Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.525702 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-hj7hk"] Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.530239 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hk6g9\" (UniqueName: \"kubernetes.io/projected/ac15af92-271f-424c-bf54-e42f7771bf99-kube-api-access-hk6g9\") pod \"ac15af92-271f-424c-bf54-e42f7771bf99\" (UID: \"ac15af92-271f-424c-bf54-e42f7771bf99\") " Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.530470 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac15af92-271f-424c-bf54-e42f7771bf99-combined-ca-bundle\") pod \"ac15af92-271f-424c-bf54-e42f7771bf99\" (UID: \"ac15af92-271f-424c-bf54-e42f7771bf99\") " Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.530533 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac15af92-271f-424c-bf54-e42f7771bf99-config-data\") pod \"ac15af92-271f-424c-bf54-e42f7771bf99\" (UID: \"ac15af92-271f-424c-bf54-e42f7771bf99\") " Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.538715 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac15af92-271f-424c-bf54-e42f7771bf99-kube-api-access-hk6g9" (OuterVolumeSpecName: "kube-api-access-hk6g9") pod 
"ac15af92-271f-424c-bf54-e42f7771bf99" (UID: "ac15af92-271f-424c-bf54-e42f7771bf99"). InnerVolumeSpecName "kube-api-access-hk6g9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.557553 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac15af92-271f-424c-bf54-e42f7771bf99-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ac15af92-271f-424c-bf54-e42f7771bf99" (UID: "ac15af92-271f-424c-bf54-e42f7771bf99"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.599020 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac15af92-271f-424c-bf54-e42f7771bf99-config-data" (OuterVolumeSpecName: "config-data") pod "ac15af92-271f-424c-bf54-e42f7771bf99" (UID: "ac15af92-271f-424c-bf54-e42f7771bf99"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.632576 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95873bd3-7c5a-468c-8477-380a7ae6717f-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-hj7hk\" (UID: \"95873bd3-7c5a-468c-8477-380a7ae6717f\") " pod="openstack/dnsmasq-dns-7ff5475cc9-hj7hk" Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.632619 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/95873bd3-7c5a-468c-8477-380a7ae6717f-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-hj7hk\" (UID: \"95873bd3-7c5a-468c-8477-380a7ae6717f\") " pod="openstack/dnsmasq-dns-7ff5475cc9-hj7hk" Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.632707 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95873bd3-7c5a-468c-8477-380a7ae6717f-config\") pod \"dnsmasq-dns-7ff5475cc9-hj7hk\" (UID: \"95873bd3-7c5a-468c-8477-380a7ae6717f\") " pod="openstack/dnsmasq-dns-7ff5475cc9-hj7hk" Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.632755 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95873bd3-7c5a-468c-8477-380a7ae6717f-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-hj7hk\" (UID: \"95873bd3-7c5a-468c-8477-380a7ae6717f\") " pod="openstack/dnsmasq-dns-7ff5475cc9-hj7hk" Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.632770 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkjwq\" (UniqueName: \"kubernetes.io/projected/95873bd3-7c5a-468c-8477-380a7ae6717f-kube-api-access-hkjwq\") pod 
\"dnsmasq-dns-7ff5475cc9-hj7hk\" (UID: \"95873bd3-7c5a-468c-8477-380a7ae6717f\") " pod="openstack/dnsmasq-dns-7ff5475cc9-hj7hk" Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.632786 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95873bd3-7c5a-468c-8477-380a7ae6717f-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-hj7hk\" (UID: \"95873bd3-7c5a-468c-8477-380a7ae6717f\") " pod="openstack/dnsmasq-dns-7ff5475cc9-hj7hk" Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.632867 4715 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac15af92-271f-424c-bf54-e42f7771bf99-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.632879 4715 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac15af92-271f-424c-bf54-e42f7771bf99-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.632888 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hk6g9\" (UniqueName: \"kubernetes.io/projected/ac15af92-271f-424c-bf54-e42f7771bf99-kube-api-access-hk6g9\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.734358 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95873bd3-7c5a-468c-8477-380a7ae6717f-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-hj7hk\" (UID: \"95873bd3-7c5a-468c-8477-380a7ae6717f\") " pod="openstack/dnsmasq-dns-7ff5475cc9-hj7hk" Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.734739 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkjwq\" (UniqueName: \"kubernetes.io/projected/95873bd3-7c5a-468c-8477-380a7ae6717f-kube-api-access-hkjwq\") pod 
\"dnsmasq-dns-7ff5475cc9-hj7hk\" (UID: \"95873bd3-7c5a-468c-8477-380a7ae6717f\") " pod="openstack/dnsmasq-dns-7ff5475cc9-hj7hk" Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.734867 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95873bd3-7c5a-468c-8477-380a7ae6717f-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-hj7hk\" (UID: \"95873bd3-7c5a-468c-8477-380a7ae6717f\") " pod="openstack/dnsmasq-dns-7ff5475cc9-hj7hk" Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.735026 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95873bd3-7c5a-468c-8477-380a7ae6717f-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-hj7hk\" (UID: \"95873bd3-7c5a-468c-8477-380a7ae6717f\") " pod="openstack/dnsmasq-dns-7ff5475cc9-hj7hk" Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.735122 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/95873bd3-7c5a-468c-8477-380a7ae6717f-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-hj7hk\" (UID: \"95873bd3-7c5a-468c-8477-380a7ae6717f\") " pod="openstack/dnsmasq-dns-7ff5475cc9-hj7hk" Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.735265 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95873bd3-7c5a-468c-8477-380a7ae6717f-config\") pod \"dnsmasq-dns-7ff5475cc9-hj7hk\" (UID: \"95873bd3-7c5a-468c-8477-380a7ae6717f\") " pod="openstack/dnsmasq-dns-7ff5475cc9-hj7hk" Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.735323 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95873bd3-7c5a-468c-8477-380a7ae6717f-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-hj7hk\" (UID: \"95873bd3-7c5a-468c-8477-380a7ae6717f\") " 
pod="openstack/dnsmasq-dns-7ff5475cc9-hj7hk" Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.736202 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95873bd3-7c5a-468c-8477-380a7ae6717f-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-hj7hk\" (UID: \"95873bd3-7c5a-468c-8477-380a7ae6717f\") " pod="openstack/dnsmasq-dns-7ff5475cc9-hj7hk" Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.736316 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95873bd3-7c5a-468c-8477-380a7ae6717f-config\") pod \"dnsmasq-dns-7ff5475cc9-hj7hk\" (UID: \"95873bd3-7c5a-468c-8477-380a7ae6717f\") " pod="openstack/dnsmasq-dns-7ff5475cc9-hj7hk" Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.736562 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95873bd3-7c5a-468c-8477-380a7ae6717f-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-hj7hk\" (UID: \"95873bd3-7c5a-468c-8477-380a7ae6717f\") " pod="openstack/dnsmasq-dns-7ff5475cc9-hj7hk" Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.736703 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/95873bd3-7c5a-468c-8477-380a7ae6717f-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-hj7hk\" (UID: \"95873bd3-7c5a-468c-8477-380a7ae6717f\") " pod="openstack/dnsmasq-dns-7ff5475cc9-hj7hk" Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 08:03:05.753133 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkjwq\" (UniqueName: \"kubernetes.io/projected/95873bd3-7c5a-468c-8477-380a7ae6717f-kube-api-access-hkjwq\") pod \"dnsmasq-dns-7ff5475cc9-hj7hk\" (UID: \"95873bd3-7c5a-468c-8477-380a7ae6717f\") " pod="openstack/dnsmasq-dns-7ff5475cc9-hj7hk" Oct 09 08:03:05 crc kubenswrapper[4715]: I1009 
08:03:05.963371 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-hj7hk" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.169040 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-nkdxc" event={"ID":"ac15af92-271f-424c-bf54-e42f7771bf99","Type":"ContainerDied","Data":"28a0fb1e232a0acc4a67a4ba1c9941cebeee0630893d5598ac99c5e84606fd52"} Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.169074 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28a0fb1e232a0acc4a67a4ba1c9941cebeee0630893d5598ac99c5e84606fd52" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.169083 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-nkdxc" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.206264 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-hj7hk"] Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.420644 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-hj7hk"] Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.466076 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-6dccg"] Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.468693 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-6dccg" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.479147 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-6dccg"] Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.486291 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-gggmh"] Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.488080 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-gggmh" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.492365 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.496076 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-ht6l7" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.496439 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.496577 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.499194 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-gggmh"] Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.615169 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-65d86fc4b5-cvwlg"] Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.622229 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-65d86fc4b5-cvwlg" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.625696 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.626015 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.626147 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.626280 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-mwhm8" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.637032 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-65d86fc4b5-cvwlg"] Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.667827 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a0480a5-826d-4e8e-95dd-856c2b1b874a-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-6dccg\" (UID: \"9a0480a5-826d-4e8e-95dd-856c2b1b874a\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-6dccg" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.667881 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2db20957-2340-4a65-ac69-ac1842a5a37a-credential-keys\") pod \"keystone-bootstrap-gggmh\" (UID: \"2db20957-2340-4a65-ac69-ac1842a5a37a\") " pod="openstack/keystone-bootstrap-gggmh" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.667910 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2da43a5b-210e-4eeb-bba3-8ee808fbac44-scripts\") pod \"horizon-65d86fc4b5-cvwlg\" 
(UID: \"2da43a5b-210e-4eeb-bba3-8ee808fbac44\") " pod="openstack/horizon-65d86fc4b5-cvwlg" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.667954 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2da43a5b-210e-4eeb-bba3-8ee808fbac44-config-data\") pod \"horizon-65d86fc4b5-cvwlg\" (UID: \"2da43a5b-210e-4eeb-bba3-8ee808fbac44\") " pod="openstack/horizon-65d86fc4b5-cvwlg" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.667971 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a0480a5-826d-4e8e-95dd-856c2b1b874a-config\") pod \"dnsmasq-dns-5c5cc7c5ff-6dccg\" (UID: \"9a0480a5-826d-4e8e-95dd-856c2b1b874a\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-6dccg" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.667996 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2da43a5b-210e-4eeb-bba3-8ee808fbac44-logs\") pod \"horizon-65d86fc4b5-cvwlg\" (UID: \"2da43a5b-210e-4eeb-bba3-8ee808fbac44\") " pod="openstack/horizon-65d86fc4b5-cvwlg" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.668031 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2db20957-2340-4a65-ac69-ac1842a5a37a-config-data\") pod \"keystone-bootstrap-gggmh\" (UID: \"2db20957-2340-4a65-ac69-ac1842a5a37a\") " pod="openstack/keystone-bootstrap-gggmh" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.668064 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2db20957-2340-4a65-ac69-ac1842a5a37a-combined-ca-bundle\") pod \"keystone-bootstrap-gggmh\" (UID: 
\"2db20957-2340-4a65-ac69-ac1842a5a37a\") " pod="openstack/keystone-bootstrap-gggmh" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.668089 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2da43a5b-210e-4eeb-bba3-8ee808fbac44-horizon-secret-key\") pod \"horizon-65d86fc4b5-cvwlg\" (UID: \"2da43a5b-210e-4eeb-bba3-8ee808fbac44\") " pod="openstack/horizon-65d86fc4b5-cvwlg" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.668134 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a0480a5-826d-4e8e-95dd-856c2b1b874a-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-6dccg\" (UID: \"9a0480a5-826d-4e8e-95dd-856c2b1b874a\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-6dccg" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.668154 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a0480a5-826d-4e8e-95dd-856c2b1b874a-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-6dccg\" (UID: \"9a0480a5-826d-4e8e-95dd-856c2b1b874a\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-6dccg" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.668181 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrjb4\" (UniqueName: \"kubernetes.io/projected/9a0480a5-826d-4e8e-95dd-856c2b1b874a-kube-api-access-wrjb4\") pod \"dnsmasq-dns-5c5cc7c5ff-6dccg\" (UID: \"9a0480a5-826d-4e8e-95dd-856c2b1b874a\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-6dccg" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.668202 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a0480a5-826d-4e8e-95dd-856c2b1b874a-ovsdbserver-sb\") pod 
\"dnsmasq-dns-5c5cc7c5ff-6dccg\" (UID: \"9a0480a5-826d-4e8e-95dd-856c2b1b874a\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-6dccg" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.668225 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lf7g\" (UniqueName: \"kubernetes.io/projected/2da43a5b-210e-4eeb-bba3-8ee808fbac44-kube-api-access-6lf7g\") pod \"horizon-65d86fc4b5-cvwlg\" (UID: \"2da43a5b-210e-4eeb-bba3-8ee808fbac44\") " pod="openstack/horizon-65d86fc4b5-cvwlg" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.668255 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2db20957-2340-4a65-ac69-ac1842a5a37a-scripts\") pod \"keystone-bootstrap-gggmh\" (UID: \"2db20957-2340-4a65-ac69-ac1842a5a37a\") " pod="openstack/keystone-bootstrap-gggmh" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.668277 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpht5\" (UniqueName: \"kubernetes.io/projected/2db20957-2340-4a65-ac69-ac1842a5a37a-kube-api-access-jpht5\") pod \"keystone-bootstrap-gggmh\" (UID: \"2db20957-2340-4a65-ac69-ac1842a5a37a\") " pod="openstack/keystone-bootstrap-gggmh" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.668335 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2db20957-2340-4a65-ac69-ac1842a5a37a-fernet-keys\") pod \"keystone-bootstrap-gggmh\" (UID: \"2db20957-2340-4a65-ac69-ac1842a5a37a\") " pod="openstack/keystone-bootstrap-gggmh" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.705685 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.709931 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.712588 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.715110 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.727538 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.775075 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2da43a5b-210e-4eeb-bba3-8ee808fbac44-config-data\") pod \"horizon-65d86fc4b5-cvwlg\" (UID: \"2da43a5b-210e-4eeb-bba3-8ee808fbac44\") " pod="openstack/horizon-65d86fc4b5-cvwlg" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.775128 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a0480a5-826d-4e8e-95dd-856c2b1b874a-config\") pod \"dnsmasq-dns-5c5cc7c5ff-6dccg\" (UID: \"9a0480a5-826d-4e8e-95dd-856c2b1b874a\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-6dccg" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.775166 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2da43a5b-210e-4eeb-bba3-8ee808fbac44-logs\") pod \"horizon-65d86fc4b5-cvwlg\" (UID: \"2da43a5b-210e-4eeb-bba3-8ee808fbac44\") " pod="openstack/horizon-65d86fc4b5-cvwlg" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.775208 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2db20957-2340-4a65-ac69-ac1842a5a37a-config-data\") pod \"keystone-bootstrap-gggmh\" (UID: \"2db20957-2340-4a65-ac69-ac1842a5a37a\") " 
pod="openstack/keystone-bootstrap-gggmh" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.775243 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2db20957-2340-4a65-ac69-ac1842a5a37a-combined-ca-bundle\") pod \"keystone-bootstrap-gggmh\" (UID: \"2db20957-2340-4a65-ac69-ac1842a5a37a\") " pod="openstack/keystone-bootstrap-gggmh" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.775263 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2da43a5b-210e-4eeb-bba3-8ee808fbac44-horizon-secret-key\") pod \"horizon-65d86fc4b5-cvwlg\" (UID: \"2da43a5b-210e-4eeb-bba3-8ee808fbac44\") " pod="openstack/horizon-65d86fc4b5-cvwlg" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.775316 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a0480a5-826d-4e8e-95dd-856c2b1b874a-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-6dccg\" (UID: \"9a0480a5-826d-4e8e-95dd-856c2b1b874a\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-6dccg" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.775344 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a0480a5-826d-4e8e-95dd-856c2b1b874a-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-6dccg\" (UID: \"9a0480a5-826d-4e8e-95dd-856c2b1b874a\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-6dccg" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.775375 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrjb4\" (UniqueName: \"kubernetes.io/projected/9a0480a5-826d-4e8e-95dd-856c2b1b874a-kube-api-access-wrjb4\") pod \"dnsmasq-dns-5c5cc7c5ff-6dccg\" (UID: \"9a0480a5-826d-4e8e-95dd-856c2b1b874a\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-6dccg" Oct 09 08:03:06 crc 
kubenswrapper[4715]: I1009 08:03:06.775397 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a0480a5-826d-4e8e-95dd-856c2b1b874a-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-6dccg\" (UID: \"9a0480a5-826d-4e8e-95dd-856c2b1b874a\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-6dccg" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.775435 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lf7g\" (UniqueName: \"kubernetes.io/projected/2da43a5b-210e-4eeb-bba3-8ee808fbac44-kube-api-access-6lf7g\") pod \"horizon-65d86fc4b5-cvwlg\" (UID: \"2da43a5b-210e-4eeb-bba3-8ee808fbac44\") " pod="openstack/horizon-65d86fc4b5-cvwlg" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.775464 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2db20957-2340-4a65-ac69-ac1842a5a37a-scripts\") pod \"keystone-bootstrap-gggmh\" (UID: \"2db20957-2340-4a65-ac69-ac1842a5a37a\") " pod="openstack/keystone-bootstrap-gggmh" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.775482 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpht5\" (UniqueName: \"kubernetes.io/projected/2db20957-2340-4a65-ac69-ac1842a5a37a-kube-api-access-jpht5\") pod \"keystone-bootstrap-gggmh\" (UID: \"2db20957-2340-4a65-ac69-ac1842a5a37a\") " pod="openstack/keystone-bootstrap-gggmh" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.775503 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2db20957-2340-4a65-ac69-ac1842a5a37a-fernet-keys\") pod \"keystone-bootstrap-gggmh\" (UID: \"2db20957-2340-4a65-ac69-ac1842a5a37a\") " pod="openstack/keystone-bootstrap-gggmh" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.775536 4715 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a0480a5-826d-4e8e-95dd-856c2b1b874a-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-6dccg\" (UID: \"9a0480a5-826d-4e8e-95dd-856c2b1b874a\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-6dccg" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.775559 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2db20957-2340-4a65-ac69-ac1842a5a37a-credential-keys\") pod \"keystone-bootstrap-gggmh\" (UID: \"2db20957-2340-4a65-ac69-ac1842a5a37a\") " pod="openstack/keystone-bootstrap-gggmh" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.775582 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2da43a5b-210e-4eeb-bba3-8ee808fbac44-scripts\") pod \"horizon-65d86fc4b5-cvwlg\" (UID: \"2da43a5b-210e-4eeb-bba3-8ee808fbac44\") " pod="openstack/horizon-65d86fc4b5-cvwlg" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.776905 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2da43a5b-210e-4eeb-bba3-8ee808fbac44-scripts\") pod \"horizon-65d86fc4b5-cvwlg\" (UID: \"2da43a5b-210e-4eeb-bba3-8ee808fbac44\") " pod="openstack/horizon-65d86fc4b5-cvwlg" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.777320 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a0480a5-826d-4e8e-95dd-856c2b1b874a-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-6dccg\" (UID: \"9a0480a5-826d-4e8e-95dd-856c2b1b874a\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-6dccg" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.777945 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/9a0480a5-826d-4e8e-95dd-856c2b1b874a-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-6dccg\" (UID: \"9a0480a5-826d-4e8e-95dd-856c2b1b874a\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-6dccg" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.778515 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2da43a5b-210e-4eeb-bba3-8ee808fbac44-config-data\") pod \"horizon-65d86fc4b5-cvwlg\" (UID: \"2da43a5b-210e-4eeb-bba3-8ee808fbac44\") " pod="openstack/horizon-65d86fc4b5-cvwlg" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.779131 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a0480a5-826d-4e8e-95dd-856c2b1b874a-config\") pod \"dnsmasq-dns-5c5cc7c5ff-6dccg\" (UID: \"9a0480a5-826d-4e8e-95dd-856c2b1b874a\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-6dccg" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.779362 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2da43a5b-210e-4eeb-bba3-8ee808fbac44-logs\") pod \"horizon-65d86fc4b5-cvwlg\" (UID: \"2da43a5b-210e-4eeb-bba3-8ee808fbac44\") " pod="openstack/horizon-65d86fc4b5-cvwlg" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.780231 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a0480a5-826d-4e8e-95dd-856c2b1b874a-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-6dccg\" (UID: \"9a0480a5-826d-4e8e-95dd-856c2b1b874a\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-6dccg" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.781316 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a0480a5-826d-4e8e-95dd-856c2b1b874a-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-6dccg\" (UID: 
\"9a0480a5-826d-4e8e-95dd-856c2b1b874a\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-6dccg" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.784056 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2db20957-2340-4a65-ac69-ac1842a5a37a-scripts\") pod \"keystone-bootstrap-gggmh\" (UID: \"2db20957-2340-4a65-ac69-ac1842a5a37a\") " pod="openstack/keystone-bootstrap-gggmh" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.785176 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2da43a5b-210e-4eeb-bba3-8ee808fbac44-horizon-secret-key\") pod \"horizon-65d86fc4b5-cvwlg\" (UID: \"2da43a5b-210e-4eeb-bba3-8ee808fbac44\") " pod="openstack/horizon-65d86fc4b5-cvwlg" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.785287 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2db20957-2340-4a65-ac69-ac1842a5a37a-combined-ca-bundle\") pod \"keystone-bootstrap-gggmh\" (UID: \"2db20957-2340-4a65-ac69-ac1842a5a37a\") " pod="openstack/keystone-bootstrap-gggmh" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.791533 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2db20957-2340-4a65-ac69-ac1842a5a37a-credential-keys\") pod \"keystone-bootstrap-gggmh\" (UID: \"2db20957-2340-4a65-ac69-ac1842a5a37a\") " pod="openstack/keystone-bootstrap-gggmh" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.796565 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2db20957-2340-4a65-ac69-ac1842a5a37a-fernet-keys\") pod \"keystone-bootstrap-gggmh\" (UID: \"2db20957-2340-4a65-ac69-ac1842a5a37a\") " pod="openstack/keystone-bootstrap-gggmh" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.797100 
4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2db20957-2340-4a65-ac69-ac1842a5a37a-config-data\") pod \"keystone-bootstrap-gggmh\" (UID: \"2db20957-2340-4a65-ac69-ac1842a5a37a\") " pod="openstack/keystone-bootstrap-gggmh" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.804272 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lf7g\" (UniqueName: \"kubernetes.io/projected/2da43a5b-210e-4eeb-bba3-8ee808fbac44-kube-api-access-6lf7g\") pod \"horizon-65d86fc4b5-cvwlg\" (UID: \"2da43a5b-210e-4eeb-bba3-8ee808fbac44\") " pod="openstack/horizon-65d86fc4b5-cvwlg" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.809604 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrjb4\" (UniqueName: \"kubernetes.io/projected/9a0480a5-826d-4e8e-95dd-856c2b1b874a-kube-api-access-wrjb4\") pod \"dnsmasq-dns-5c5cc7c5ff-6dccg\" (UID: \"9a0480a5-826d-4e8e-95dd-856c2b1b874a\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-6dccg" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.813039 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpht5\" (UniqueName: \"kubernetes.io/projected/2db20957-2340-4a65-ac69-ac1842a5a37a-kube-api-access-jpht5\") pod \"keystone-bootstrap-gggmh\" (UID: \"2db20957-2340-4a65-ac69-ac1842a5a37a\") " pod="openstack/keystone-bootstrap-gggmh" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.835780 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-6dccg" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.845770 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6bf4d6b49-qs2mn"] Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.857801 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-gggmh" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.858063 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6bf4d6b49-qs2mn" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.883062 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6bf4d6b49-qs2mn"] Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.885334 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad\") " pod="openstack/ceilometer-0" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.885364 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ea235d88-0fcd-44e4-ae70-59936a38c6a8-config-data\") pod \"horizon-6bf4d6b49-qs2mn\" (UID: \"ea235d88-0fcd-44e4-ae70-59936a38c6a8\") " pod="openstack/horizon-6bf4d6b49-qs2mn" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.885383 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ea235d88-0fcd-44e4-ae70-59936a38c6a8-horizon-secret-key\") pod \"horizon-6bf4d6b49-qs2mn\" (UID: \"ea235d88-0fcd-44e4-ae70-59936a38c6a8\") " pod="openstack/horizon-6bf4d6b49-qs2mn" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.885404 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea235d88-0fcd-44e4-ae70-59936a38c6a8-logs\") pod \"horizon-6bf4d6b49-qs2mn\" (UID: \"ea235d88-0fcd-44e4-ae70-59936a38c6a8\") " pod="openstack/horizon-6bf4d6b49-qs2mn" Oct 09 08:03:06 crc kubenswrapper[4715]: 
I1009 08:03:06.885434 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea235d88-0fcd-44e4-ae70-59936a38c6a8-scripts\") pod \"horizon-6bf4d6b49-qs2mn\" (UID: \"ea235d88-0fcd-44e4-ae70-59936a38c6a8\") " pod="openstack/horizon-6bf4d6b49-qs2mn" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.885452 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad-scripts\") pod \"ceilometer-0\" (UID: \"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad\") " pod="openstack/ceilometer-0" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.885483 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad-log-httpd\") pod \"ceilometer-0\" (UID: \"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad\") " pod="openstack/ceilometer-0" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.885502 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kb47\" (UniqueName: \"kubernetes.io/projected/ea235d88-0fcd-44e4-ae70-59936a38c6a8-kube-api-access-8kb47\") pod \"horizon-6bf4d6b49-qs2mn\" (UID: \"ea235d88-0fcd-44e4-ae70-59936a38c6a8\") " pod="openstack/horizon-6bf4d6b49-qs2mn" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.885526 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad\") " pod="openstack/ceilometer-0" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.885550 4715 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7b4l\" (UniqueName: \"kubernetes.io/projected/9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad-kube-api-access-g7b4l\") pod \"ceilometer-0\" (UID: \"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad\") " pod="openstack/ceilometer-0" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.885566 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad-run-httpd\") pod \"ceilometer-0\" (UID: \"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad\") " pod="openstack/ceilometer-0" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.885598 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad-config-data\") pod \"ceilometer-0\" (UID: \"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad\") " pod="openstack/ceilometer-0" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.910628 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.931793 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.931958 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.940704 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.940956 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-sgxzl" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.941728 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.950882 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-65d86fc4b5-cvwlg" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.979552 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-6dccg"] Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.986617 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kb47\" (UniqueName: \"kubernetes.io/projected/ea235d88-0fcd-44e4-ae70-59936a38c6a8-kube-api-access-8kb47\") pod \"horizon-6bf4d6b49-qs2mn\" (UID: \"ea235d88-0fcd-44e4-ae70-59936a38c6a8\") " pod="openstack/horizon-6bf4d6b49-qs2mn" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.986665 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad\") " pod="openstack/ceilometer-0" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.986697 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7b4l\" (UniqueName: \"kubernetes.io/projected/9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad-kube-api-access-g7b4l\") pod \"ceilometer-0\" (UID: 
\"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad\") " pod="openstack/ceilometer-0" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.986718 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad-run-httpd\") pod \"ceilometer-0\" (UID: \"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad\") " pod="openstack/ceilometer-0" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.986755 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad-config-data\") pod \"ceilometer-0\" (UID: \"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad\") " pod="openstack/ceilometer-0" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.986797 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad\") " pod="openstack/ceilometer-0" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.986812 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ea235d88-0fcd-44e4-ae70-59936a38c6a8-config-data\") pod \"horizon-6bf4d6b49-qs2mn\" (UID: \"ea235d88-0fcd-44e4-ae70-59936a38c6a8\") " pod="openstack/horizon-6bf4d6b49-qs2mn" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.986831 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ea235d88-0fcd-44e4-ae70-59936a38c6a8-horizon-secret-key\") pod \"horizon-6bf4d6b49-qs2mn\" (UID: \"ea235d88-0fcd-44e4-ae70-59936a38c6a8\") " pod="openstack/horizon-6bf4d6b49-qs2mn" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.986851 4715 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea235d88-0fcd-44e4-ae70-59936a38c6a8-logs\") pod \"horizon-6bf4d6b49-qs2mn\" (UID: \"ea235d88-0fcd-44e4-ae70-59936a38c6a8\") " pod="openstack/horizon-6bf4d6b49-qs2mn" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.986869 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea235d88-0fcd-44e4-ae70-59936a38c6a8-scripts\") pod \"horizon-6bf4d6b49-qs2mn\" (UID: \"ea235d88-0fcd-44e4-ae70-59936a38c6a8\") " pod="openstack/horizon-6bf4d6b49-qs2mn" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.986885 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad-scripts\") pod \"ceilometer-0\" (UID: \"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad\") " pod="openstack/ceilometer-0" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.986917 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad-log-httpd\") pod \"ceilometer-0\" (UID: \"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad\") " pod="openstack/ceilometer-0" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.987959 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad-log-httpd\") pod \"ceilometer-0\" (UID: \"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad\") " pod="openstack/ceilometer-0" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.988976 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ea235d88-0fcd-44e4-ae70-59936a38c6a8-config-data\") pod \"horizon-6bf4d6b49-qs2mn\" (UID: \"ea235d88-0fcd-44e4-ae70-59936a38c6a8\") " 
pod="openstack/horizon-6bf4d6b49-qs2mn" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.991345 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea235d88-0fcd-44e4-ae70-59936a38c6a8-scripts\") pod \"horizon-6bf4d6b49-qs2mn\" (UID: \"ea235d88-0fcd-44e4-ae70-59936a38c6a8\") " pod="openstack/horizon-6bf4d6b49-qs2mn" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.991655 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea235d88-0fcd-44e4-ae70-59936a38c6a8-logs\") pod \"horizon-6bf4d6b49-qs2mn\" (UID: \"ea235d88-0fcd-44e4-ae70-59936a38c6a8\") " pod="openstack/horizon-6bf4d6b49-qs2mn" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.993012 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad-run-httpd\") pod \"ceilometer-0\" (UID: \"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad\") " pod="openstack/ceilometer-0" Oct 09 08:03:06 crc kubenswrapper[4715]: I1009 08:03:06.996866 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad\") " pod="openstack/ceilometer-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:06.999680 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad\") " pod="openstack/ceilometer-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.007207 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad-config-data\") pod \"ceilometer-0\" (UID: \"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad\") " pod="openstack/ceilometer-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.015468 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ea235d88-0fcd-44e4-ae70-59936a38c6a8-horizon-secret-key\") pod \"horizon-6bf4d6b49-qs2mn\" (UID: \"ea235d88-0fcd-44e4-ae70-59936a38c6a8\") " pod="openstack/horizon-6bf4d6b49-qs2mn" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.015561 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad-scripts\") pod \"ceilometer-0\" (UID: \"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad\") " pod="openstack/ceilometer-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.082383 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kb47\" (UniqueName: \"kubernetes.io/projected/ea235d88-0fcd-44e4-ae70-59936a38c6a8-kube-api-access-8kb47\") pod \"horizon-6bf4d6b49-qs2mn\" (UID: \"ea235d88-0fcd-44e4-ae70-59936a38c6a8\") " pod="openstack/horizon-6bf4d6b49-qs2mn" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.094365 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/992626c1-d8c1-41f1-a0b2-60f72f8b182a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"992626c1-d8c1-41f1-a0b2-60f72f8b182a\") " pod="openstack/glance-default-external-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.094655 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/992626c1-d8c1-41f1-a0b2-60f72f8b182a-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"992626c1-d8c1-41f1-a0b2-60f72f8b182a\") " pod="openstack/glance-default-external-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.094742 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/992626c1-d8c1-41f1-a0b2-60f72f8b182a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"992626c1-d8c1-41f1-a0b2-60f72f8b182a\") " pod="openstack/glance-default-external-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.094766 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqvz5\" (UniqueName: \"kubernetes.io/projected/992626c1-d8c1-41f1-a0b2-60f72f8b182a-kube-api-access-qqvz5\") pod \"glance-default-external-api-0\" (UID: \"992626c1-d8c1-41f1-a0b2-60f72f8b182a\") " pod="openstack/glance-default-external-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.094803 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/992626c1-d8c1-41f1-a0b2-60f72f8b182a-config-data\") pod \"glance-default-external-api-0\" (UID: \"992626c1-d8c1-41f1-a0b2-60f72f8b182a\") " pod="openstack/glance-default-external-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.094834 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/992626c1-d8c1-41f1-a0b2-60f72f8b182a-logs\") pod \"glance-default-external-api-0\" (UID: \"992626c1-d8c1-41f1-a0b2-60f72f8b182a\") " pod="openstack/glance-default-external-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.094863 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod 
\"glance-default-external-api-0\" (UID: \"992626c1-d8c1-41f1-a0b2-60f72f8b182a\") " pod="openstack/glance-default-external-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.095266 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7b4l\" (UniqueName: \"kubernetes.io/projected/9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad-kube-api-access-g7b4l\") pod \"ceilometer-0\" (UID: \"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad\") " pod="openstack/ceilometer-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.122489 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-4tnj8"] Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.123792 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-4tnj8" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.126547 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-gr7d6" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.126766 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.141112 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-7fpxg"] Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.143166 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-7fpxg" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.146028 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.161686 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-4tnj8"] Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.195763 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/992626c1-d8c1-41f1-a0b2-60f72f8b182a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"992626c1-d8c1-41f1-a0b2-60f72f8b182a\") " pod="openstack/glance-default-external-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.199554 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fb56b8fd-4a84-44aa-a3c9-80aefa10784e-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-7fpxg\" (UID: \"fb56b8fd-4a84-44aa-a3c9-80aefa10784e\") " pod="openstack/dnsmasq-dns-8b5c85b87-7fpxg" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.199605 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqvz5\" (UniqueName: \"kubernetes.io/projected/992626c1-d8c1-41f1-a0b2-60f72f8b182a-kube-api-access-qqvz5\") pod \"glance-default-external-api-0\" (UID: \"992626c1-d8c1-41f1-a0b2-60f72f8b182a\") " pod="openstack/glance-default-external-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.199644 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkwgr\" (UniqueName: \"kubernetes.io/projected/80ba490d-aaff-4579-bc8a-ffaa4924c7b7-kube-api-access-nkwgr\") pod \"placement-db-sync-4tnj8\" (UID: \"80ba490d-aaff-4579-bc8a-ffaa4924c7b7\") " 
pod="openstack/placement-db-sync-4tnj8" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.199747 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/992626c1-d8c1-41f1-a0b2-60f72f8b182a-config-data\") pod \"glance-default-external-api-0\" (UID: \"992626c1-d8c1-41f1-a0b2-60f72f8b182a\") " pod="openstack/glance-default-external-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.199789 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/992626c1-d8c1-41f1-a0b2-60f72f8b182a-logs\") pod \"glance-default-external-api-0\" (UID: \"992626c1-d8c1-41f1-a0b2-60f72f8b182a\") " pod="openstack/glance-default-external-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.199832 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb56b8fd-4a84-44aa-a3c9-80aefa10784e-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-7fpxg\" (UID: \"fb56b8fd-4a84-44aa-a3c9-80aefa10784e\") " pod="openstack/dnsmasq-dns-8b5c85b87-7fpxg" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.199849 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b92vw\" (UniqueName: \"kubernetes.io/projected/fb56b8fd-4a84-44aa-a3c9-80aefa10784e-kube-api-access-b92vw\") pod \"dnsmasq-dns-8b5c85b87-7fpxg\" (UID: \"fb56b8fd-4a84-44aa-a3c9-80aefa10784e\") " pod="openstack/dnsmasq-dns-8b5c85b87-7fpxg" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.199872 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"992626c1-d8c1-41f1-a0b2-60f72f8b182a\") " pod="openstack/glance-default-external-api-0" Oct 09 
08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.199915 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/992626c1-d8c1-41f1-a0b2-60f72f8b182a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"992626c1-d8c1-41f1-a0b2-60f72f8b182a\") " pod="openstack/glance-default-external-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.199933 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb56b8fd-4a84-44aa-a3c9-80aefa10784e-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-7fpxg\" (UID: \"fb56b8fd-4a84-44aa-a3c9-80aefa10784e\") " pod="openstack/dnsmasq-dns-8b5c85b87-7fpxg" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.199952 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80ba490d-aaff-4579-bc8a-ffaa4924c7b7-logs\") pod \"placement-db-sync-4tnj8\" (UID: \"80ba490d-aaff-4579-bc8a-ffaa4924c7b7\") " pod="openstack/placement-db-sync-4tnj8" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.199969 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80ba490d-aaff-4579-bc8a-ffaa4924c7b7-combined-ca-bundle\") pod \"placement-db-sync-4tnj8\" (UID: \"80ba490d-aaff-4579-bc8a-ffaa4924c7b7\") " pod="openstack/placement-db-sync-4tnj8" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.200041 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/992626c1-d8c1-41f1-a0b2-60f72f8b182a-scripts\") pod \"glance-default-external-api-0\" (UID: \"992626c1-d8c1-41f1-a0b2-60f72f8b182a\") " pod="openstack/glance-default-external-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.212666 
4715 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"992626c1-d8c1-41f1-a0b2-60f72f8b182a\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.200086 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb56b8fd-4a84-44aa-a3c9-80aefa10784e-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-7fpxg\" (UID: \"fb56b8fd-4a84-44aa-a3c9-80aefa10784e\") " pod="openstack/dnsmasq-dns-8b5c85b87-7fpxg" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.212891 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80ba490d-aaff-4579-bc8a-ffaa4924c7b7-config-data\") pod \"placement-db-sync-4tnj8\" (UID: \"80ba490d-aaff-4579-bc8a-ffaa4924c7b7\") " pod="openstack/placement-db-sync-4tnj8" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.212924 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80ba490d-aaff-4579-bc8a-ffaa4924c7b7-scripts\") pod \"placement-db-sync-4tnj8\" (UID: \"80ba490d-aaff-4579-bc8a-ffaa4924c7b7\") " pod="openstack/placement-db-sync-4tnj8" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.212939 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb56b8fd-4a84-44aa-a3c9-80aefa10784e-config\") pod \"dnsmasq-dns-8b5c85b87-7fpxg\" (UID: \"fb56b8fd-4a84-44aa-a3c9-80aefa10784e\") " pod="openstack/dnsmasq-dns-8b5c85b87-7fpxg" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.221672 4715 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/992626c1-d8c1-41f1-a0b2-60f72f8b182a-logs\") pod \"glance-default-external-api-0\" (UID: \"992626c1-d8c1-41f1-a0b2-60f72f8b182a\") " pod="openstack/glance-default-external-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.223973 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/992626c1-d8c1-41f1-a0b2-60f72f8b182a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"992626c1-d8c1-41f1-a0b2-60f72f8b182a\") " pod="openstack/glance-default-external-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.225786 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/992626c1-d8c1-41f1-a0b2-60f72f8b182a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"992626c1-d8c1-41f1-a0b2-60f72f8b182a\") " pod="openstack/glance-default-external-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.227844 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/992626c1-d8c1-41f1-a0b2-60f72f8b182a-scripts\") pod \"glance-default-external-api-0\" (UID: \"992626c1-d8c1-41f1-a0b2-60f72f8b182a\") " pod="openstack/glance-default-external-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.228743 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/992626c1-d8c1-41f1-a0b2-60f72f8b182a-config-data\") pod \"glance-default-external-api-0\" (UID: \"992626c1-d8c1-41f1-a0b2-60f72f8b182a\") " pod="openstack/glance-default-external-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.231842 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqvz5\" (UniqueName: 
\"kubernetes.io/projected/992626c1-d8c1-41f1-a0b2-60f72f8b182a-kube-api-access-qqvz5\") pod \"glance-default-external-api-0\" (UID: \"992626c1-d8c1-41f1-a0b2-60f72f8b182a\") " pod="openstack/glance-default-external-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.231892 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-7fpxg"] Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.238186 4715 generic.go:334] "Generic (PLEG): container finished" podID="95873bd3-7c5a-468c-8477-380a7ae6717f" containerID="42eddf2e022660a034820866d4da8ecb926829de04edb3b2f88a37201954cc7d" exitCode=0 Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.238229 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-hj7hk" event={"ID":"95873bd3-7c5a-468c-8477-380a7ae6717f","Type":"ContainerDied","Data":"42eddf2e022660a034820866d4da8ecb926829de04edb3b2f88a37201954cc7d"} Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.238252 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-hj7hk" event={"ID":"95873bd3-7c5a-468c-8477-380a7ae6717f","Type":"ContainerStarted","Data":"a4b6df36c5d7ba29256269866d9c5a226a8074ee90cace04e032aeed07abcdd2"} Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.247534 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-36e0-account-create-4qxbj"] Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.248802 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-36e0-account-create-4qxbj" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.255557 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.289016 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"992626c1-d8c1-41f1-a0b2-60f72f8b182a\") " pod="openstack/glance-default-external-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.290142 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.295523 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.303543 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.316354 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb56b8fd-4a84-44aa-a3c9-80aefa10784e-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-7fpxg\" (UID: \"fb56b8fd-4a84-44aa-a3c9-80aefa10784e\") " pod="openstack/dnsmasq-dns-8b5c85b87-7fpxg" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.316479 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b92vw\" (UniqueName: \"kubernetes.io/projected/fb56b8fd-4a84-44aa-a3c9-80aefa10784e-kube-api-access-b92vw\") pod \"dnsmasq-dns-8b5c85b87-7fpxg\" (UID: \"fb56b8fd-4a84-44aa-a3c9-80aefa10784e\") " pod="openstack/dnsmasq-dns-8b5c85b87-7fpxg" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.316581 4715 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb56b8fd-4a84-44aa-a3c9-80aefa10784e-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-7fpxg\" (UID: \"fb56b8fd-4a84-44aa-a3c9-80aefa10784e\") " pod="openstack/dnsmasq-dns-8b5c85b87-7fpxg" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.316624 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80ba490d-aaff-4579-bc8a-ffaa4924c7b7-logs\") pod \"placement-db-sync-4tnj8\" (UID: \"80ba490d-aaff-4579-bc8a-ffaa4924c7b7\") " pod="openstack/placement-db-sync-4tnj8" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.316646 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80ba490d-aaff-4579-bc8a-ffaa4924c7b7-combined-ca-bundle\") pod \"placement-db-sync-4tnj8\" (UID: \"80ba490d-aaff-4579-bc8a-ffaa4924c7b7\") " pod="openstack/placement-db-sync-4tnj8" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.316778 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb56b8fd-4a84-44aa-a3c9-80aefa10784e-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-7fpxg\" (UID: \"fb56b8fd-4a84-44aa-a3c9-80aefa10784e\") " pod="openstack/dnsmasq-dns-8b5c85b87-7fpxg" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.316831 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80ba490d-aaff-4579-bc8a-ffaa4924c7b7-config-data\") pod \"placement-db-sync-4tnj8\" (UID: \"80ba490d-aaff-4579-bc8a-ffaa4924c7b7\") " pod="openstack/placement-db-sync-4tnj8" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.316885 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/80ba490d-aaff-4579-bc8a-ffaa4924c7b7-scripts\") pod \"placement-db-sync-4tnj8\" (UID: \"80ba490d-aaff-4579-bc8a-ffaa4924c7b7\") " pod="openstack/placement-db-sync-4tnj8" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.316908 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb56b8fd-4a84-44aa-a3c9-80aefa10784e-config\") pod \"dnsmasq-dns-8b5c85b87-7fpxg\" (UID: \"fb56b8fd-4a84-44aa-a3c9-80aefa10784e\") " pod="openstack/dnsmasq-dns-8b5c85b87-7fpxg" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.316990 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6mcl\" (UniqueName: \"kubernetes.io/projected/3fd00c24-58a0-4107-a811-6d67d3156f68-kube-api-access-n6mcl\") pod \"barbican-36e0-account-create-4qxbj\" (UID: \"3fd00c24-58a0-4107-a811-6d67d3156f68\") " pod="openstack/barbican-36e0-account-create-4qxbj" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.317041 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fb56b8fd-4a84-44aa-a3c9-80aefa10784e-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-7fpxg\" (UID: \"fb56b8fd-4a84-44aa-a3c9-80aefa10784e\") " pod="openstack/dnsmasq-dns-8b5c85b87-7fpxg" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.317099 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkwgr\" (UniqueName: \"kubernetes.io/projected/80ba490d-aaff-4579-bc8a-ffaa4924c7b7-kube-api-access-nkwgr\") pod \"placement-db-sync-4tnj8\" (UID: \"80ba490d-aaff-4579-bc8a-ffaa4924c7b7\") " pod="openstack/placement-db-sync-4tnj8" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.318725 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/fb56b8fd-4a84-44aa-a3c9-80aefa10784e-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-7fpxg\" (UID: \"fb56b8fd-4a84-44aa-a3c9-80aefa10784e\") " pod="openstack/dnsmasq-dns-8b5c85b87-7fpxg" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.319311 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-36e0-account-create-4qxbj"] Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.319596 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb56b8fd-4a84-44aa-a3c9-80aefa10784e-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-7fpxg\" (UID: \"fb56b8fd-4a84-44aa-a3c9-80aefa10784e\") " pod="openstack/dnsmasq-dns-8b5c85b87-7fpxg" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.322872 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80ba490d-aaff-4579-bc8a-ffaa4924c7b7-scripts\") pod \"placement-db-sync-4tnj8\" (UID: \"80ba490d-aaff-4579-bc8a-ffaa4924c7b7\") " pod="openstack/placement-db-sync-4tnj8" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.324261 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb56b8fd-4a84-44aa-a3c9-80aefa10784e-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-7fpxg\" (UID: \"fb56b8fd-4a84-44aa-a3c9-80aefa10784e\") " pod="openstack/dnsmasq-dns-8b5c85b87-7fpxg" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.324904 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80ba490d-aaff-4579-bc8a-ffaa4924c7b7-logs\") pod \"placement-db-sync-4tnj8\" (UID: \"80ba490d-aaff-4579-bc8a-ffaa4924c7b7\") " pod="openstack/placement-db-sync-4tnj8" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.325655 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/fb56b8fd-4a84-44aa-a3c9-80aefa10784e-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-7fpxg\" (UID: \"fb56b8fd-4a84-44aa-a3c9-80aefa10784e\") " pod="openstack/dnsmasq-dns-8b5c85b87-7fpxg" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.349292 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80ba490d-aaff-4579-bc8a-ffaa4924c7b7-combined-ca-bundle\") pod \"placement-db-sync-4tnj8\" (UID: \"80ba490d-aaff-4579-bc8a-ffaa4924c7b7\") " pod="openstack/placement-db-sync-4tnj8" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.354025 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkwgr\" (UniqueName: \"kubernetes.io/projected/80ba490d-aaff-4579-bc8a-ffaa4924c7b7-kube-api-access-nkwgr\") pod \"placement-db-sync-4tnj8\" (UID: \"80ba490d-aaff-4579-bc8a-ffaa4924c7b7\") " pod="openstack/placement-db-sync-4tnj8" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.358262 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b92vw\" (UniqueName: \"kubernetes.io/projected/fb56b8fd-4a84-44aa-a3c9-80aefa10784e-kube-api-access-b92vw\") pod \"dnsmasq-dns-8b5c85b87-7fpxg\" (UID: \"fb56b8fd-4a84-44aa-a3c9-80aefa10784e\") " pod="openstack/dnsmasq-dns-8b5c85b87-7fpxg" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.359267 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80ba490d-aaff-4579-bc8a-ffaa4924c7b7-config-data\") pod \"placement-db-sync-4tnj8\" (UID: \"80ba490d-aaff-4579-bc8a-ffaa4924c7b7\") " pod="openstack/placement-db-sync-4tnj8" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.364337 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb56b8fd-4a84-44aa-a3c9-80aefa10784e-config\") pod 
\"dnsmasq-dns-8b5c85b87-7fpxg\" (UID: \"fb56b8fd-4a84-44aa-a3c9-80aefa10784e\") " pod="openstack/dnsmasq-dns-8b5c85b87-7fpxg" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.378175 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6bf4d6b49-qs2mn" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.378604 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.401990 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.423002 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/69ff7954-464a-4c36-9996-772c43a2d602-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"69ff7954-464a-4c36-9996-772c43a2d602\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.423340 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ff7954-464a-4c36-9996-772c43a2d602-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"69ff7954-464a-4c36-9996-772c43a2d602\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.423566 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69ff7954-464a-4c36-9996-772c43a2d602-logs\") pod \"glance-default-internal-api-0\" (UID: \"69ff7954-464a-4c36-9996-772c43a2d602\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.423639 4715 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69ff7954-464a-4c36-9996-772c43a2d602-config-data\") pod \"glance-default-internal-api-0\" (UID: \"69ff7954-464a-4c36-9996-772c43a2d602\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.423699 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"69ff7954-464a-4c36-9996-772c43a2d602\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.423758 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6mcl\" (UniqueName: \"kubernetes.io/projected/3fd00c24-58a0-4107-a811-6d67d3156f68-kube-api-access-n6mcl\") pod \"barbican-36e0-account-create-4qxbj\" (UID: \"3fd00c24-58a0-4107-a811-6d67d3156f68\") " pod="openstack/barbican-36e0-account-create-4qxbj" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.423837 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69ff7954-464a-4c36-9996-772c43a2d602-scripts\") pod \"glance-default-internal-api-0\" (UID: \"69ff7954-464a-4c36-9996-772c43a2d602\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.423934 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pns98\" (UniqueName: \"kubernetes.io/projected/69ff7954-464a-4c36-9996-772c43a2d602-kube-api-access-pns98\") pod \"glance-default-internal-api-0\" (UID: \"69ff7954-464a-4c36-9996-772c43a2d602\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.462978 4715 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.474352 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6mcl\" (UniqueName: \"kubernetes.io/projected/3fd00c24-58a0-4107-a811-6d67d3156f68-kube-api-access-n6mcl\") pod \"barbican-36e0-account-create-4qxbj\" (UID: \"3fd00c24-58a0-4107-a811-6d67d3156f68\") " pod="openstack/barbican-36e0-account-create-4qxbj" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.506746 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-b61f-account-create-pprts"] Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.516454 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b61f-account-create-pprts" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.519160 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.524903 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-4tnj8" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.527053 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69ff7954-464a-4c36-9996-772c43a2d602-logs\") pod \"glance-default-internal-api-0\" (UID: \"69ff7954-464a-4c36-9996-772c43a2d602\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.527086 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69ff7954-464a-4c36-9996-772c43a2d602-config-data\") pod \"glance-default-internal-api-0\" (UID: \"69ff7954-464a-4c36-9996-772c43a2d602\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.527281 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"69ff7954-464a-4c36-9996-772c43a2d602\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.527362 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69ff7954-464a-4c36-9996-772c43a2d602-scripts\") pod \"glance-default-internal-api-0\" (UID: \"69ff7954-464a-4c36-9996-772c43a2d602\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.527504 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pns98\" (UniqueName: \"kubernetes.io/projected/69ff7954-464a-4c36-9996-772c43a2d602-kube-api-access-pns98\") pod \"glance-default-internal-api-0\" (UID: \"69ff7954-464a-4c36-9996-772c43a2d602\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:03:07 crc 
kubenswrapper[4715]: I1009 08:03:07.527535 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69ff7954-464a-4c36-9996-772c43a2d602-logs\") pod \"glance-default-internal-api-0\" (UID: \"69ff7954-464a-4c36-9996-772c43a2d602\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.527609 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/69ff7954-464a-4c36-9996-772c43a2d602-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"69ff7954-464a-4c36-9996-772c43a2d602\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.527775 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ff7954-464a-4c36-9996-772c43a2d602-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"69ff7954-464a-4c36-9996-772c43a2d602\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.527792 4715 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"69ff7954-464a-4c36-9996-772c43a2d602\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.532434 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/69ff7954-464a-4c36-9996-772c43a2d602-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"69ff7954-464a-4c36-9996-772c43a2d602\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.532818 4715 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ff7954-464a-4c36-9996-772c43a2d602-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"69ff7954-464a-4c36-9996-772c43a2d602\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.540972 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69ff7954-464a-4c36-9996-772c43a2d602-config-data\") pod \"glance-default-internal-api-0\" (UID: \"69ff7954-464a-4c36-9996-772c43a2d602\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.544952 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69ff7954-464a-4c36-9996-772c43a2d602-scripts\") pod \"glance-default-internal-api-0\" (UID: \"69ff7954-464a-4c36-9996-772c43a2d602\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.548363 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-7fpxg" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.551979 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pns98\" (UniqueName: \"kubernetes.io/projected/69ff7954-464a-4c36-9996-772c43a2d602-kube-api-access-pns98\") pod \"glance-default-internal-api-0\" (UID: \"69ff7954-464a-4c36-9996-772c43a2d602\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.595738 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b61f-account-create-pprts"] Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.605588 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-36e0-account-create-4qxbj" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.607810 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"69ff7954-464a-4c36-9996-772c43a2d602\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.635054 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfgrp\" (UniqueName: \"kubernetes.io/projected/cca10de2-6fab-417e-ac4a-ba3b73383432-kube-api-access-bfgrp\") pod \"cinder-b61f-account-create-pprts\" (UID: \"cca10de2-6fab-417e-ac4a-ba3b73383432\") " pod="openstack/cinder-b61f-account-create-pprts" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.662116 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-841e-account-create-vcldw"] Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.663774 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-841e-account-create-vcldw" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.667145 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.684263 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-841e-account-create-vcldw"] Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.687196 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.733911 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-gggmh"] Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.739507 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwz6k\" (UniqueName: \"kubernetes.io/projected/a49e0065-af08-4379-b1e9-ac8998d2e98b-kube-api-access-kwz6k\") pod \"neutron-841e-account-create-vcldw\" (UID: \"a49e0065-af08-4379-b1e9-ac8998d2e98b\") " pod="openstack/neutron-841e-account-create-vcldw" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.739633 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfgrp\" (UniqueName: \"kubernetes.io/projected/cca10de2-6fab-417e-ac4a-ba3b73383432-kube-api-access-bfgrp\") pod \"cinder-b61f-account-create-pprts\" (UID: \"cca10de2-6fab-417e-ac4a-ba3b73383432\") " pod="openstack/cinder-b61f-account-create-pprts" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.760642 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfgrp\" (UniqueName: \"kubernetes.io/projected/cca10de2-6fab-417e-ac4a-ba3b73383432-kube-api-access-bfgrp\") pod \"cinder-b61f-account-create-pprts\" (UID: \"cca10de2-6fab-417e-ac4a-ba3b73383432\") " pod="openstack/cinder-b61f-account-create-pprts" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.826888 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-b61f-account-create-pprts" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.840764 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwz6k\" (UniqueName: \"kubernetes.io/projected/a49e0065-af08-4379-b1e9-ac8998d2e98b-kube-api-access-kwz6k\") pod \"neutron-841e-account-create-vcldw\" (UID: \"a49e0065-af08-4379-b1e9-ac8998d2e98b\") " pod="openstack/neutron-841e-account-create-vcldw" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.854056 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-hj7hk" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.864250 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwz6k\" (UniqueName: \"kubernetes.io/projected/a49e0065-af08-4379-b1e9-ac8998d2e98b-kube-api-access-kwz6k\") pod \"neutron-841e-account-create-vcldw\" (UID: \"a49e0065-af08-4379-b1e9-ac8998d2e98b\") " pod="openstack/neutron-841e-account-create-vcldw" Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.922604 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-6dccg"] Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.941623 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95873bd3-7c5a-468c-8477-380a7ae6717f-config\") pod \"95873bd3-7c5a-468c-8477-380a7ae6717f\" (UID: \"95873bd3-7c5a-468c-8477-380a7ae6717f\") " Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.951026 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95873bd3-7c5a-468c-8477-380a7ae6717f-ovsdbserver-nb\") pod \"95873bd3-7c5a-468c-8477-380a7ae6717f\" (UID: \"95873bd3-7c5a-468c-8477-380a7ae6717f\") " Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 
08:03:07.951078 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/95873bd3-7c5a-468c-8477-380a7ae6717f-dns-swift-storage-0\") pod \"95873bd3-7c5a-468c-8477-380a7ae6717f\" (UID: \"95873bd3-7c5a-468c-8477-380a7ae6717f\") " Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.951122 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95873bd3-7c5a-468c-8477-380a7ae6717f-ovsdbserver-sb\") pod \"95873bd3-7c5a-468c-8477-380a7ae6717f\" (UID: \"95873bd3-7c5a-468c-8477-380a7ae6717f\") " Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.951258 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkjwq\" (UniqueName: \"kubernetes.io/projected/95873bd3-7c5a-468c-8477-380a7ae6717f-kube-api-access-hkjwq\") pod \"95873bd3-7c5a-468c-8477-380a7ae6717f\" (UID: \"95873bd3-7c5a-468c-8477-380a7ae6717f\") " Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.951306 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95873bd3-7c5a-468c-8477-380a7ae6717f-dns-svc\") pod \"95873bd3-7c5a-468c-8477-380a7ae6717f\" (UID: \"95873bd3-7c5a-468c-8477-380a7ae6717f\") " Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.957450 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-65d86fc4b5-cvwlg"] Oct 09 08:03:07 crc kubenswrapper[4715]: I1009 08:03:07.963699 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95873bd3-7c5a-468c-8477-380a7ae6717f-kube-api-access-hkjwq" (OuterVolumeSpecName: "kube-api-access-hkjwq") pod "95873bd3-7c5a-468c-8477-380a7ae6717f" (UID: "95873bd3-7c5a-468c-8477-380a7ae6717f"). InnerVolumeSpecName "kube-api-access-hkjwq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:03:08 crc kubenswrapper[4715]: I1009 08:03:08.053892 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkjwq\" (UniqueName: \"kubernetes.io/projected/95873bd3-7c5a-468c-8477-380a7ae6717f-kube-api-access-hkjwq\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:08 crc kubenswrapper[4715]: I1009 08:03:08.084730 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95873bd3-7c5a-468c-8477-380a7ae6717f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "95873bd3-7c5a-468c-8477-380a7ae6717f" (UID: "95873bd3-7c5a-468c-8477-380a7ae6717f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:03:08 crc kubenswrapper[4715]: I1009 08:03:08.132064 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95873bd3-7c5a-468c-8477-380a7ae6717f-config" (OuterVolumeSpecName: "config") pod "95873bd3-7c5a-468c-8477-380a7ae6717f" (UID: "95873bd3-7c5a-468c-8477-380a7ae6717f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:03:08 crc kubenswrapper[4715]: I1009 08:03:08.141316 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95873bd3-7c5a-468c-8477-380a7ae6717f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "95873bd3-7c5a-468c-8477-380a7ae6717f" (UID: "95873bd3-7c5a-468c-8477-380a7ae6717f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:03:08 crc kubenswrapper[4715]: I1009 08:03:08.143653 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-841e-account-create-vcldw" Oct 09 08:03:08 crc kubenswrapper[4715]: I1009 08:03:08.151369 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95873bd3-7c5a-468c-8477-380a7ae6717f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "95873bd3-7c5a-468c-8477-380a7ae6717f" (UID: "95873bd3-7c5a-468c-8477-380a7ae6717f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:03:08 crc kubenswrapper[4715]: I1009 08:03:08.154878 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95873bd3-7c5a-468c-8477-380a7ae6717f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "95873bd3-7c5a-468c-8477-380a7ae6717f" (UID: "95873bd3-7c5a-468c-8477-380a7ae6717f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:03:08 crc kubenswrapper[4715]: I1009 08:03:08.155945 4715 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95873bd3-7c5a-468c-8477-380a7ae6717f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:08 crc kubenswrapper[4715]: I1009 08:03:08.155966 4715 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95873bd3-7c5a-468c-8477-380a7ae6717f-config\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:08 crc kubenswrapper[4715]: I1009 08:03:08.155975 4715 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95873bd3-7c5a-468c-8477-380a7ae6717f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:08 crc kubenswrapper[4715]: I1009 08:03:08.155988 4715 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/95873bd3-7c5a-468c-8477-380a7ae6717f-dns-swift-storage-0\") on node \"crc\" DevicePath 
\"\"" Oct 09 08:03:08 crc kubenswrapper[4715]: I1009 08:03:08.155999 4715 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95873bd3-7c5a-468c-8477-380a7ae6717f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:08 crc kubenswrapper[4715]: I1009 08:03:08.220261 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6bf4d6b49-qs2mn"] Oct 09 08:03:08 crc kubenswrapper[4715]: W1009 08:03:08.231678 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea235d88_0fcd_44e4_ae70_59936a38c6a8.slice/crio-97e4d201015f016e9d3639324e679b26e8c3adfb2023d2b957427d32ca1cee79 WatchSource:0}: Error finding container 97e4d201015f016e9d3639324e679b26e8c3adfb2023d2b957427d32ca1cee79: Status 404 returned error can't find the container with id 97e4d201015f016e9d3639324e679b26e8c3adfb2023d2b957427d32ca1cee79 Oct 09 08:03:08 crc kubenswrapper[4715]: I1009 08:03:08.262671 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:03:08 crc kubenswrapper[4715]: I1009 08:03:08.270059 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-6dccg" event={"ID":"9a0480a5-826d-4e8e-95dd-856c2b1b874a","Type":"ContainerStarted","Data":"dc398c3bf34f589f47b7fef5dae02d7168c489d96b14d6ca3b159df4331fe432"} Oct 09 08:03:08 crc kubenswrapper[4715]: W1009 08:03:08.273901 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cd73b9e_b01b_4148_8a2b_6bc9c89cf0ad.slice/crio-405c0a6fcd0c6ba5702e86c8bbb90ab12e824d86a7b1bf82d07dec80003ffbfd WatchSource:0}: Error finding container 405c0a6fcd0c6ba5702e86c8bbb90ab12e824d86a7b1bf82d07dec80003ffbfd: Status 404 returned error can't find the container with id 405c0a6fcd0c6ba5702e86c8bbb90ab12e824d86a7b1bf82d07dec80003ffbfd Oct 09 08:03:08 crc 
kubenswrapper[4715]: I1009 08:03:08.296701 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gggmh" event={"ID":"2db20957-2340-4a65-ac69-ac1842a5a37a","Type":"ContainerStarted","Data":"bc66e4c5aa4986d1e1ffccb99f00df99f83757d6a826bda2dba1666f38deab31"} Oct 09 08:03:08 crc kubenswrapper[4715]: I1009 08:03:08.300192 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6bf4d6b49-qs2mn" event={"ID":"ea235d88-0fcd-44e4-ae70-59936a38c6a8","Type":"ContainerStarted","Data":"97e4d201015f016e9d3639324e679b26e8c3adfb2023d2b957427d32ca1cee79"} Oct 09 08:03:08 crc kubenswrapper[4715]: I1009 08:03:08.302925 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-65d86fc4b5-cvwlg" event={"ID":"2da43a5b-210e-4eeb-bba3-8ee808fbac44","Type":"ContainerStarted","Data":"009811246ab28228a46b59b6441e72fdaefb607befafd351bb1eb00311fc15c4"} Oct 09 08:03:08 crc kubenswrapper[4715]: I1009 08:03:08.305020 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-hj7hk" event={"ID":"95873bd3-7c5a-468c-8477-380a7ae6717f","Type":"ContainerDied","Data":"a4b6df36c5d7ba29256269866d9c5a226a8074ee90cace04e032aeed07abcdd2"} Oct 09 08:03:08 crc kubenswrapper[4715]: I1009 08:03:08.305075 4715 scope.go:117] "RemoveContainer" containerID="42eddf2e022660a034820866d4da8ecb926829de04edb3b2f88a37201954cc7d" Oct 09 08:03:08 crc kubenswrapper[4715]: I1009 08:03:08.305190 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-hj7hk" Oct 09 08:03:08 crc kubenswrapper[4715]: I1009 08:03:08.374106 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-hj7hk"] Oct 09 08:03:08 crc kubenswrapper[4715]: I1009 08:03:08.382376 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-hj7hk"] Oct 09 08:03:08 crc kubenswrapper[4715]: I1009 08:03:08.542369 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-4tnj8"] Oct 09 08:03:08 crc kubenswrapper[4715]: I1009 08:03:08.681431 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b61f-account-create-pprts"] Oct 09 08:03:08 crc kubenswrapper[4715]: I1009 08:03:08.698456 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-36e0-account-create-4qxbj"] Oct 09 08:03:08 crc kubenswrapper[4715]: I1009 08:03:08.729311 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-7fpxg"] Oct 09 08:03:08 crc kubenswrapper[4715]: W1009 08:03:08.737531 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb56b8fd_4a84_44aa_a3c9_80aefa10784e.slice/crio-93ff2c1d1458261803c0df4e70c4bd9ef1e0686fccbc3f27478885f0b8401919 WatchSource:0}: Error finding container 93ff2c1d1458261803c0df4e70c4bd9ef1e0686fccbc3f27478885f0b8401919: Status 404 returned error can't find the container with id 93ff2c1d1458261803c0df4e70c4bd9ef1e0686fccbc3f27478885f0b8401919 Oct 09 08:03:08 crc kubenswrapper[4715]: I1009 08:03:08.812358 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-841e-account-create-vcldw"] Oct 09 08:03:08 crc kubenswrapper[4715]: I1009 08:03:08.855748 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 08:03:08 crc kubenswrapper[4715]: I1009 08:03:08.910285 
4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 08:03:08 crc kubenswrapper[4715]: W1009 08:03:08.915731 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69ff7954_464a_4c36_9996_772c43a2d602.slice/crio-d1539f8495501c548960ba504952d346454371b66ee25e05f1ca1b7c3478d68f WatchSource:0}: Error finding container d1539f8495501c548960ba504952d346454371b66ee25e05f1ca1b7c3478d68f: Status 404 returned error can't find the container with id d1539f8495501c548960ba504952d346454371b66ee25e05f1ca1b7c3478d68f Oct 09 08:03:09 crc kubenswrapper[4715]: I1009 08:03:09.328065 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad","Type":"ContainerStarted","Data":"405c0a6fcd0c6ba5702e86c8bbb90ab12e824d86a7b1bf82d07dec80003ffbfd"} Oct 09 08:03:09 crc kubenswrapper[4715]: I1009 08:03:09.331965 4715 generic.go:334] "Generic (PLEG): container finished" podID="a49e0065-af08-4379-b1e9-ac8998d2e98b" containerID="f7d0fb161ef47e63c7e2bbb5c808b7c1eea2f502998dab8cd2fc974db01f91dc" exitCode=0 Oct 09 08:03:09 crc kubenswrapper[4715]: I1009 08:03:09.332051 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-841e-account-create-vcldw" event={"ID":"a49e0065-af08-4379-b1e9-ac8998d2e98b","Type":"ContainerDied","Data":"f7d0fb161ef47e63c7e2bbb5c808b7c1eea2f502998dab8cd2fc974db01f91dc"} Oct 09 08:03:09 crc kubenswrapper[4715]: I1009 08:03:09.332085 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-841e-account-create-vcldw" event={"ID":"a49e0065-af08-4379-b1e9-ac8998d2e98b","Type":"ContainerStarted","Data":"21719a6ff9b895c8c809a283b3ea481895ce0d4d2ee20fad2b532b8d7230aa9c"} Oct 09 08:03:09 crc kubenswrapper[4715]: I1009 08:03:09.335411 4715 generic.go:334] "Generic (PLEG): container finished" 
podID="cca10de2-6fab-417e-ac4a-ba3b73383432" containerID="239e1b00cdd05cc0d43cede7d8bc3a6bed56bedcc714f44c3b2f80b05cbc8cb4" exitCode=0 Oct 09 08:03:09 crc kubenswrapper[4715]: I1009 08:03:09.335488 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b61f-account-create-pprts" event={"ID":"cca10de2-6fab-417e-ac4a-ba3b73383432","Type":"ContainerDied","Data":"239e1b00cdd05cc0d43cede7d8bc3a6bed56bedcc714f44c3b2f80b05cbc8cb4"} Oct 09 08:03:09 crc kubenswrapper[4715]: I1009 08:03:09.335504 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b61f-account-create-pprts" event={"ID":"cca10de2-6fab-417e-ac4a-ba3b73383432","Type":"ContainerStarted","Data":"253a9368f6b8d521df4510d1d0469196199d7c3d8b586edd8d3a26bee499c38a"} Oct 09 08:03:09 crc kubenswrapper[4715]: I1009 08:03:09.340081 4715 generic.go:334] "Generic (PLEG): container finished" podID="3fd00c24-58a0-4107-a811-6d67d3156f68" containerID="eebf4904b85681a1515e9c7904f9fc8fe93f2b1f2f3291223dc4c8637ea7c5c9" exitCode=0 Oct 09 08:03:09 crc kubenswrapper[4715]: I1009 08:03:09.340401 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-36e0-account-create-4qxbj" event={"ID":"3fd00c24-58a0-4107-a811-6d67d3156f68","Type":"ContainerDied","Data":"eebf4904b85681a1515e9c7904f9fc8fe93f2b1f2f3291223dc4c8637ea7c5c9"} Oct 09 08:03:09 crc kubenswrapper[4715]: I1009 08:03:09.340485 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-36e0-account-create-4qxbj" event={"ID":"3fd00c24-58a0-4107-a811-6d67d3156f68","Type":"ContainerStarted","Data":"a819a5cf47f97299618c55469adc400a47b20cb481c1a02d218178e4d41d5a5e"} Oct 09 08:03:09 crc kubenswrapper[4715]: I1009 08:03:09.385365 4715 generic.go:334] "Generic (PLEG): container finished" podID="9a0480a5-826d-4e8e-95dd-856c2b1b874a" containerID="63134f676a56043a0ecc4c901db998d9a7a12112a5b2603bd89a8efce56def6a" exitCode=0 Oct 09 08:03:09 crc kubenswrapper[4715]: I1009 08:03:09.385465 4715 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-6dccg" event={"ID":"9a0480a5-826d-4e8e-95dd-856c2b1b874a","Type":"ContainerDied","Data":"63134f676a56043a0ecc4c901db998d9a7a12112a5b2603bd89a8efce56def6a"} Oct 09 08:03:09 crc kubenswrapper[4715]: I1009 08:03:09.389682 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"69ff7954-464a-4c36-9996-772c43a2d602","Type":"ContainerStarted","Data":"d1539f8495501c548960ba504952d346454371b66ee25e05f1ca1b7c3478d68f"} Oct 09 08:03:09 crc kubenswrapper[4715]: I1009 08:03:09.398653 4715 generic.go:334] "Generic (PLEG): container finished" podID="fb56b8fd-4a84-44aa-a3c9-80aefa10784e" containerID="89f7aa440f14f7dc7754a191a9b2beb81ba3c138a9fed3b12dd0b41025f51973" exitCode=0 Oct 09 08:03:09 crc kubenswrapper[4715]: I1009 08:03:09.398788 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-7fpxg" event={"ID":"fb56b8fd-4a84-44aa-a3c9-80aefa10784e","Type":"ContainerDied","Data":"89f7aa440f14f7dc7754a191a9b2beb81ba3c138a9fed3b12dd0b41025f51973"} Oct 09 08:03:09 crc kubenswrapper[4715]: I1009 08:03:09.398824 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-7fpxg" event={"ID":"fb56b8fd-4a84-44aa-a3c9-80aefa10784e","Type":"ContainerStarted","Data":"93ff2c1d1458261803c0df4e70c4bd9ef1e0686fccbc3f27478885f0b8401919"} Oct 09 08:03:09 crc kubenswrapper[4715]: I1009 08:03:09.407939 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4tnj8" event={"ID":"80ba490d-aaff-4579-bc8a-ffaa4924c7b7","Type":"ContainerStarted","Data":"07c89b039feb8a8c7c145de8622119db7701e69b90a72e72f5e211fef56b57e6"} Oct 09 08:03:09 crc kubenswrapper[4715]: I1009 08:03:09.412487 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gggmh" 
event={"ID":"2db20957-2340-4a65-ac69-ac1842a5a37a","Type":"ContainerStarted","Data":"352c27b9fcd0e70c6baf453f0729ce49aad3113cf7887641013079ce070f4d4b"} Oct 09 08:03:09 crc kubenswrapper[4715]: I1009 08:03:09.422856 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"992626c1-d8c1-41f1-a0b2-60f72f8b182a","Type":"ContainerStarted","Data":"12568635d4fe7d0c9b5d9a4aaf3482615a7db7057ebbc75b0061305c1c3f3080"} Oct 09 08:03:09 crc kubenswrapper[4715]: I1009 08:03:09.445730 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-gggmh" podStartSLOduration=3.445708175 podStartE2EDuration="3.445708175s" podCreationTimestamp="2025-10-09 08:03:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:03:09.442187333 +0000 UTC m=+1020.134991341" watchObservedRunningTime="2025-10-09 08:03:09.445708175 +0000 UTC m=+1020.138512183" Oct 09 08:03:09 crc kubenswrapper[4715]: I1009 08:03:09.844152 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-6dccg" Oct 09 08:03:09 crc kubenswrapper[4715]: I1009 08:03:09.899029 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a0480a5-826d-4e8e-95dd-856c2b1b874a-config\") pod \"9a0480a5-826d-4e8e-95dd-856c2b1b874a\" (UID: \"9a0480a5-826d-4e8e-95dd-856c2b1b874a\") " Oct 09 08:03:09 crc kubenswrapper[4715]: I1009 08:03:09.899145 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a0480a5-826d-4e8e-95dd-856c2b1b874a-ovsdbserver-nb\") pod \"9a0480a5-826d-4e8e-95dd-856c2b1b874a\" (UID: \"9a0480a5-826d-4e8e-95dd-856c2b1b874a\") " Oct 09 08:03:09 crc kubenswrapper[4715]: I1009 08:03:09.899172 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrjb4\" (UniqueName: \"kubernetes.io/projected/9a0480a5-826d-4e8e-95dd-856c2b1b874a-kube-api-access-wrjb4\") pod \"9a0480a5-826d-4e8e-95dd-856c2b1b874a\" (UID: \"9a0480a5-826d-4e8e-95dd-856c2b1b874a\") " Oct 09 08:03:09 crc kubenswrapper[4715]: I1009 08:03:09.899192 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a0480a5-826d-4e8e-95dd-856c2b1b874a-ovsdbserver-sb\") pod \"9a0480a5-826d-4e8e-95dd-856c2b1b874a\" (UID: \"9a0480a5-826d-4e8e-95dd-856c2b1b874a\") " Oct 09 08:03:09 crc kubenswrapper[4715]: I1009 08:03:09.899222 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a0480a5-826d-4e8e-95dd-856c2b1b874a-dns-svc\") pod \"9a0480a5-826d-4e8e-95dd-856c2b1b874a\" (UID: \"9a0480a5-826d-4e8e-95dd-856c2b1b874a\") " Oct 09 08:03:09 crc kubenswrapper[4715]: I1009 08:03:09.899264 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/9a0480a5-826d-4e8e-95dd-856c2b1b874a-dns-swift-storage-0\") pod \"9a0480a5-826d-4e8e-95dd-856c2b1b874a\" (UID: \"9a0480a5-826d-4e8e-95dd-856c2b1b874a\") " Oct 09 08:03:09 crc kubenswrapper[4715]: I1009 08:03:09.911913 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a0480a5-826d-4e8e-95dd-856c2b1b874a-kube-api-access-wrjb4" (OuterVolumeSpecName: "kube-api-access-wrjb4") pod "9a0480a5-826d-4e8e-95dd-856c2b1b874a" (UID: "9a0480a5-826d-4e8e-95dd-856c2b1b874a"). InnerVolumeSpecName "kube-api-access-wrjb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:03:09 crc kubenswrapper[4715]: I1009 08:03:09.922283 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a0480a5-826d-4e8e-95dd-856c2b1b874a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9a0480a5-826d-4e8e-95dd-856c2b1b874a" (UID: "9a0480a5-826d-4e8e-95dd-856c2b1b874a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:03:09 crc kubenswrapper[4715]: I1009 08:03:09.933187 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a0480a5-826d-4e8e-95dd-856c2b1b874a-config" (OuterVolumeSpecName: "config") pod "9a0480a5-826d-4e8e-95dd-856c2b1b874a" (UID: "9a0480a5-826d-4e8e-95dd-856c2b1b874a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:03:09 crc kubenswrapper[4715]: I1009 08:03:09.952372 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a0480a5-826d-4e8e-95dd-856c2b1b874a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9a0480a5-826d-4e8e-95dd-856c2b1b874a" (UID: "9a0480a5-826d-4e8e-95dd-856c2b1b874a"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:03:09 crc kubenswrapper[4715]: I1009 08:03:09.956600 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a0480a5-826d-4e8e-95dd-856c2b1b874a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9a0480a5-826d-4e8e-95dd-856c2b1b874a" (UID: "9a0480a5-826d-4e8e-95dd-856c2b1b874a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:03:09 crc kubenswrapper[4715]: I1009 08:03:09.956942 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a0480a5-826d-4e8e-95dd-856c2b1b874a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9a0480a5-826d-4e8e-95dd-856c2b1b874a" (UID: "9a0480a5-826d-4e8e-95dd-856c2b1b874a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:03:10 crc kubenswrapper[4715]: I1009 08:03:10.001325 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrjb4\" (UniqueName: \"kubernetes.io/projected/9a0480a5-826d-4e8e-95dd-856c2b1b874a-kube-api-access-wrjb4\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:10 crc kubenswrapper[4715]: I1009 08:03:10.001363 4715 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a0480a5-826d-4e8e-95dd-856c2b1b874a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:10 crc kubenswrapper[4715]: I1009 08:03:10.001373 4715 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a0480a5-826d-4e8e-95dd-856c2b1b874a-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:10 crc kubenswrapper[4715]: I1009 08:03:10.001382 4715 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a0480a5-826d-4e8e-95dd-856c2b1b874a-dns-swift-storage-0\") on node \"crc\" 
DevicePath \"\"" Oct 09 08:03:10 crc kubenswrapper[4715]: I1009 08:03:10.001392 4715 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a0480a5-826d-4e8e-95dd-856c2b1b874a-config\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:10 crc kubenswrapper[4715]: I1009 08:03:10.001400 4715 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a0480a5-826d-4e8e-95dd-856c2b1b874a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:10 crc kubenswrapper[4715]: I1009 08:03:10.161759 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95873bd3-7c5a-468c-8477-380a7ae6717f" path="/var/lib/kubelet/pods/95873bd3-7c5a-468c-8477-380a7ae6717f/volumes" Oct 09 08:03:10 crc kubenswrapper[4715]: I1009 08:03:10.459028 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"992626c1-d8c1-41f1-a0b2-60f72f8b182a","Type":"ContainerStarted","Data":"6de129de283977ec932b22f2cb853130c125ae5397f5d8f19a4670eee77bd742"} Oct 09 08:03:10 crc kubenswrapper[4715]: I1009 08:03:10.463466 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-7fpxg" event={"ID":"fb56b8fd-4a84-44aa-a3c9-80aefa10784e","Type":"ContainerStarted","Data":"de1bfe0badc20cbbb61faee7548dfd783c5d13dd7e2bbc2c662b7623ba5abc0f"} Oct 09 08:03:10 crc kubenswrapper[4715]: I1009 08:03:10.464724 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8b5c85b87-7fpxg" Oct 09 08:03:10 crc kubenswrapper[4715]: I1009 08:03:10.471268 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-6dccg" event={"ID":"9a0480a5-826d-4e8e-95dd-856c2b1b874a","Type":"ContainerDied","Data":"dc398c3bf34f589f47b7fef5dae02d7168c489d96b14d6ca3b159df4331fe432"} Oct 09 08:03:10 crc kubenswrapper[4715]: I1009 08:03:10.471326 4715 scope.go:117] 
"RemoveContainer" containerID="63134f676a56043a0ecc4c901db998d9a7a12112a5b2603bd89a8efce56def6a" Oct 09 08:03:10 crc kubenswrapper[4715]: I1009 08:03:10.471507 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-6dccg" Oct 09 08:03:10 crc kubenswrapper[4715]: I1009 08:03:10.477094 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"69ff7954-464a-4c36-9996-772c43a2d602","Type":"ContainerStarted","Data":"2cb16cacd34b10edff21069039974fd961f207c7988fae6389f6f40229150eab"} Oct 09 08:03:10 crc kubenswrapper[4715]: I1009 08:03:10.487549 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8b5c85b87-7fpxg" podStartSLOduration=3.487534299 podStartE2EDuration="3.487534299s" podCreationTimestamp="2025-10-09 08:03:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:03:10.480538715 +0000 UTC m=+1021.173342743" watchObservedRunningTime="2025-10-09 08:03:10.487534299 +0000 UTC m=+1021.180338307" Oct 09 08:03:11 crc kubenswrapper[4715]: I1009 08:03:10.649370 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-6dccg"] Oct 09 08:03:11 crc kubenswrapper[4715]: I1009 08:03:10.657314 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-6dccg"] Oct 09 08:03:11 crc kubenswrapper[4715]: I1009 08:03:10.932411 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-b61f-account-create-pprts" Oct 09 08:03:11 crc kubenswrapper[4715]: I1009 08:03:11.133102 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfgrp\" (UniqueName: \"kubernetes.io/projected/cca10de2-6fab-417e-ac4a-ba3b73383432-kube-api-access-bfgrp\") pod \"cca10de2-6fab-417e-ac4a-ba3b73383432\" (UID: \"cca10de2-6fab-417e-ac4a-ba3b73383432\") " Oct 09 08:03:11 crc kubenswrapper[4715]: I1009 08:03:11.153297 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cca10de2-6fab-417e-ac4a-ba3b73383432-kube-api-access-bfgrp" (OuterVolumeSpecName: "kube-api-access-bfgrp") pod "cca10de2-6fab-417e-ac4a-ba3b73383432" (UID: "cca10de2-6fab-417e-ac4a-ba3b73383432"). InnerVolumeSpecName "kube-api-access-bfgrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:03:11 crc kubenswrapper[4715]: I1009 08:03:11.235496 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfgrp\" (UniqueName: \"kubernetes.io/projected/cca10de2-6fab-417e-ac4a-ba3b73383432-kube-api-access-bfgrp\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:11 crc kubenswrapper[4715]: I1009 08:03:11.488344 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"69ff7954-464a-4c36-9996-772c43a2d602","Type":"ContainerStarted","Data":"35fb586ba7489bb3edb314f65a96e6e90bb701e99f8ffd1bfbf8e2928032e942"} Oct 09 08:03:11 crc kubenswrapper[4715]: I1009 08:03:11.492199 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"992626c1-d8c1-41f1-a0b2-60f72f8b182a","Type":"ContainerStarted","Data":"61de926065528a18b541048bf6382e071f0be5d0e376d4319e83cb4458cbe456"} Oct 09 08:03:11 crc kubenswrapper[4715]: I1009 08:03:11.499212 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b61f-account-create-pprts" 
event={"ID":"cca10de2-6fab-417e-ac4a-ba3b73383432","Type":"ContainerDied","Data":"253a9368f6b8d521df4510d1d0469196199d7c3d8b586edd8d3a26bee499c38a"} Oct 09 08:03:11 crc kubenswrapper[4715]: I1009 08:03:11.499243 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="253a9368f6b8d521df4510d1d0469196199d7c3d8b586edd8d3a26bee499c38a" Oct 09 08:03:11 crc kubenswrapper[4715]: I1009 08:03:11.499290 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b61f-account-create-pprts" Oct 09 08:03:11 crc kubenswrapper[4715]: I1009 08:03:11.518208 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.518188197 podStartE2EDuration="4.518188197s" podCreationTimestamp="2025-10-09 08:03:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:03:11.510642097 +0000 UTC m=+1022.203446105" watchObservedRunningTime="2025-10-09 08:03:11.518188197 +0000 UTC m=+1022.210992195" Oct 09 08:03:11 crc kubenswrapper[4715]: I1009 08:03:11.535464 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.535443111 podStartE2EDuration="5.535443111s" podCreationTimestamp="2025-10-09 08:03:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:03:11.535218564 +0000 UTC m=+1022.228022572" watchObservedRunningTime="2025-10-09 08:03:11.535443111 +0000 UTC m=+1022.228247119" Oct 09 08:03:11 crc kubenswrapper[4715]: I1009 08:03:11.838719 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 08:03:11 crc kubenswrapper[4715]: I1009 08:03:11.874866 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/horizon-65d86fc4b5-cvwlg"] Oct 09 08:03:11 crc kubenswrapper[4715]: I1009 08:03:11.889798 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:03:11 crc kubenswrapper[4715]: I1009 08:03:11.908826 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-9f4678dcc-bh6m5"] Oct 09 08:03:11 crc kubenswrapper[4715]: E1009 08:03:11.909291 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95873bd3-7c5a-468c-8477-380a7ae6717f" containerName="init" Oct 09 08:03:11 crc kubenswrapper[4715]: I1009 08:03:11.909314 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="95873bd3-7c5a-468c-8477-380a7ae6717f" containerName="init" Oct 09 08:03:11 crc kubenswrapper[4715]: E1009 08:03:11.909331 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a0480a5-826d-4e8e-95dd-856c2b1b874a" containerName="init" Oct 09 08:03:11 crc kubenswrapper[4715]: I1009 08:03:11.909340 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a0480a5-826d-4e8e-95dd-856c2b1b874a" containerName="init" Oct 09 08:03:11 crc kubenswrapper[4715]: E1009 08:03:11.909362 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cca10de2-6fab-417e-ac4a-ba3b73383432" containerName="mariadb-account-create" Oct 09 08:03:11 crc kubenswrapper[4715]: I1009 08:03:11.909371 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="cca10de2-6fab-417e-ac4a-ba3b73383432" containerName="mariadb-account-create" Oct 09 08:03:11 crc kubenswrapper[4715]: I1009 08:03:11.909611 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="cca10de2-6fab-417e-ac4a-ba3b73383432" containerName="mariadb-account-create" Oct 09 08:03:11 crc kubenswrapper[4715]: I1009 08:03:11.909638 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a0480a5-826d-4e8e-95dd-856c2b1b874a" containerName="init" Oct 09 08:03:11 crc kubenswrapper[4715]: I1009 08:03:11.909656 4715 
memory_manager.go:354] "RemoveStaleState removing state" podUID="95873bd3-7c5a-468c-8477-380a7ae6717f" containerName="init" Oct 09 08:03:11 crc kubenswrapper[4715]: I1009 08:03:11.910954 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9f4678dcc-bh6m5" Oct 09 08:03:11 crc kubenswrapper[4715]: I1009 08:03:11.914754 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-9f4678dcc-bh6m5"] Oct 09 08:03:11 crc kubenswrapper[4715]: I1009 08:03:11.950566 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 08:03:12 crc kubenswrapper[4715]: I1009 08:03:12.051760 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/347422fd-611a-4d6e-bccb-031c6f308b5f-scripts\") pod \"horizon-9f4678dcc-bh6m5\" (UID: \"347422fd-611a-4d6e-bccb-031c6f308b5f\") " pod="openstack/horizon-9f4678dcc-bh6m5" Oct 09 08:03:12 crc kubenswrapper[4715]: I1009 08:03:12.052067 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/347422fd-611a-4d6e-bccb-031c6f308b5f-logs\") pod \"horizon-9f4678dcc-bh6m5\" (UID: \"347422fd-611a-4d6e-bccb-031c6f308b5f\") " pod="openstack/horizon-9f4678dcc-bh6m5" Oct 09 08:03:12 crc kubenswrapper[4715]: I1009 08:03:12.052088 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/347422fd-611a-4d6e-bccb-031c6f308b5f-config-data\") pod \"horizon-9f4678dcc-bh6m5\" (UID: \"347422fd-611a-4d6e-bccb-031c6f308b5f\") " pod="openstack/horizon-9f4678dcc-bh6m5" Oct 09 08:03:12 crc kubenswrapper[4715]: I1009 08:03:12.052265 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlhw2\" (UniqueName: 
\"kubernetes.io/projected/347422fd-611a-4d6e-bccb-031c6f308b5f-kube-api-access-nlhw2\") pod \"horizon-9f4678dcc-bh6m5\" (UID: \"347422fd-611a-4d6e-bccb-031c6f308b5f\") " pod="openstack/horizon-9f4678dcc-bh6m5" Oct 09 08:03:12 crc kubenswrapper[4715]: I1009 08:03:12.052339 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/347422fd-611a-4d6e-bccb-031c6f308b5f-horizon-secret-key\") pod \"horizon-9f4678dcc-bh6m5\" (UID: \"347422fd-611a-4d6e-bccb-031c6f308b5f\") " pod="openstack/horizon-9f4678dcc-bh6m5" Oct 09 08:03:12 crc kubenswrapper[4715]: I1009 08:03:12.152390 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a0480a5-826d-4e8e-95dd-856c2b1b874a" path="/var/lib/kubelet/pods/9a0480a5-826d-4e8e-95dd-856c2b1b874a/volumes" Oct 09 08:03:12 crc kubenswrapper[4715]: I1009 08:03:12.153708 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/347422fd-611a-4d6e-bccb-031c6f308b5f-scripts\") pod \"horizon-9f4678dcc-bh6m5\" (UID: \"347422fd-611a-4d6e-bccb-031c6f308b5f\") " pod="openstack/horizon-9f4678dcc-bh6m5" Oct 09 08:03:12 crc kubenswrapper[4715]: I1009 08:03:12.153741 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/347422fd-611a-4d6e-bccb-031c6f308b5f-logs\") pod \"horizon-9f4678dcc-bh6m5\" (UID: \"347422fd-611a-4d6e-bccb-031c6f308b5f\") " pod="openstack/horizon-9f4678dcc-bh6m5" Oct 09 08:03:12 crc kubenswrapper[4715]: I1009 08:03:12.153766 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/347422fd-611a-4d6e-bccb-031c6f308b5f-config-data\") pod \"horizon-9f4678dcc-bh6m5\" (UID: \"347422fd-611a-4d6e-bccb-031c6f308b5f\") " pod="openstack/horizon-9f4678dcc-bh6m5" Oct 09 08:03:12 crc 
kubenswrapper[4715]: I1009 08:03:12.153795 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlhw2\" (UniqueName: \"kubernetes.io/projected/347422fd-611a-4d6e-bccb-031c6f308b5f-kube-api-access-nlhw2\") pod \"horizon-9f4678dcc-bh6m5\" (UID: \"347422fd-611a-4d6e-bccb-031c6f308b5f\") " pod="openstack/horizon-9f4678dcc-bh6m5" Oct 09 08:03:12 crc kubenswrapper[4715]: I1009 08:03:12.153818 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/347422fd-611a-4d6e-bccb-031c6f308b5f-horizon-secret-key\") pod \"horizon-9f4678dcc-bh6m5\" (UID: \"347422fd-611a-4d6e-bccb-031c6f308b5f\") " pod="openstack/horizon-9f4678dcc-bh6m5" Oct 09 08:03:12 crc kubenswrapper[4715]: I1009 08:03:12.154777 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/347422fd-611a-4d6e-bccb-031c6f308b5f-logs\") pod \"horizon-9f4678dcc-bh6m5\" (UID: \"347422fd-611a-4d6e-bccb-031c6f308b5f\") " pod="openstack/horizon-9f4678dcc-bh6m5" Oct 09 08:03:12 crc kubenswrapper[4715]: I1009 08:03:12.155072 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/347422fd-611a-4d6e-bccb-031c6f308b5f-scripts\") pod \"horizon-9f4678dcc-bh6m5\" (UID: \"347422fd-611a-4d6e-bccb-031c6f308b5f\") " pod="openstack/horizon-9f4678dcc-bh6m5" Oct 09 08:03:12 crc kubenswrapper[4715]: I1009 08:03:12.156695 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/347422fd-611a-4d6e-bccb-031c6f308b5f-config-data\") pod \"horizon-9f4678dcc-bh6m5\" (UID: \"347422fd-611a-4d6e-bccb-031c6f308b5f\") " pod="openstack/horizon-9f4678dcc-bh6m5" Oct 09 08:03:12 crc kubenswrapper[4715]: I1009 08:03:12.171791 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" 
(UniqueName: \"kubernetes.io/secret/347422fd-611a-4d6e-bccb-031c6f308b5f-horizon-secret-key\") pod \"horizon-9f4678dcc-bh6m5\" (UID: \"347422fd-611a-4d6e-bccb-031c6f308b5f\") " pod="openstack/horizon-9f4678dcc-bh6m5" Oct 09 08:03:12 crc kubenswrapper[4715]: I1009 08:03:12.185443 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlhw2\" (UniqueName: \"kubernetes.io/projected/347422fd-611a-4d6e-bccb-031c6f308b5f-kube-api-access-nlhw2\") pod \"horizon-9f4678dcc-bh6m5\" (UID: \"347422fd-611a-4d6e-bccb-031c6f308b5f\") " pod="openstack/horizon-9f4678dcc-bh6m5" Oct 09 08:03:12 crc kubenswrapper[4715]: I1009 08:03:12.232593 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9f4678dcc-bh6m5" Oct 09 08:03:12 crc kubenswrapper[4715]: I1009 08:03:12.560220 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-jgbfr"] Oct 09 08:03:12 crc kubenswrapper[4715]: I1009 08:03:12.589332 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-jgbfr"] Oct 09 08:03:12 crc kubenswrapper[4715]: I1009 08:03:12.589715 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-jgbfr" Oct 09 08:03:12 crc kubenswrapper[4715]: I1009 08:03:12.595659 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 09 08:03:12 crc kubenswrapper[4715]: I1009 08:03:12.595838 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 09 08:03:12 crc kubenswrapper[4715]: I1009 08:03:12.597460 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-25xmn" Oct 09 08:03:12 crc kubenswrapper[4715]: I1009 08:03:12.673890 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2f4dee6e-f935-4bdd-9138-d414e86c0fa2-db-sync-config-data\") pod \"cinder-db-sync-jgbfr\" (UID: \"2f4dee6e-f935-4bdd-9138-d414e86c0fa2\") " pod="openstack/cinder-db-sync-jgbfr" Oct 09 08:03:12 crc kubenswrapper[4715]: I1009 08:03:12.673975 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f4dee6e-f935-4bdd-9138-d414e86c0fa2-scripts\") pod \"cinder-db-sync-jgbfr\" (UID: \"2f4dee6e-f935-4bdd-9138-d414e86c0fa2\") " pod="openstack/cinder-db-sync-jgbfr" Oct 09 08:03:12 crc kubenswrapper[4715]: I1009 08:03:12.674027 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2f4dee6e-f935-4bdd-9138-d414e86c0fa2-etc-machine-id\") pod \"cinder-db-sync-jgbfr\" (UID: \"2f4dee6e-f935-4bdd-9138-d414e86c0fa2\") " pod="openstack/cinder-db-sync-jgbfr" Oct 09 08:03:12 crc kubenswrapper[4715]: I1009 08:03:12.674079 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2f4dee6e-f935-4bdd-9138-d414e86c0fa2-combined-ca-bundle\") pod \"cinder-db-sync-jgbfr\" (UID: \"2f4dee6e-f935-4bdd-9138-d414e86c0fa2\") " pod="openstack/cinder-db-sync-jgbfr" Oct 09 08:03:12 crc kubenswrapper[4715]: I1009 08:03:12.674176 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq68s\" (UniqueName: \"kubernetes.io/projected/2f4dee6e-f935-4bdd-9138-d414e86c0fa2-kube-api-access-kq68s\") pod \"cinder-db-sync-jgbfr\" (UID: \"2f4dee6e-f935-4bdd-9138-d414e86c0fa2\") " pod="openstack/cinder-db-sync-jgbfr" Oct 09 08:03:12 crc kubenswrapper[4715]: I1009 08:03:12.674366 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f4dee6e-f935-4bdd-9138-d414e86c0fa2-config-data\") pod \"cinder-db-sync-jgbfr\" (UID: \"2f4dee6e-f935-4bdd-9138-d414e86c0fa2\") " pod="openstack/cinder-db-sync-jgbfr" Oct 09 08:03:12 crc kubenswrapper[4715]: I1009 08:03:12.775337 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f4dee6e-f935-4bdd-9138-d414e86c0fa2-config-data\") pod \"cinder-db-sync-jgbfr\" (UID: \"2f4dee6e-f935-4bdd-9138-d414e86c0fa2\") " pod="openstack/cinder-db-sync-jgbfr" Oct 09 08:03:12 crc kubenswrapper[4715]: I1009 08:03:12.775708 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2f4dee6e-f935-4bdd-9138-d414e86c0fa2-db-sync-config-data\") pod \"cinder-db-sync-jgbfr\" (UID: \"2f4dee6e-f935-4bdd-9138-d414e86c0fa2\") " pod="openstack/cinder-db-sync-jgbfr" Oct 09 08:03:12 crc kubenswrapper[4715]: I1009 08:03:12.775759 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f4dee6e-f935-4bdd-9138-d414e86c0fa2-scripts\") pod \"cinder-db-sync-jgbfr\" 
(UID: \"2f4dee6e-f935-4bdd-9138-d414e86c0fa2\") " pod="openstack/cinder-db-sync-jgbfr" Oct 09 08:03:12 crc kubenswrapper[4715]: I1009 08:03:12.775794 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2f4dee6e-f935-4bdd-9138-d414e86c0fa2-etc-machine-id\") pod \"cinder-db-sync-jgbfr\" (UID: \"2f4dee6e-f935-4bdd-9138-d414e86c0fa2\") " pod="openstack/cinder-db-sync-jgbfr" Oct 09 08:03:12 crc kubenswrapper[4715]: I1009 08:03:12.775832 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f4dee6e-f935-4bdd-9138-d414e86c0fa2-combined-ca-bundle\") pod \"cinder-db-sync-jgbfr\" (UID: \"2f4dee6e-f935-4bdd-9138-d414e86c0fa2\") " pod="openstack/cinder-db-sync-jgbfr" Oct 09 08:03:12 crc kubenswrapper[4715]: I1009 08:03:12.775885 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kq68s\" (UniqueName: \"kubernetes.io/projected/2f4dee6e-f935-4bdd-9138-d414e86c0fa2-kube-api-access-kq68s\") pod \"cinder-db-sync-jgbfr\" (UID: \"2f4dee6e-f935-4bdd-9138-d414e86c0fa2\") " pod="openstack/cinder-db-sync-jgbfr" Oct 09 08:03:12 crc kubenswrapper[4715]: I1009 08:03:12.775912 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2f4dee6e-f935-4bdd-9138-d414e86c0fa2-etc-machine-id\") pod \"cinder-db-sync-jgbfr\" (UID: \"2f4dee6e-f935-4bdd-9138-d414e86c0fa2\") " pod="openstack/cinder-db-sync-jgbfr" Oct 09 08:03:12 crc kubenswrapper[4715]: I1009 08:03:12.781191 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f4dee6e-f935-4bdd-9138-d414e86c0fa2-scripts\") pod \"cinder-db-sync-jgbfr\" (UID: \"2f4dee6e-f935-4bdd-9138-d414e86c0fa2\") " pod="openstack/cinder-db-sync-jgbfr" Oct 09 08:03:12 crc kubenswrapper[4715]: I1009 
08:03:12.783038 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f4dee6e-f935-4bdd-9138-d414e86c0fa2-config-data\") pod \"cinder-db-sync-jgbfr\" (UID: \"2f4dee6e-f935-4bdd-9138-d414e86c0fa2\") " pod="openstack/cinder-db-sync-jgbfr" Oct 09 08:03:12 crc kubenswrapper[4715]: I1009 08:03:12.795250 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2f4dee6e-f935-4bdd-9138-d414e86c0fa2-db-sync-config-data\") pod \"cinder-db-sync-jgbfr\" (UID: \"2f4dee6e-f935-4bdd-9138-d414e86c0fa2\") " pod="openstack/cinder-db-sync-jgbfr" Oct 09 08:03:12 crc kubenswrapper[4715]: I1009 08:03:12.802761 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f4dee6e-f935-4bdd-9138-d414e86c0fa2-combined-ca-bundle\") pod \"cinder-db-sync-jgbfr\" (UID: \"2f4dee6e-f935-4bdd-9138-d414e86c0fa2\") " pod="openstack/cinder-db-sync-jgbfr" Oct 09 08:03:12 crc kubenswrapper[4715]: I1009 08:03:12.805488 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq68s\" (UniqueName: \"kubernetes.io/projected/2f4dee6e-f935-4bdd-9138-d414e86c0fa2-kube-api-access-kq68s\") pod \"cinder-db-sync-jgbfr\" (UID: \"2f4dee6e-f935-4bdd-9138-d414e86c0fa2\") " pod="openstack/cinder-db-sync-jgbfr" Oct 09 08:03:12 crc kubenswrapper[4715]: I1009 08:03:12.924145 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-jgbfr" Oct 09 08:03:13 crc kubenswrapper[4715]: I1009 08:03:13.532301 4715 generic.go:334] "Generic (PLEG): container finished" podID="2db20957-2340-4a65-ac69-ac1842a5a37a" containerID="352c27b9fcd0e70c6baf453f0729ce49aad3113cf7887641013079ce070f4d4b" exitCode=0 Oct 09 08:03:13 crc kubenswrapper[4715]: I1009 08:03:13.532405 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gggmh" event={"ID":"2db20957-2340-4a65-ac69-ac1842a5a37a","Type":"ContainerDied","Data":"352c27b9fcd0e70c6baf453f0729ce49aad3113cf7887641013079ce070f4d4b"} Oct 09 08:03:13 crc kubenswrapper[4715]: I1009 08:03:13.532566 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="69ff7954-464a-4c36-9996-772c43a2d602" containerName="glance-log" containerID="cri-o://2cb16cacd34b10edff21069039974fd961f207c7988fae6389f6f40229150eab" gracePeriod=30 Oct 09 08:03:13 crc kubenswrapper[4715]: I1009 08:03:13.532637 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="69ff7954-464a-4c36-9996-772c43a2d602" containerName="glance-httpd" containerID="cri-o://35fb586ba7489bb3edb314f65a96e6e90bb701e99f8ffd1bfbf8e2928032e942" gracePeriod=30 Oct 09 08:03:13 crc kubenswrapper[4715]: I1009 08:03:13.532661 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="992626c1-d8c1-41f1-a0b2-60f72f8b182a" containerName="glance-log" containerID="cri-o://6de129de283977ec932b22f2cb853130c125ae5397f5d8f19a4670eee77bd742" gracePeriod=30 Oct 09 08:03:13 crc kubenswrapper[4715]: I1009 08:03:13.532745 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="992626c1-d8c1-41f1-a0b2-60f72f8b182a" containerName="glance-httpd" 
containerID="cri-o://61de926065528a18b541048bf6382e071f0be5d0e376d4319e83cb4458cbe456" gracePeriod=30 Oct 09 08:03:14 crc kubenswrapper[4715]: I1009 08:03:14.558689 4715 generic.go:334] "Generic (PLEG): container finished" podID="992626c1-d8c1-41f1-a0b2-60f72f8b182a" containerID="61de926065528a18b541048bf6382e071f0be5d0e376d4319e83cb4458cbe456" exitCode=0 Oct 09 08:03:14 crc kubenswrapper[4715]: I1009 08:03:14.559048 4715 generic.go:334] "Generic (PLEG): container finished" podID="992626c1-d8c1-41f1-a0b2-60f72f8b182a" containerID="6de129de283977ec932b22f2cb853130c125ae5397f5d8f19a4670eee77bd742" exitCode=143 Oct 09 08:03:14 crc kubenswrapper[4715]: I1009 08:03:14.558747 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"992626c1-d8c1-41f1-a0b2-60f72f8b182a","Type":"ContainerDied","Data":"61de926065528a18b541048bf6382e071f0be5d0e376d4319e83cb4458cbe456"} Oct 09 08:03:14 crc kubenswrapper[4715]: I1009 08:03:14.559123 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"992626c1-d8c1-41f1-a0b2-60f72f8b182a","Type":"ContainerDied","Data":"6de129de283977ec932b22f2cb853130c125ae5397f5d8f19a4670eee77bd742"} Oct 09 08:03:14 crc kubenswrapper[4715]: I1009 08:03:14.561630 4715 generic.go:334] "Generic (PLEG): container finished" podID="69ff7954-464a-4c36-9996-772c43a2d602" containerID="35fb586ba7489bb3edb314f65a96e6e90bb701e99f8ffd1bfbf8e2928032e942" exitCode=0 Oct 09 08:03:14 crc kubenswrapper[4715]: I1009 08:03:14.561671 4715 generic.go:334] "Generic (PLEG): container finished" podID="69ff7954-464a-4c36-9996-772c43a2d602" containerID="2cb16cacd34b10edff21069039974fd961f207c7988fae6389f6f40229150eab" exitCode=143 Oct 09 08:03:14 crc kubenswrapper[4715]: I1009 08:03:14.561682 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"69ff7954-464a-4c36-9996-772c43a2d602","Type":"ContainerDied","Data":"35fb586ba7489bb3edb314f65a96e6e90bb701e99f8ffd1bfbf8e2928032e942"} Oct 09 08:03:14 crc kubenswrapper[4715]: I1009 08:03:14.561713 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"69ff7954-464a-4c36-9996-772c43a2d602","Type":"ContainerDied","Data":"2cb16cacd34b10edff21069039974fd961f207c7988fae6389f6f40229150eab"} Oct 09 08:03:16 crc kubenswrapper[4715]: I1009 08:03:16.765185 4715 patch_prober.go:28] interesting pod/machine-config-daemon-k7vwx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 08:03:16 crc kubenswrapper[4715]: I1009 08:03:16.765512 4715 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 08:03:16 crc kubenswrapper[4715]: I1009 08:03:16.765553 4715 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" Oct 09 08:03:16 crc kubenswrapper[4715]: I1009 08:03:16.766480 4715 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d09013bd08005ad32fff769feb782bd4aaed730f81b53c0815c16cfe4fba1a84"} pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 08:03:16 crc kubenswrapper[4715]: I1009 08:03:16.766530 4715 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" containerID="cri-o://d09013bd08005ad32fff769feb782bd4aaed730f81b53c0815c16cfe4fba1a84" gracePeriod=600 Oct 09 08:03:17 crc kubenswrapper[4715]: I1009 08:03:17.549615 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8b5c85b87-7fpxg" Oct 09 08:03:17 crc kubenswrapper[4715]: I1009 08:03:17.596975 4715 generic.go:334] "Generic (PLEG): container finished" podID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerID="d09013bd08005ad32fff769feb782bd4aaed730f81b53c0815c16cfe4fba1a84" exitCode=0 Oct 09 08:03:17 crc kubenswrapper[4715]: I1009 08:03:17.597024 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" event={"ID":"acafd807-8875-4b4f-aba9-4f807ca336e7","Type":"ContainerDied","Data":"d09013bd08005ad32fff769feb782bd4aaed730f81b53c0815c16cfe4fba1a84"} Oct 09 08:03:17 crc kubenswrapper[4715]: I1009 08:03:17.597058 4715 scope.go:117] "RemoveContainer" containerID="451f2195b7f62aab0dad39fca7efb143fc4db9dbd4af35f5a099bbb88635e621" Oct 09 08:03:17 crc kubenswrapper[4715]: I1009 08:03:17.626224 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-vgpch"] Oct 09 08:03:17 crc kubenswrapper[4715]: I1009 08:03:17.626503 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77585f5f8c-vgpch" podUID="5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0" containerName="dnsmasq-dns" containerID="cri-o://4be01246678b2d91fb4ffa87373d054e72bc804c1823d277247e4d3cd3b75ecb" gracePeriod=10 Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.510049 4715 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-vgpch" podUID="5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0" containerName="dnsmasq-dns" probeResult="failure" 
output="dial tcp 10.217.0.125:5353: connect: connection refused" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.536973 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6bf4d6b49-qs2mn"] Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.571651 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5d9885b95b-r2cb2"] Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.573291 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5d9885b95b-r2cb2" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.575280 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.604240 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5d9885b95b-r2cb2"] Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.617227 4715 generic.go:334] "Generic (PLEG): container finished" podID="5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0" containerID="4be01246678b2d91fb4ffa87373d054e72bc804c1823d277247e4d3cd3b75ecb" exitCode=0 Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.617270 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-vgpch" event={"ID":"5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0","Type":"ContainerDied","Data":"4be01246678b2d91fb4ffa87373d054e72bc804c1823d277247e4d3cd3b75ecb"} Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.664917 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-9f4678dcc-bh6m5"] Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.695516 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-d788d6d48-5nczq"] Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.701296 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-d788d6d48-5nczq" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.702285 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ada9982a-fc5f-4c93-bfa3-3401c0824c2e-horizon-secret-key\") pod \"horizon-5d9885b95b-r2cb2\" (UID: \"ada9982a-fc5f-4c93-bfa3-3401c0824c2e\") " pod="openstack/horizon-5d9885b95b-r2cb2" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.702357 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ada9982a-fc5f-4c93-bfa3-3401c0824c2e-combined-ca-bundle\") pod \"horizon-5d9885b95b-r2cb2\" (UID: \"ada9982a-fc5f-4c93-bfa3-3401c0824c2e\") " pod="openstack/horizon-5d9885b95b-r2cb2" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.702465 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ada9982a-fc5f-4c93-bfa3-3401c0824c2e-horizon-tls-certs\") pod \"horizon-5d9885b95b-r2cb2\" (UID: \"ada9982a-fc5f-4c93-bfa3-3401c0824c2e\") " pod="openstack/horizon-5d9885b95b-r2cb2" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.702497 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ada9982a-fc5f-4c93-bfa3-3401c0824c2e-logs\") pod \"horizon-5d9885b95b-r2cb2\" (UID: \"ada9982a-fc5f-4c93-bfa3-3401c0824c2e\") " pod="openstack/horizon-5d9885b95b-r2cb2" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.702517 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g5cn\" (UniqueName: \"kubernetes.io/projected/ada9982a-fc5f-4c93-bfa3-3401c0824c2e-kube-api-access-8g5cn\") pod \"horizon-5d9885b95b-r2cb2\" (UID: 
\"ada9982a-fc5f-4c93-bfa3-3401c0824c2e\") " pod="openstack/horizon-5d9885b95b-r2cb2" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.702538 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ada9982a-fc5f-4c93-bfa3-3401c0824c2e-scripts\") pod \"horizon-5d9885b95b-r2cb2\" (UID: \"ada9982a-fc5f-4c93-bfa3-3401c0824c2e\") " pod="openstack/horizon-5d9885b95b-r2cb2" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.702569 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ada9982a-fc5f-4c93-bfa3-3401c0824c2e-config-data\") pod \"horizon-5d9885b95b-r2cb2\" (UID: \"ada9982a-fc5f-4c93-bfa3-3401c0824c2e\") " pod="openstack/horizon-5d9885b95b-r2cb2" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.731293 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-d788d6d48-5nczq"] Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.804453 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ada9982a-fc5f-4c93-bfa3-3401c0824c2e-horizon-tls-certs\") pod \"horizon-5d9885b95b-r2cb2\" (UID: \"ada9982a-fc5f-4c93-bfa3-3401c0824c2e\") " pod="openstack/horizon-5d9885b95b-r2cb2" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.804519 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b8b0665-2ab8-4fb9-93ff-6405324f24d5-logs\") pod \"horizon-d788d6d48-5nczq\" (UID: \"7b8b0665-2ab8-4fb9-93ff-6405324f24d5\") " pod="openstack/horizon-d788d6d48-5nczq" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.804551 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ada9982a-fc5f-4c93-bfa3-3401c0824c2e-logs\") pod \"horizon-5d9885b95b-r2cb2\" (UID: \"ada9982a-fc5f-4c93-bfa3-3401c0824c2e\") " pod="openstack/horizon-5d9885b95b-r2cb2" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.804578 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g5cn\" (UniqueName: \"kubernetes.io/projected/ada9982a-fc5f-4c93-bfa3-3401c0824c2e-kube-api-access-8g5cn\") pod \"horizon-5d9885b95b-r2cb2\" (UID: \"ada9982a-fc5f-4c93-bfa3-3401c0824c2e\") " pod="openstack/horizon-5d9885b95b-r2cb2" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.804604 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ada9982a-fc5f-4c93-bfa3-3401c0824c2e-scripts\") pod \"horizon-5d9885b95b-r2cb2\" (UID: \"ada9982a-fc5f-4c93-bfa3-3401c0824c2e\") " pod="openstack/horizon-5d9885b95b-r2cb2" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.804973 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jlrc\" (UniqueName: \"kubernetes.io/projected/7b8b0665-2ab8-4fb9-93ff-6405324f24d5-kube-api-access-5jlrc\") pod \"horizon-d788d6d48-5nczq\" (UID: \"7b8b0665-2ab8-4fb9-93ff-6405324f24d5\") " pod="openstack/horizon-d788d6d48-5nczq" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.805066 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7b8b0665-2ab8-4fb9-93ff-6405324f24d5-config-data\") pod \"horizon-d788d6d48-5nczq\" (UID: \"7b8b0665-2ab8-4fb9-93ff-6405324f24d5\") " pod="openstack/horizon-d788d6d48-5nczq" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.805106 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/ada9982a-fc5f-4c93-bfa3-3401c0824c2e-config-data\") pod \"horizon-5d9885b95b-r2cb2\" (UID: \"ada9982a-fc5f-4c93-bfa3-3401c0824c2e\") " pod="openstack/horizon-5d9885b95b-r2cb2" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.805180 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ada9982a-fc5f-4c93-bfa3-3401c0824c2e-horizon-secret-key\") pod \"horizon-5d9885b95b-r2cb2\" (UID: \"ada9982a-fc5f-4c93-bfa3-3401c0824c2e\") " pod="openstack/horizon-5d9885b95b-r2cb2" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.805227 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ada9982a-fc5f-4c93-bfa3-3401c0824c2e-logs\") pod \"horizon-5d9885b95b-r2cb2\" (UID: \"ada9982a-fc5f-4c93-bfa3-3401c0824c2e\") " pod="openstack/horizon-5d9885b95b-r2cb2" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.805256 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ada9982a-fc5f-4c93-bfa3-3401c0824c2e-combined-ca-bundle\") pod \"horizon-5d9885b95b-r2cb2\" (UID: \"ada9982a-fc5f-4c93-bfa3-3401c0824c2e\") " pod="openstack/horizon-5d9885b95b-r2cb2" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.806160 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b8b0665-2ab8-4fb9-93ff-6405324f24d5-horizon-tls-certs\") pod \"horizon-d788d6d48-5nczq\" (UID: \"7b8b0665-2ab8-4fb9-93ff-6405324f24d5\") " pod="openstack/horizon-d788d6d48-5nczq" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.806194 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ada9982a-fc5f-4c93-bfa3-3401c0824c2e-config-data\") pod 
\"horizon-5d9885b95b-r2cb2\" (UID: \"ada9982a-fc5f-4c93-bfa3-3401c0824c2e\") " pod="openstack/horizon-5d9885b95b-r2cb2" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.806230 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b8b0665-2ab8-4fb9-93ff-6405324f24d5-scripts\") pod \"horizon-d788d6d48-5nczq\" (UID: \"7b8b0665-2ab8-4fb9-93ff-6405324f24d5\") " pod="openstack/horizon-d788d6d48-5nczq" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.805526 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ada9982a-fc5f-4c93-bfa3-3401c0824c2e-scripts\") pod \"horizon-5d9885b95b-r2cb2\" (UID: \"ada9982a-fc5f-4c93-bfa3-3401c0824c2e\") " pod="openstack/horizon-5d9885b95b-r2cb2" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.806305 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b8b0665-2ab8-4fb9-93ff-6405324f24d5-combined-ca-bundle\") pod \"horizon-d788d6d48-5nczq\" (UID: \"7b8b0665-2ab8-4fb9-93ff-6405324f24d5\") " pod="openstack/horizon-d788d6d48-5nczq" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.806333 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7b8b0665-2ab8-4fb9-93ff-6405324f24d5-horizon-secret-key\") pod \"horizon-d788d6d48-5nczq\" (UID: \"7b8b0665-2ab8-4fb9-93ff-6405324f24d5\") " pod="openstack/horizon-d788d6d48-5nczq" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.812252 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ada9982a-fc5f-4c93-bfa3-3401c0824c2e-combined-ca-bundle\") pod \"horizon-5d9885b95b-r2cb2\" (UID: 
\"ada9982a-fc5f-4c93-bfa3-3401c0824c2e\") " pod="openstack/horizon-5d9885b95b-r2cb2" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.812444 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ada9982a-fc5f-4c93-bfa3-3401c0824c2e-horizon-secret-key\") pod \"horizon-5d9885b95b-r2cb2\" (UID: \"ada9982a-fc5f-4c93-bfa3-3401c0824c2e\") " pod="openstack/horizon-5d9885b95b-r2cb2" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.812584 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ada9982a-fc5f-4c93-bfa3-3401c0824c2e-horizon-tls-certs\") pod \"horizon-5d9885b95b-r2cb2\" (UID: \"ada9982a-fc5f-4c93-bfa3-3401c0824c2e\") " pod="openstack/horizon-5d9885b95b-r2cb2" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.824554 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g5cn\" (UniqueName: \"kubernetes.io/projected/ada9982a-fc5f-4c93-bfa3-3401c0824c2e-kube-api-access-8g5cn\") pod \"horizon-5d9885b95b-r2cb2\" (UID: \"ada9982a-fc5f-4c93-bfa3-3401c0824c2e\") " pod="openstack/horizon-5d9885b95b-r2cb2" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.896321 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5d9885b95b-r2cb2" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.908268 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b8b0665-2ab8-4fb9-93ff-6405324f24d5-logs\") pod \"horizon-d788d6d48-5nczq\" (UID: \"7b8b0665-2ab8-4fb9-93ff-6405324f24d5\") " pod="openstack/horizon-d788d6d48-5nczq" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.908345 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jlrc\" (UniqueName: \"kubernetes.io/projected/7b8b0665-2ab8-4fb9-93ff-6405324f24d5-kube-api-access-5jlrc\") pod \"horizon-d788d6d48-5nczq\" (UID: \"7b8b0665-2ab8-4fb9-93ff-6405324f24d5\") " pod="openstack/horizon-d788d6d48-5nczq" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.908387 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7b8b0665-2ab8-4fb9-93ff-6405324f24d5-config-data\") pod \"horizon-d788d6d48-5nczq\" (UID: \"7b8b0665-2ab8-4fb9-93ff-6405324f24d5\") " pod="openstack/horizon-d788d6d48-5nczq" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.908503 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b8b0665-2ab8-4fb9-93ff-6405324f24d5-horizon-tls-certs\") pod \"horizon-d788d6d48-5nczq\" (UID: \"7b8b0665-2ab8-4fb9-93ff-6405324f24d5\") " pod="openstack/horizon-d788d6d48-5nczq" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.908601 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b8b0665-2ab8-4fb9-93ff-6405324f24d5-scripts\") pod \"horizon-d788d6d48-5nczq\" (UID: \"7b8b0665-2ab8-4fb9-93ff-6405324f24d5\") " pod="openstack/horizon-d788d6d48-5nczq" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 
08:03:18.908640 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b8b0665-2ab8-4fb9-93ff-6405324f24d5-combined-ca-bundle\") pod \"horizon-d788d6d48-5nczq\" (UID: \"7b8b0665-2ab8-4fb9-93ff-6405324f24d5\") " pod="openstack/horizon-d788d6d48-5nczq" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.908663 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7b8b0665-2ab8-4fb9-93ff-6405324f24d5-horizon-secret-key\") pod \"horizon-d788d6d48-5nczq\" (UID: \"7b8b0665-2ab8-4fb9-93ff-6405324f24d5\") " pod="openstack/horizon-d788d6d48-5nczq" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.909319 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b8b0665-2ab8-4fb9-93ff-6405324f24d5-logs\") pod \"horizon-d788d6d48-5nczq\" (UID: \"7b8b0665-2ab8-4fb9-93ff-6405324f24d5\") " pod="openstack/horizon-d788d6d48-5nczq" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.910923 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7b8b0665-2ab8-4fb9-93ff-6405324f24d5-config-data\") pod \"horizon-d788d6d48-5nczq\" (UID: \"7b8b0665-2ab8-4fb9-93ff-6405324f24d5\") " pod="openstack/horizon-d788d6d48-5nczq" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.911956 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b8b0665-2ab8-4fb9-93ff-6405324f24d5-scripts\") pod \"horizon-d788d6d48-5nczq\" (UID: \"7b8b0665-2ab8-4fb9-93ff-6405324f24d5\") " pod="openstack/horizon-d788d6d48-5nczq" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.912727 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7b8b0665-2ab8-4fb9-93ff-6405324f24d5-combined-ca-bundle\") pod \"horizon-d788d6d48-5nczq\" (UID: \"7b8b0665-2ab8-4fb9-93ff-6405324f24d5\") " pod="openstack/horizon-d788d6d48-5nczq" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.913283 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7b8b0665-2ab8-4fb9-93ff-6405324f24d5-horizon-secret-key\") pod \"horizon-d788d6d48-5nczq\" (UID: \"7b8b0665-2ab8-4fb9-93ff-6405324f24d5\") " pod="openstack/horizon-d788d6d48-5nczq" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.919389 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b8b0665-2ab8-4fb9-93ff-6405324f24d5-horizon-tls-certs\") pod \"horizon-d788d6d48-5nczq\" (UID: \"7b8b0665-2ab8-4fb9-93ff-6405324f24d5\") " pod="openstack/horizon-d788d6d48-5nczq" Oct 09 08:03:18 crc kubenswrapper[4715]: I1009 08:03:18.933206 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jlrc\" (UniqueName: \"kubernetes.io/projected/7b8b0665-2ab8-4fb9-93ff-6405324f24d5-kube-api-access-5jlrc\") pod \"horizon-d788d6d48-5nczq\" (UID: \"7b8b0665-2ab8-4fb9-93ff-6405324f24d5\") " pod="openstack/horizon-d788d6d48-5nczq" Oct 09 08:03:19 crc kubenswrapper[4715]: I1009 08:03:19.025600 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-d788d6d48-5nczq" Oct 09 08:03:23 crc kubenswrapper[4715]: I1009 08:03:23.509668 4715 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-vgpch" podUID="5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: connect: connection refused" Oct 09 08:03:27 crc kubenswrapper[4715]: E1009 08:03:27.916675 4715 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Oct 09 08:03:27 crc kubenswrapper[4715]: E1009 08:03:27.917870 4715 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n55fh5bfh559h59bh8ch596h5f7h596h5fch585h676h565hdch556hf7hdfh87h6h597h545h5bdh547h544h5fdh699h5f8h655h6bh5c9h669h675h7q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6lf7g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountP
ropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-65d86fc4b5-cvwlg_openstack(2da43a5b-210e-4eeb-bba3-8ee808fbac44): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 09 08:03:27 crc kubenswrapper[4715]: E1009 08:03:27.921550 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-65d86fc4b5-cvwlg" podUID="2da43a5b-210e-4eeb-bba3-8ee808fbac44" Oct 09 08:03:28 crc kubenswrapper[4715]: I1009 08:03:28.509292 4715 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-vgpch" podUID="5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: connect: connection refused" Oct 09 08:03:28 crc kubenswrapper[4715]: I1009 08:03:28.509493 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77585f5f8c-vgpch" Oct 09 08:03:29 crc kubenswrapper[4715]: E1009 08:03:29.363583 4715 log.go:32] "PullImage 
from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Oct 09 08:03:29 crc kubenswrapper[4715]: E1009 08:03:29.364000 4715 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nkwgr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,
SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-4tnj8_openstack(80ba490d-aaff-4579-bc8a-ffaa4924c7b7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 09 08:03:29 crc kubenswrapper[4715]: E1009 08:03:29.365295 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-4tnj8" podUID="80ba490d-aaff-4579-bc8a-ffaa4924c7b7" Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.509663 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-36e0-account-create-4qxbj" Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.526970 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-841e-account-create-vcldw" Oct 09 08:03:29 crc kubenswrapper[4715]: E1009 08:03:29.533861 4715 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Oct 09 08:03:29 crc kubenswrapper[4715]: E1009 08:03:29.534005 4715 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nddh64h87h5b9h85h677h667h585h668h698h698h548h66dh75h595hb7h57ch588hc9h59fh674hfbh79h58dh5f6h87hb6h56hf9h546h88h575q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8kb47,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot
:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-6bf4d6b49-qs2mn_openstack(ea235d88-0fcd-44e4-ae70-59936a38c6a8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 09 08:03:29 crc kubenswrapper[4715]: E1009 08:03:29.543664 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-6bf4d6b49-qs2mn" podUID="ea235d88-0fcd-44e4-ae70-59936a38c6a8" Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.547571 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gggmh" Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.585494 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-65d86fc4b5-cvwlg" Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.606949 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2da43a5b-210e-4eeb-bba3-8ee808fbac44-scripts\") pod \"2da43a5b-210e-4eeb-bba3-8ee808fbac44\" (UID: \"2da43a5b-210e-4eeb-bba3-8ee808fbac44\") " Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.606995 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lf7g\" (UniqueName: \"kubernetes.io/projected/2da43a5b-210e-4eeb-bba3-8ee808fbac44-kube-api-access-6lf7g\") pod \"2da43a5b-210e-4eeb-bba3-8ee808fbac44\" (UID: \"2da43a5b-210e-4eeb-bba3-8ee808fbac44\") " Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.607038 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6mcl\" (UniqueName: \"kubernetes.io/projected/3fd00c24-58a0-4107-a811-6d67d3156f68-kube-api-access-n6mcl\") pod \"3fd00c24-58a0-4107-a811-6d67d3156f68\" (UID: \"3fd00c24-58a0-4107-a811-6d67d3156f68\") " Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.607064 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2db20957-2340-4a65-ac69-ac1842a5a37a-scripts\") pod \"2db20957-2340-4a65-ac69-ac1842a5a37a\" (UID: \"2db20957-2340-4a65-ac69-ac1842a5a37a\") " Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.607094 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2da43a5b-210e-4eeb-bba3-8ee808fbac44-horizon-secret-key\") pod \"2da43a5b-210e-4eeb-bba3-8ee808fbac44\" (UID: \"2da43a5b-210e-4eeb-bba3-8ee808fbac44\") " Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.607126 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/2db20957-2340-4a65-ac69-ac1842a5a37a-credential-keys\") pod \"2db20957-2340-4a65-ac69-ac1842a5a37a\" (UID: \"2db20957-2340-4a65-ac69-ac1842a5a37a\") " Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.607152 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwz6k\" (UniqueName: \"kubernetes.io/projected/a49e0065-af08-4379-b1e9-ac8998d2e98b-kube-api-access-kwz6k\") pod \"a49e0065-af08-4379-b1e9-ac8998d2e98b\" (UID: \"a49e0065-af08-4379-b1e9-ac8998d2e98b\") " Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.607207 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpht5\" (UniqueName: \"kubernetes.io/projected/2db20957-2340-4a65-ac69-ac1842a5a37a-kube-api-access-jpht5\") pod \"2db20957-2340-4a65-ac69-ac1842a5a37a\" (UID: \"2db20957-2340-4a65-ac69-ac1842a5a37a\") " Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.609416 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2db20957-2340-4a65-ac69-ac1842a5a37a-fernet-keys\") pod \"2db20957-2340-4a65-ac69-ac1842a5a37a\" (UID: \"2db20957-2340-4a65-ac69-ac1842a5a37a\") " Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.609566 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2da43a5b-210e-4eeb-bba3-8ee808fbac44-logs\") pod \"2da43a5b-210e-4eeb-bba3-8ee808fbac44\" (UID: \"2da43a5b-210e-4eeb-bba3-8ee808fbac44\") " Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.609626 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2db20957-2340-4a65-ac69-ac1842a5a37a-config-data\") pod \"2db20957-2340-4a65-ac69-ac1842a5a37a\" (UID: \"2db20957-2340-4a65-ac69-ac1842a5a37a\") " Oct 09 08:03:29 crc 
kubenswrapper[4715]: I1009 08:03:29.609681 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2da43a5b-210e-4eeb-bba3-8ee808fbac44-config-data\") pod \"2da43a5b-210e-4eeb-bba3-8ee808fbac44\" (UID: \"2da43a5b-210e-4eeb-bba3-8ee808fbac44\") " Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.609739 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2db20957-2340-4a65-ac69-ac1842a5a37a-combined-ca-bundle\") pod \"2db20957-2340-4a65-ac69-ac1842a5a37a\" (UID: \"2db20957-2340-4a65-ac69-ac1842a5a37a\") " Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.610256 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2da43a5b-210e-4eeb-bba3-8ee808fbac44-scripts" (OuterVolumeSpecName: "scripts") pod "2da43a5b-210e-4eeb-bba3-8ee808fbac44" (UID: "2da43a5b-210e-4eeb-bba3-8ee808fbac44"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.610649 4715 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2da43a5b-210e-4eeb-bba3-8ee808fbac44-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.616089 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2da43a5b-210e-4eeb-bba3-8ee808fbac44-logs" (OuterVolumeSpecName: "logs") pod "2da43a5b-210e-4eeb-bba3-8ee808fbac44" (UID: "2da43a5b-210e-4eeb-bba3-8ee808fbac44"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.616123 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2da43a5b-210e-4eeb-bba3-8ee808fbac44-config-data" (OuterVolumeSpecName: "config-data") pod "2da43a5b-210e-4eeb-bba3-8ee808fbac44" (UID: "2da43a5b-210e-4eeb-bba3-8ee808fbac44"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.618621 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2db20957-2340-4a65-ac69-ac1842a5a37a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2db20957-2340-4a65-ac69-ac1842a5a37a" (UID: "2db20957-2340-4a65-ac69-ac1842a5a37a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.618838 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fd00c24-58a0-4107-a811-6d67d3156f68-kube-api-access-n6mcl" (OuterVolumeSpecName: "kube-api-access-n6mcl") pod "3fd00c24-58a0-4107-a811-6d67d3156f68" (UID: "3fd00c24-58a0-4107-a811-6d67d3156f68"). InnerVolumeSpecName "kube-api-access-n6mcl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.621198 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2db20957-2340-4a65-ac69-ac1842a5a37a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "2db20957-2340-4a65-ac69-ac1842a5a37a" (UID: "2db20957-2340-4a65-ac69-ac1842a5a37a"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.622241 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2da43a5b-210e-4eeb-bba3-8ee808fbac44-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "2da43a5b-210e-4eeb-bba3-8ee808fbac44" (UID: "2da43a5b-210e-4eeb-bba3-8ee808fbac44"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.624873 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2db20957-2340-4a65-ac69-ac1842a5a37a-kube-api-access-jpht5" (OuterVolumeSpecName: "kube-api-access-jpht5") pod "2db20957-2340-4a65-ac69-ac1842a5a37a" (UID: "2db20957-2340-4a65-ac69-ac1842a5a37a"). InnerVolumeSpecName "kube-api-access-jpht5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.625931 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2da43a5b-210e-4eeb-bba3-8ee808fbac44-kube-api-access-6lf7g" (OuterVolumeSpecName: "kube-api-access-6lf7g") pod "2da43a5b-210e-4eeb-bba3-8ee808fbac44" (UID: "2da43a5b-210e-4eeb-bba3-8ee808fbac44"). InnerVolumeSpecName "kube-api-access-6lf7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.643231 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2db20957-2340-4a65-ac69-ac1842a5a37a-scripts" (OuterVolumeSpecName: "scripts") pod "2db20957-2340-4a65-ac69-ac1842a5a37a" (UID: "2db20957-2340-4a65-ac69-ac1842a5a37a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.646179 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a49e0065-af08-4379-b1e9-ac8998d2e98b-kube-api-access-kwz6k" (OuterVolumeSpecName: "kube-api-access-kwz6k") pod "a49e0065-af08-4379-b1e9-ac8998d2e98b" (UID: "a49e0065-af08-4379-b1e9-ac8998d2e98b"). InnerVolumeSpecName "kube-api-access-kwz6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.676089 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2db20957-2340-4a65-ac69-ac1842a5a37a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2db20957-2340-4a65-ac69-ac1842a5a37a" (UID: "2db20957-2340-4a65-ac69-ac1842a5a37a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.676183 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2db20957-2340-4a65-ac69-ac1842a5a37a-config-data" (OuterVolumeSpecName: "config-data") pod "2db20957-2340-4a65-ac69-ac1842a5a37a" (UID: "2db20957-2340-4a65-ac69-ac1842a5a37a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.711285 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lf7g\" (UniqueName: \"kubernetes.io/projected/2da43a5b-210e-4eeb-bba3-8ee808fbac44-kube-api-access-6lf7g\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.711314 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6mcl\" (UniqueName: \"kubernetes.io/projected/3fd00c24-58a0-4107-a811-6d67d3156f68-kube-api-access-n6mcl\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.711323 4715 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2db20957-2340-4a65-ac69-ac1842a5a37a-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.711332 4715 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2da43a5b-210e-4eeb-bba3-8ee808fbac44-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.711340 4715 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2db20957-2340-4a65-ac69-ac1842a5a37a-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.711348 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwz6k\" (UniqueName: \"kubernetes.io/projected/a49e0065-af08-4379-b1e9-ac8998d2e98b-kube-api-access-kwz6k\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.711356 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpht5\" (UniqueName: \"kubernetes.io/projected/2db20957-2340-4a65-ac69-ac1842a5a37a-kube-api-access-jpht5\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:29 crc 
kubenswrapper[4715]: I1009 08:03:29.711364 4715 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2db20957-2340-4a65-ac69-ac1842a5a37a-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.711371 4715 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2da43a5b-210e-4eeb-bba3-8ee808fbac44-logs\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.711379 4715 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2db20957-2340-4a65-ac69-ac1842a5a37a-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.711387 4715 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2da43a5b-210e-4eeb-bba3-8ee808fbac44-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.711394 4715 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2db20957-2340-4a65-ac69-ac1842a5a37a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.723200 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-65d86fc4b5-cvwlg" event={"ID":"2da43a5b-210e-4eeb-bba3-8ee808fbac44","Type":"ContainerDied","Data":"009811246ab28228a46b59b6441e72fdaefb607befafd351bb1eb00311fc15c4"} Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.723284 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-65d86fc4b5-cvwlg" Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.750175 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" event={"ID":"acafd807-8875-4b4f-aba9-4f807ca336e7","Type":"ContainerStarted","Data":"d50fe5031fb9148bf6f3fa221f298d39e3a132b33b247b66e3d4fb59f3f0c771"} Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.768363 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-36e0-account-create-4qxbj" Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.768398 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-36e0-account-create-4qxbj" event={"ID":"3fd00c24-58a0-4107-a811-6d67d3156f68","Type":"ContainerDied","Data":"a819a5cf47f97299618c55469adc400a47b20cb481c1a02d218178e4d41d5a5e"} Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.768880 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a819a5cf47f97299618c55469adc400a47b20cb481c1a02d218178e4d41d5a5e" Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.780605 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-gggmh" Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.780653 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gggmh" event={"ID":"2db20957-2340-4a65-ac69-ac1842a5a37a","Type":"ContainerDied","Data":"bc66e4c5aa4986d1e1ffccb99f00df99f83757d6a826bda2dba1666f38deab31"} Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.780802 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc66e4c5aa4986d1e1ffccb99f00df99f83757d6a826bda2dba1666f38deab31" Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.804736 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-65d86fc4b5-cvwlg"] Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.807839 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad","Type":"ContainerStarted","Data":"00d85478a6b43d631daeadc593fcad13de7b583f5346a00bccae21ba3746ee0a"} Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.813004 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-841e-account-create-vcldw" Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.813745 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-841e-account-create-vcldw" event={"ID":"a49e0065-af08-4379-b1e9-ac8998d2e98b","Type":"ContainerDied","Data":"21719a6ff9b895c8c809a283b3ea481895ce0d4d2ee20fad2b532b8d7230aa9c"} Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.813900 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21719a6ff9b895c8c809a283b3ea481895ce0d4d2ee20fad2b532b8d7230aa9c" Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.814004 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-65d86fc4b5-cvwlg"] Oct 09 08:03:29 crc kubenswrapper[4715]: E1009 08:03:29.835715 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-4tnj8" podUID="80ba490d-aaff-4579-bc8a-ffaa4924c7b7" Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.851098 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-vgpch" Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.969615 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-d788d6d48-5nczq"] Oct 09 08:03:29 crc kubenswrapper[4715]: I1009 08:03:29.973696 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.015087 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ggh9\" (UniqueName: \"kubernetes.io/projected/5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0-kube-api-access-2ggh9\") pod \"5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0\" (UID: \"5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0\") " Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.015155 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0-dns-svc\") pod \"5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0\" (UID: \"5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0\") " Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.015203 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0-dns-swift-storage-0\") pod \"5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0\" (UID: \"5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0\") " Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.015268 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0-config\") pod \"5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0\" (UID: \"5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0\") " Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.015335 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0-ovsdbserver-sb\") pod \"5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0\" (UID: \"5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0\") " Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.015556 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0-ovsdbserver-nb\") pod \"5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0\" (UID: \"5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0\") " Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.015598 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/69ff7954-464a-4c36-9996-772c43a2d602-httpd-run\") pod \"69ff7954-464a-4c36-9996-772c43a2d602\" (UID: \"69ff7954-464a-4c36-9996-772c43a2d602\") " Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.016292 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69ff7954-464a-4c36-9996-772c43a2d602-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "69ff7954-464a-4c36-9996-772c43a2d602" (UID: "69ff7954-464a-4c36-9996-772c43a2d602"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.021866 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0-kube-api-access-2ggh9" (OuterVolumeSpecName: "kube-api-access-2ggh9") pod "5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0" (UID: "5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0"). InnerVolumeSpecName "kube-api-access-2ggh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.086943 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0" (UID: "5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.088285 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0-config" (OuterVolumeSpecName: "config") pod "5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0" (UID: "5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.088972 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0" (UID: "5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.100550 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0" (UID: "5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.112064 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0" (UID: "5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.116698 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69ff7954-464a-4c36-9996-772c43a2d602-config-data\") pod \"69ff7954-464a-4c36-9996-772c43a2d602\" (UID: \"69ff7954-464a-4c36-9996-772c43a2d602\") " Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.116803 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pns98\" (UniqueName: \"kubernetes.io/projected/69ff7954-464a-4c36-9996-772c43a2d602-kube-api-access-pns98\") pod \"69ff7954-464a-4c36-9996-772c43a2d602\" (UID: \"69ff7954-464a-4c36-9996-772c43a2d602\") " Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.116840 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"69ff7954-464a-4c36-9996-772c43a2d602\" (UID: \"69ff7954-464a-4c36-9996-772c43a2d602\") " Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.116905 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ff7954-464a-4c36-9996-772c43a2d602-combined-ca-bundle\") pod \"69ff7954-464a-4c36-9996-772c43a2d602\" (UID: \"69ff7954-464a-4c36-9996-772c43a2d602\") " Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.116964 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69ff7954-464a-4c36-9996-772c43a2d602-scripts\") pod \"69ff7954-464a-4c36-9996-772c43a2d602\" (UID: \"69ff7954-464a-4c36-9996-772c43a2d602\") " Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.117015 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/69ff7954-464a-4c36-9996-772c43a2d602-logs\") pod \"69ff7954-464a-4c36-9996-772c43a2d602\" (UID: \"69ff7954-464a-4c36-9996-772c43a2d602\") " Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.117776 4715 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.117801 4715 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/69ff7954-464a-4c36-9996-772c43a2d602-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.117815 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ggh9\" (UniqueName: \"kubernetes.io/projected/5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0-kube-api-access-2ggh9\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.117855 4715 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.117868 4715 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.117880 4715 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0-config\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.117893 4715 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0-ovsdbserver-sb\") on 
node \"crc\" DevicePath \"\"" Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.118298 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69ff7954-464a-4c36-9996-772c43a2d602-logs" (OuterVolumeSpecName: "logs") pod "69ff7954-464a-4c36-9996-772c43a2d602" (UID: "69ff7954-464a-4c36-9996-772c43a2d602"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.121343 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "69ff7954-464a-4c36-9996-772c43a2d602" (UID: "69ff7954-464a-4c36-9996-772c43a2d602"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.127454 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69ff7954-464a-4c36-9996-772c43a2d602-kube-api-access-pns98" (OuterVolumeSpecName: "kube-api-access-pns98") pod "69ff7954-464a-4c36-9996-772c43a2d602" (UID: "69ff7954-464a-4c36-9996-772c43a2d602"). InnerVolumeSpecName "kube-api-access-pns98". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.131050 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69ff7954-464a-4c36-9996-772c43a2d602-scripts" (OuterVolumeSpecName: "scripts") pod "69ff7954-464a-4c36-9996-772c43a2d602" (UID: "69ff7954-464a-4c36-9996-772c43a2d602"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.157243 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2da43a5b-210e-4eeb-bba3-8ee808fbac44" path="/var/lib/kubelet/pods/2da43a5b-210e-4eeb-bba3-8ee808fbac44/volumes" Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.159795 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69ff7954-464a-4c36-9996-772c43a2d602-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69ff7954-464a-4c36-9996-772c43a2d602" (UID: "69ff7954-464a-4c36-9996-772c43a2d602"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.168596 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-9f4678dcc-bh6m5"] Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.197578 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5d9885b95b-r2cb2"] Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.219912 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pns98\" (UniqueName: \"kubernetes.io/projected/69ff7954-464a-4c36-9996-772c43a2d602-kube-api-access-pns98\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.219948 4715 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.219960 4715 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ff7954-464a-4c36-9996-772c43a2d602-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.219969 4715 reconciler_common.go:293] "Volume detached for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/69ff7954-464a-4c36-9996-772c43a2d602-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.219980 4715 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69ff7954-464a-4c36-9996-772c43a2d602-logs\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.225527 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6bf4d6b49-qs2mn" Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.230836 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69ff7954-464a-4c36-9996-772c43a2d602-config-data" (OuterVolumeSpecName: "config-data") pod "69ff7954-464a-4c36-9996-772c43a2d602" (UID: "69ff7954-464a-4c36-9996-772c43a2d602"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.238593 4715 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.309884 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-jgbfr"] Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.320488 4715 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69ff7954-464a-4c36-9996-772c43a2d602-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.320511 4715 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.405565 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.426453 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqvz5\" (UniqueName: \"kubernetes.io/projected/992626c1-d8c1-41f1-a0b2-60f72f8b182a-kube-api-access-qqvz5\") pod \"992626c1-d8c1-41f1-a0b2-60f72f8b182a\" (UID: \"992626c1-d8c1-41f1-a0b2-60f72f8b182a\") " Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.426528 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kb47\" (UniqueName: \"kubernetes.io/projected/ea235d88-0fcd-44e4-ae70-59936a38c6a8-kube-api-access-8kb47\") pod \"ea235d88-0fcd-44e4-ae70-59936a38c6a8\" (UID: \"ea235d88-0fcd-44e4-ae70-59936a38c6a8\") " Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.426589 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/992626c1-d8c1-41f1-a0b2-60f72f8b182a-scripts\") pod \"992626c1-d8c1-41f1-a0b2-60f72f8b182a\" (UID: \"992626c1-d8c1-41f1-a0b2-60f72f8b182a\") " Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.426673 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ea235d88-0fcd-44e4-ae70-59936a38c6a8-config-data\") pod \"ea235d88-0fcd-44e4-ae70-59936a38c6a8\" (UID: \"ea235d88-0fcd-44e4-ae70-59936a38c6a8\") " Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.426747 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea235d88-0fcd-44e4-ae70-59936a38c6a8-scripts\") pod \"ea235d88-0fcd-44e4-ae70-59936a38c6a8\" (UID: \"ea235d88-0fcd-44e4-ae70-59936a38c6a8\") " Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.426802 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ea235d88-0fcd-44e4-ae70-59936a38c6a8-horizon-secret-key\") pod \"ea235d88-0fcd-44e4-ae70-59936a38c6a8\" (UID: \"ea235d88-0fcd-44e4-ae70-59936a38c6a8\") " Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.426836 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea235d88-0fcd-44e4-ae70-59936a38c6a8-logs\") pod \"ea235d88-0fcd-44e4-ae70-59936a38c6a8\" (UID: \"ea235d88-0fcd-44e4-ae70-59936a38c6a8\") " Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.426859 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/992626c1-d8c1-41f1-a0b2-60f72f8b182a-config-data\") pod \"992626c1-d8c1-41f1-a0b2-60f72f8b182a\" (UID: \"992626c1-d8c1-41f1-a0b2-60f72f8b182a\") " Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.426909 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"992626c1-d8c1-41f1-a0b2-60f72f8b182a\" (UID: \"992626c1-d8c1-41f1-a0b2-60f72f8b182a\") " Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.427139 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/992626c1-d8c1-41f1-a0b2-60f72f8b182a-httpd-run\") pod \"992626c1-d8c1-41f1-a0b2-60f72f8b182a\" (UID: \"992626c1-d8c1-41f1-a0b2-60f72f8b182a\") " Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.427196 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/992626c1-d8c1-41f1-a0b2-60f72f8b182a-logs\") pod \"992626c1-d8c1-41f1-a0b2-60f72f8b182a\" (UID: \"992626c1-d8c1-41f1-a0b2-60f72f8b182a\") " Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.427227 4715 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/992626c1-d8c1-41f1-a0b2-60f72f8b182a-combined-ca-bundle\") pod \"992626c1-d8c1-41f1-a0b2-60f72f8b182a\" (UID: \"992626c1-d8c1-41f1-a0b2-60f72f8b182a\") "
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.427528 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea235d88-0fcd-44e4-ae70-59936a38c6a8-scripts" (OuterVolumeSpecName: "scripts") pod "ea235d88-0fcd-44e4-ae70-59936a38c6a8" (UID: "ea235d88-0fcd-44e4-ae70-59936a38c6a8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.427772 4715 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea235d88-0fcd-44e4-ae70-59936a38c6a8-scripts\") on node \"crc\" DevicePath \"\""
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.429682 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/992626c1-d8c1-41f1-a0b2-60f72f8b182a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "992626c1-d8c1-41f1-a0b2-60f72f8b182a" (UID: "992626c1-d8c1-41f1-a0b2-60f72f8b182a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.429848 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea235d88-0fcd-44e4-ae70-59936a38c6a8-config-data" (OuterVolumeSpecName: "config-data") pod "ea235d88-0fcd-44e4-ae70-59936a38c6a8" (UID: "ea235d88-0fcd-44e4-ae70-59936a38c6a8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.430808 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/992626c1-d8c1-41f1-a0b2-60f72f8b182a-logs" (OuterVolumeSpecName: "logs") pod "992626c1-d8c1-41f1-a0b2-60f72f8b182a" (UID: "992626c1-d8c1-41f1-a0b2-60f72f8b182a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.433052 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/992626c1-d8c1-41f1-a0b2-60f72f8b182a-scripts" (OuterVolumeSpecName: "scripts") pod "992626c1-d8c1-41f1-a0b2-60f72f8b182a" (UID: "992626c1-d8c1-41f1-a0b2-60f72f8b182a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.433603 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea235d88-0fcd-44e4-ae70-59936a38c6a8-logs" (OuterVolumeSpecName: "logs") pod "ea235d88-0fcd-44e4-ae70-59936a38c6a8" (UID: "ea235d88-0fcd-44e4-ae70-59936a38c6a8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.434845 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea235d88-0fcd-44e4-ae70-59936a38c6a8-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "ea235d88-0fcd-44e4-ae70-59936a38c6a8" (UID: "ea235d88-0fcd-44e4-ae70-59936a38c6a8"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.436028 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/992626c1-d8c1-41f1-a0b2-60f72f8b182a-kube-api-access-qqvz5" (OuterVolumeSpecName: "kube-api-access-qqvz5") pod "992626c1-d8c1-41f1-a0b2-60f72f8b182a" (UID: "992626c1-d8c1-41f1-a0b2-60f72f8b182a"). InnerVolumeSpecName "kube-api-access-qqvz5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.437359 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea235d88-0fcd-44e4-ae70-59936a38c6a8-kube-api-access-8kb47" (OuterVolumeSpecName: "kube-api-access-8kb47") pod "ea235d88-0fcd-44e4-ae70-59936a38c6a8" (UID: "ea235d88-0fcd-44e4-ae70-59936a38c6a8"). InnerVolumeSpecName "kube-api-access-8kb47". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.445040 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "992626c1-d8c1-41f1-a0b2-60f72f8b182a" (UID: "992626c1-d8c1-41f1-a0b2-60f72f8b182a"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.454976 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/992626c1-d8c1-41f1-a0b2-60f72f8b182a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "992626c1-d8c1-41f1-a0b2-60f72f8b182a" (UID: "992626c1-d8c1-41f1-a0b2-60f72f8b182a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.493219 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/992626c1-d8c1-41f1-a0b2-60f72f8b182a-config-data" (OuterVolumeSpecName: "config-data") pod "992626c1-d8c1-41f1-a0b2-60f72f8b182a" (UID: "992626c1-d8c1-41f1-a0b2-60f72f8b182a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.538793 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqvz5\" (UniqueName: \"kubernetes.io/projected/992626c1-d8c1-41f1-a0b2-60f72f8b182a-kube-api-access-qqvz5\") on node \"crc\" DevicePath \"\""
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.538840 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kb47\" (UniqueName: \"kubernetes.io/projected/ea235d88-0fcd-44e4-ae70-59936a38c6a8-kube-api-access-8kb47\") on node \"crc\" DevicePath \"\""
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.538853 4715 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/992626c1-d8c1-41f1-a0b2-60f72f8b182a-scripts\") on node \"crc\" DevicePath \"\""
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.538864 4715 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ea235d88-0fcd-44e4-ae70-59936a38c6a8-config-data\") on node \"crc\" DevicePath \"\""
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.538876 4715 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ea235d88-0fcd-44e4-ae70-59936a38c6a8-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.538887 4715 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea235d88-0fcd-44e4-ae70-59936a38c6a8-logs\") on node \"crc\" DevicePath \"\""
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.538898 4715 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/992626c1-d8c1-41f1-a0b2-60f72f8b182a-config-data\") on node \"crc\" DevicePath \"\""
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.538939 4715 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" "
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.538951 4715 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/992626c1-d8c1-41f1-a0b2-60f72f8b182a-httpd-run\") on node \"crc\" DevicePath \"\""
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.538962 4715 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/992626c1-d8c1-41f1-a0b2-60f72f8b182a-logs\") on node \"crc\" DevicePath \"\""
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.538974 4715 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/992626c1-d8c1-41f1-a0b2-60f72f8b182a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.560907 4715 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.640393 4715 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\""
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.640545 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-gggmh"]
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.646066 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-gggmh"]
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.737184 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-nf22z"]
Oct 09 08:03:30 crc kubenswrapper[4715]: E1009 08:03:30.737676 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69ff7954-464a-4c36-9996-772c43a2d602" containerName="glance-log"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.737702 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="69ff7954-464a-4c36-9996-772c43a2d602" containerName="glance-log"
Oct 09 08:03:30 crc kubenswrapper[4715]: E1009 08:03:30.737723 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="992626c1-d8c1-41f1-a0b2-60f72f8b182a" containerName="glance-httpd"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.737732 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="992626c1-d8c1-41f1-a0b2-60f72f8b182a" containerName="glance-httpd"
Oct 09 08:03:30 crc kubenswrapper[4715]: E1009 08:03:30.737750 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fd00c24-58a0-4107-a811-6d67d3156f68" containerName="mariadb-account-create"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.737758 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fd00c24-58a0-4107-a811-6d67d3156f68" containerName="mariadb-account-create"
Oct 09 08:03:30 crc kubenswrapper[4715]: E1009 08:03:30.737772 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a49e0065-af08-4379-b1e9-ac8998d2e98b" containerName="mariadb-account-create"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.737779 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="a49e0065-af08-4379-b1e9-ac8998d2e98b" containerName="mariadb-account-create"
Oct 09 08:03:30 crc kubenswrapper[4715]: E1009 08:03:30.737793 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2db20957-2340-4a65-ac69-ac1842a5a37a" containerName="keystone-bootstrap"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.737800 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="2db20957-2340-4a65-ac69-ac1842a5a37a" containerName="keystone-bootstrap"
Oct 09 08:03:30 crc kubenswrapper[4715]: E1009 08:03:30.737818 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0" containerName="dnsmasq-dns"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.737825 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0" containerName="dnsmasq-dns"
Oct 09 08:03:30 crc kubenswrapper[4715]: E1009 08:03:30.737837 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="992626c1-d8c1-41f1-a0b2-60f72f8b182a" containerName="glance-log"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.737847 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="992626c1-d8c1-41f1-a0b2-60f72f8b182a" containerName="glance-log"
Oct 09 08:03:30 crc kubenswrapper[4715]: E1009 08:03:30.737863 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0" containerName="init"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.737870 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0" containerName="init"
Oct 09 08:03:30 crc kubenswrapper[4715]: E1009 08:03:30.737886 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69ff7954-464a-4c36-9996-772c43a2d602" containerName="glance-httpd"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.737893 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="69ff7954-464a-4c36-9996-772c43a2d602" containerName="glance-httpd"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.738088 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fd00c24-58a0-4107-a811-6d67d3156f68" containerName="mariadb-account-create"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.738102 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="992626c1-d8c1-41f1-a0b2-60f72f8b182a" containerName="glance-httpd"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.738112 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="69ff7954-464a-4c36-9996-772c43a2d602" containerName="glance-httpd"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.738125 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="992626c1-d8c1-41f1-a0b2-60f72f8b182a" containerName="glance-log"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.738137 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="a49e0065-af08-4379-b1e9-ac8998d2e98b" containerName="mariadb-account-create"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.738148 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="69ff7954-464a-4c36-9996-772c43a2d602" containerName="glance-log"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.738158 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="2db20957-2340-4a65-ac69-ac1842a5a37a" containerName="keystone-bootstrap"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.738166 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0" containerName="dnsmasq-dns"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.738751 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-nf22z"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.741550 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.741851 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.741961 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-ht6l7"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.742559 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.745746 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-nf22z"]
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.827991 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"69ff7954-464a-4c36-9996-772c43a2d602","Type":"ContainerDied","Data":"d1539f8495501c548960ba504952d346454371b66ee25e05f1ca1b7c3478d68f"}
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.827999 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.828056 4715 scope.go:117] "RemoveContainer" containerID="35fb586ba7489bb3edb314f65a96e6e90bb701e99f8ffd1bfbf8e2928032e942"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.830822 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"992626c1-d8c1-41f1-a0b2-60f72f8b182a","Type":"ContainerDied","Data":"12568635d4fe7d0c9b5d9a4aaf3482615a7db7057ebbc75b0061305c1c3f3080"}
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.830908 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.833352 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6bf4d6b49-qs2mn" event={"ID":"ea235d88-0fcd-44e4-ae70-59936a38c6a8","Type":"ContainerDied","Data":"97e4d201015f016e9d3639324e679b26e8c3adfb2023d2b957427d32ca1cee79"}
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.833482 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6bf4d6b49-qs2mn"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.837300 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-vgpch" event={"ID":"5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0","Type":"ContainerDied","Data":"ee0a1678cdc411dee04a7c7b19005c464585333d12e9d74a2604de2d7cc7b626"}
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.837374 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-vgpch"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.839008 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d9885b95b-r2cb2" event={"ID":"ada9982a-fc5f-4c93-bfa3-3401c0824c2e","Type":"ContainerStarted","Data":"5eb867aac698accf445bcda1555a89eb888f82c2528dd5180cd9c22862dd5490"}
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.841628 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jgbfr" event={"ID":"2f4dee6e-f935-4bdd-9138-d414e86c0fa2","Type":"ContainerStarted","Data":"16abcda551342f0e235ceaecbc302dfc8fc86dde87a0a6396a526a99c2ccb764"}
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.842965 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189bfb70-4185-415b-a3d9-5d0ed1a76cb0-combined-ca-bundle\") pod \"keystone-bootstrap-nf22z\" (UID: \"189bfb70-4185-415b-a3d9-5d0ed1a76cb0\") " pod="openstack/keystone-bootstrap-nf22z"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.843022 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/189bfb70-4185-415b-a3d9-5d0ed1a76cb0-config-data\") pod \"keystone-bootstrap-nf22z\" (UID: \"189bfb70-4185-415b-a3d9-5d0ed1a76cb0\") " pod="openstack/keystone-bootstrap-nf22z"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.843078 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/189bfb70-4185-415b-a3d9-5d0ed1a76cb0-fernet-keys\") pod \"keystone-bootstrap-nf22z\" (UID: \"189bfb70-4185-415b-a3d9-5d0ed1a76cb0\") " pod="openstack/keystone-bootstrap-nf22z"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.843128 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/189bfb70-4185-415b-a3d9-5d0ed1a76cb0-credential-keys\") pod \"keystone-bootstrap-nf22z\" (UID: \"189bfb70-4185-415b-a3d9-5d0ed1a76cb0\") " pod="openstack/keystone-bootstrap-nf22z"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.843205 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/189bfb70-4185-415b-a3d9-5d0ed1a76cb0-scripts\") pod \"keystone-bootstrap-nf22z\" (UID: \"189bfb70-4185-415b-a3d9-5d0ed1a76cb0\") " pod="openstack/keystone-bootstrap-nf22z"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.843259 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw5hf\" (UniqueName: \"kubernetes.io/projected/189bfb70-4185-415b-a3d9-5d0ed1a76cb0-kube-api-access-jw5hf\") pod \"keystone-bootstrap-nf22z\" (UID: \"189bfb70-4185-415b-a3d9-5d0ed1a76cb0\") " pod="openstack/keystone-bootstrap-nf22z"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.844174 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-d788d6d48-5nczq" event={"ID":"7b8b0665-2ab8-4fb9-93ff-6405324f24d5","Type":"ContainerStarted","Data":"ee5a398bbb2977853258e119c48753f526b76f007dfac34edc6ac757be3427c2"}
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.846789 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9f4678dcc-bh6m5" event={"ID":"347422fd-611a-4d6e-bccb-031c6f308b5f","Type":"ContainerStarted","Data":"515932b11c9ed1c5d9b48a75f3c8e0059ba021512a4d817938ce824b32d16656"}
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.869157 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-vgpch"]
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.904591 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-vgpch"]
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.918615 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.944620 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw5hf\" (UniqueName: \"kubernetes.io/projected/189bfb70-4185-415b-a3d9-5d0ed1a76cb0-kube-api-access-jw5hf\") pod \"keystone-bootstrap-nf22z\" (UID: \"189bfb70-4185-415b-a3d9-5d0ed1a76cb0\") " pod="openstack/keystone-bootstrap-nf22z"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.944875 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189bfb70-4185-415b-a3d9-5d0ed1a76cb0-combined-ca-bundle\") pod \"keystone-bootstrap-nf22z\" (UID: \"189bfb70-4185-415b-a3d9-5d0ed1a76cb0\") " pod="openstack/keystone-bootstrap-nf22z"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.944973 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/189bfb70-4185-415b-a3d9-5d0ed1a76cb0-config-data\") pod \"keystone-bootstrap-nf22z\" (UID: \"189bfb70-4185-415b-a3d9-5d0ed1a76cb0\") " pod="openstack/keystone-bootstrap-nf22z"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.945106 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/189bfb70-4185-415b-a3d9-5d0ed1a76cb0-fernet-keys\") pod \"keystone-bootstrap-nf22z\" (UID: \"189bfb70-4185-415b-a3d9-5d0ed1a76cb0\") " pod="openstack/keystone-bootstrap-nf22z"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.945224 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/189bfb70-4185-415b-a3d9-5d0ed1a76cb0-credential-keys\") pod \"keystone-bootstrap-nf22z\" (UID: \"189bfb70-4185-415b-a3d9-5d0ed1a76cb0\") " pod="openstack/keystone-bootstrap-nf22z"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.945439 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/189bfb70-4185-415b-a3d9-5d0ed1a76cb0-scripts\") pod \"keystone-bootstrap-nf22z\" (UID: \"189bfb70-4185-415b-a3d9-5d0ed1a76cb0\") " pod="openstack/keystone-bootstrap-nf22z"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.944975 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.952005 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/189bfb70-4185-415b-a3d9-5d0ed1a76cb0-fernet-keys\") pod \"keystone-bootstrap-nf22z\" (UID: \"189bfb70-4185-415b-a3d9-5d0ed1a76cb0\") " pod="openstack/keystone-bootstrap-nf22z"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.953724 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/189bfb70-4185-415b-a3d9-5d0ed1a76cb0-scripts\") pod \"keystone-bootstrap-nf22z\" (UID: \"189bfb70-4185-415b-a3d9-5d0ed1a76cb0\") " pod="openstack/keystone-bootstrap-nf22z"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.963893 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw5hf\" (UniqueName: \"kubernetes.io/projected/189bfb70-4185-415b-a3d9-5d0ed1a76cb0-kube-api-access-jw5hf\") pod \"keystone-bootstrap-nf22z\" (UID: \"189bfb70-4185-415b-a3d9-5d0ed1a76cb0\") " pod="openstack/keystone-bootstrap-nf22z"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.965689 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/189bfb70-4185-415b-a3d9-5d0ed1a76cb0-config-data\") pod \"keystone-bootstrap-nf22z\" (UID: \"189bfb70-4185-415b-a3d9-5d0ed1a76cb0\") " pod="openstack/keystone-bootstrap-nf22z"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.966338 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189bfb70-4185-415b-a3d9-5d0ed1a76cb0-combined-ca-bundle\") pod \"keystone-bootstrap-nf22z\" (UID: \"189bfb70-4185-415b-a3d9-5d0ed1a76cb0\") " pod="openstack/keystone-bootstrap-nf22z"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.972176 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/189bfb70-4185-415b-a3d9-5d0ed1a76cb0-credential-keys\") pod \"keystone-bootstrap-nf22z\" (UID: \"189bfb70-4185-415b-a3d9-5d0ed1a76cb0\") " pod="openstack/keystone-bootstrap-nf22z"
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.972482 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 09 08:03:30 crc kubenswrapper[4715]: I1009 08:03:30.990981 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.007615 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.009248 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.011051 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.011290 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.011490 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-sgxzl"
Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.011614 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.033825 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6bf4d6b49-qs2mn"]
Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.040929 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.047285 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa8c6cfe-dae5-455a-a3f4-83608f3064b5-logs\") pod \"glance-default-internal-api-0\" (UID: \"fa8c6cfe-dae5-455a-a3f4-83608f3064b5\") " pod="openstack/glance-default-internal-api-0"
Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.047343 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fa8c6cfe-dae5-455a-a3f4-83608f3064b5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fa8c6cfe-dae5-455a-a3f4-83608f3064b5\") " pod="openstack/glance-default-internal-api-0"
Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.047381 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"fa8c6cfe-dae5-455a-a3f4-83608f3064b5\") " pod="openstack/glance-default-internal-api-0"
Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.047507 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa8c6cfe-dae5-455a-a3f4-83608f3064b5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fa8c6cfe-dae5-455a-a3f4-83608f3064b5\") " pod="openstack/glance-default-internal-api-0"
Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.047676 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv6dv\" (UniqueName: \"kubernetes.io/projected/fa8c6cfe-dae5-455a-a3f4-83608f3064b5-kube-api-access-qv6dv\") pod \"glance-default-internal-api-0\" (UID: \"fa8c6cfe-dae5-455a-a3f4-83608f3064b5\") " pod="openstack/glance-default-internal-api-0"
Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.047746 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa8c6cfe-dae5-455a-a3f4-83608f3064b5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"fa8c6cfe-dae5-455a-a3f4-83608f3064b5\") " pod="openstack/glance-default-internal-api-0"
Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.047771 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa8c6cfe-dae5-455a-a3f4-83608f3064b5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"fa8c6cfe-dae5-455a-a3f4-83608f3064b5\") " pod="openstack/glance-default-internal-api-0"
Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.047788 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa8c6cfe-dae5-455a-a3f4-83608f3064b5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fa8c6cfe-dae5-455a-a3f4-83608f3064b5\") " pod="openstack/glance-default-internal-api-0"
Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.048057 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6bf4d6b49-qs2mn"]
Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.054635 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.056086 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.058439 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.058861 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.062161 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-nf22z"
Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.064365 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.134631 4715 scope.go:117] "RemoveContainer" containerID="2cb16cacd34b10edff21069039974fd961f207c7988fae6389f6f40229150eab"
Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.149028 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa8c6cfe-dae5-455a-a3f4-83608f3064b5-logs\") pod \"glance-default-internal-api-0\" (UID: \"fa8c6cfe-dae5-455a-a3f4-83608f3064b5\") " pod="openstack/glance-default-internal-api-0"
Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.149349 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fa8c6cfe-dae5-455a-a3f4-83608f3064b5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fa8c6cfe-dae5-455a-a3f4-83608f3064b5\") " pod="openstack/glance-default-internal-api-0"
Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.149405 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b155af9d-8b2d-4f91-8f9e-70f77dbb84f1-scripts\") pod \"glance-default-external-api-0\" (UID: \"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1\") " pod="openstack/glance-default-external-api-0"
Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.149505 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b155af9d-8b2d-4f91-8f9e-70f77dbb84f1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1\") " pod="openstack/glance-default-external-api-0"
Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.149530 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"fa8c6cfe-dae5-455a-a3f4-83608f3064b5\") " pod="openstack/glance-default-internal-api-0"
Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.149707 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b155af9d-8b2d-4f91-8f9e-70f77dbb84f1-logs\") pod \"glance-default-external-api-0\" (UID: \"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1\") " pod="openstack/glance-default-external-api-0"
Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.149776 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa8c6cfe-dae5-455a-a3f4-83608f3064b5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fa8c6cfe-dae5-455a-a3f4-83608f3064b5\") " pod="openstack/glance-default-internal-api-0"
Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.149856 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b155af9d-8b2d-4f91-8f9e-70f77dbb84f1-config-data\") pod \"glance-default-external-api-0\" (UID: \"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1\") " pod="openstack/glance-default-external-api-0"
Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.149925 4715 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"fa8c6cfe-dae5-455a-a3f4-83608f3064b5\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0"
Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.150100 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q2hl\" (UniqueName: \"kubernetes.io/projected/b155af9d-8b2d-4f91-8f9e-70f77dbb84f1-kube-api-access-7q2hl\") pod \"glance-default-external-api-0\" (UID: \"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1\") " pod="openstack/glance-default-external-api-0"
Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.150188 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b155af9d-8b2d-4f91-8f9e-70f77dbb84f1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1\") " pod="openstack/glance-default-external-api-0"
Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.150195 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fa8c6cfe-dae5-455a-a3f4-83608f3064b5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fa8c6cfe-dae5-455a-a3f4-83608f3064b5\") " pod="openstack/glance-default-internal-api-0"
Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.150222 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1\") " pod="openstack/glance-default-external-api-0"
Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.150351 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv6dv\" (UniqueName: \"kubernetes.io/projected/fa8c6cfe-dae5-455a-a3f4-83608f3064b5-kube-api-access-qv6dv\") pod \"glance-default-internal-api-0\" (UID: \"fa8c6cfe-dae5-455a-a3f4-83608f3064b5\") " pod="openstack/glance-default-internal-api-0"
Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.150460 4715 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b155af9d-8b2d-4f91-8f9e-70f77dbb84f1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1\") " pod="openstack/glance-default-external-api-0" Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.150627 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa8c6cfe-dae5-455a-a3f4-83608f3064b5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"fa8c6cfe-dae5-455a-a3f4-83608f3064b5\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.150707 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa8c6cfe-dae5-455a-a3f4-83608f3064b5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"fa8c6cfe-dae5-455a-a3f4-83608f3064b5\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.150741 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa8c6cfe-dae5-455a-a3f4-83608f3064b5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fa8c6cfe-dae5-455a-a3f4-83608f3064b5\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.152069 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa8c6cfe-dae5-455a-a3f4-83608f3064b5-logs\") pod \"glance-default-internal-api-0\" (UID: \"fa8c6cfe-dae5-455a-a3f4-83608f3064b5\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.163154 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa8c6cfe-dae5-455a-a3f4-83608f3064b5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"fa8c6cfe-dae5-455a-a3f4-83608f3064b5\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.163560 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa8c6cfe-dae5-455a-a3f4-83608f3064b5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fa8c6cfe-dae5-455a-a3f4-83608f3064b5\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.165561 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa8c6cfe-dae5-455a-a3f4-83608f3064b5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fa8c6cfe-dae5-455a-a3f4-83608f3064b5\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.167177 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa8c6cfe-dae5-455a-a3f4-83608f3064b5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"fa8c6cfe-dae5-455a-a3f4-83608f3064b5\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.170210 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv6dv\" (UniqueName: \"kubernetes.io/projected/fa8c6cfe-dae5-455a-a3f4-83608f3064b5-kube-api-access-qv6dv\") pod \"glance-default-internal-api-0\" (UID: \"fa8c6cfe-dae5-455a-a3f4-83608f3064b5\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.225596 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"fa8c6cfe-dae5-455a-a3f4-83608f3064b5\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.257597 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b155af9d-8b2d-4f91-8f9e-70f77dbb84f1-scripts\") pod \"glance-default-external-api-0\" (UID: \"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1\") " pod="openstack/glance-default-external-api-0" Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.257636 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b155af9d-8b2d-4f91-8f9e-70f77dbb84f1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1\") " pod="openstack/glance-default-external-api-0" Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.257668 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b155af9d-8b2d-4f91-8f9e-70f77dbb84f1-logs\") pod \"glance-default-external-api-0\" (UID: \"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1\") " pod="openstack/glance-default-external-api-0" Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.257697 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b155af9d-8b2d-4f91-8f9e-70f77dbb84f1-config-data\") pod \"glance-default-external-api-0\" (UID: \"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1\") " pod="openstack/glance-default-external-api-0" Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.258173 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b155af9d-8b2d-4f91-8f9e-70f77dbb84f1-logs\") pod \"glance-default-external-api-0\" (UID: 
\"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1\") " pod="openstack/glance-default-external-api-0" Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.258321 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7q2hl\" (UniqueName: \"kubernetes.io/projected/b155af9d-8b2d-4f91-8f9e-70f77dbb84f1-kube-api-access-7q2hl\") pod \"glance-default-external-api-0\" (UID: \"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1\") " pod="openstack/glance-default-external-api-0" Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.258359 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b155af9d-8b2d-4f91-8f9e-70f77dbb84f1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1\") " pod="openstack/glance-default-external-api-0" Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.258403 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1\") " pod="openstack/glance-default-external-api-0" Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.258688 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b155af9d-8b2d-4f91-8f9e-70f77dbb84f1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1\") " pod="openstack/glance-default-external-api-0" Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.258805 4715 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1\") device mount path \"/mnt/openstack/pv06\"" 
pod="openstack/glance-default-external-api-0" Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.259003 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b155af9d-8b2d-4f91-8f9e-70f77dbb84f1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1\") " pod="openstack/glance-default-external-api-0" Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.263297 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b155af9d-8b2d-4f91-8f9e-70f77dbb84f1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1\") " pod="openstack/glance-default-external-api-0" Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.264481 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b155af9d-8b2d-4f91-8f9e-70f77dbb84f1-config-data\") pod \"glance-default-external-api-0\" (UID: \"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1\") " pod="openstack/glance-default-external-api-0" Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.265048 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b155af9d-8b2d-4f91-8f9e-70f77dbb84f1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1\") " pod="openstack/glance-default-external-api-0" Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.269170 4715 scope.go:117] "RemoveContainer" containerID="61de926065528a18b541048bf6382e071f0be5d0e376d4319e83cb4458cbe456" Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.273286 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b155af9d-8b2d-4f91-8f9e-70f77dbb84f1-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1\") " pod="openstack/glance-default-external-api-0" Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.277378 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q2hl\" (UniqueName: \"kubernetes.io/projected/b155af9d-8b2d-4f91-8f9e-70f77dbb84f1-kube-api-access-7q2hl\") pod \"glance-default-external-api-0\" (UID: \"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1\") " pod="openstack/glance-default-external-api-0" Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.305591 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1\") " pod="openstack/glance-default-external-api-0" Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.331905 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.374130 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.495226 4715 scope.go:117] "RemoveContainer" containerID="6de129de283977ec932b22f2cb853130c125ae5397f5d8f19a4670eee77bd742" Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.576555 4715 scope.go:117] "RemoveContainer" containerID="4be01246678b2d91fb4ffa87373d054e72bc804c1823d277247e4d3cd3b75ecb" Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.629477 4715 scope.go:117] "RemoveContainer" containerID="d4b8a4b383afc3fe244881002dee2297aa56aa3f7f53da26f351a431f4e1a452" Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.740099 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-nf22z"] Oct 09 08:03:31 crc kubenswrapper[4715]: W1009 08:03:31.772872 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod189bfb70_4185_415b_a3d9_5d0ed1a76cb0.slice/crio-ec7154bdb5d0a26f771789462a79931958f89d189b80796b1d86042e00a9c196 WatchSource:0}: Error finding container ec7154bdb5d0a26f771789462a79931958f89d189b80796b1d86042e00a9c196: Status 404 returned error can't find the container with id ec7154bdb5d0a26f771789462a79931958f89d189b80796b1d86042e00a9c196 Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.868791 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad","Type":"ContainerStarted","Data":"afec23b6411a3a1f8e57fba677626d2a491380ce461e4852aa47d9ac29d7a958"} Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.881737 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d9885b95b-r2cb2" event={"ID":"ada9982a-fc5f-4c93-bfa3-3401c0824c2e","Type":"ContainerStarted","Data":"30df9f4b9017305ddf0ba538bc4401ae44d2c84af182eb159fea1745dd9773d7"} Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.885829 4715 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nf22z" event={"ID":"189bfb70-4185-415b-a3d9-5d0ed1a76cb0","Type":"ContainerStarted","Data":"ec7154bdb5d0a26f771789462a79931958f89d189b80796b1d86042e00a9c196"} Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.890407 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-d788d6d48-5nczq" event={"ID":"7b8b0665-2ab8-4fb9-93ff-6405324f24d5","Type":"ContainerStarted","Data":"76130a7c6c78936315bef668e86af2f8a604eff29722046fcd94d1fd424cc6d7"} Oct 09 08:03:31 crc kubenswrapper[4715]: I1009 08:03:31.899175 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9f4678dcc-bh6m5" event={"ID":"347422fd-611a-4d6e-bccb-031c6f308b5f","Type":"ContainerStarted","Data":"ddab14ccb5f324df95c91360731e39410a5dd9dbe1153aa8f7dc965ce6c09a86"} Oct 09 08:03:32 crc kubenswrapper[4715]: I1009 08:03:32.025141 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 08:03:32 crc kubenswrapper[4715]: W1009 08:03:32.026229 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa8c6cfe_dae5_455a_a3f4_83608f3064b5.slice/crio-baf39a9f3334dc00fb6af8d58712a5b108840a8b359d9a6ba9ceedba94d50dba WatchSource:0}: Error finding container baf39a9f3334dc00fb6af8d58712a5b108840a8b359d9a6ba9ceedba94d50dba: Status 404 returned error can't find the container with id baf39a9f3334dc00fb6af8d58712a5b108840a8b359d9a6ba9ceedba94d50dba Oct 09 08:03:32 crc kubenswrapper[4715]: I1009 08:03:32.151090 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2db20957-2340-4a65-ac69-ac1842a5a37a" path="/var/lib/kubelet/pods/2db20957-2340-4a65-ac69-ac1842a5a37a/volumes" Oct 09 08:03:32 crc kubenswrapper[4715]: I1009 08:03:32.151889 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0" 
path="/var/lib/kubelet/pods/5771ea8b-32e9-4e3f-a2e8-a4dcec9d4bc0/volumes" Oct 09 08:03:32 crc kubenswrapper[4715]: I1009 08:03:32.152668 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69ff7954-464a-4c36-9996-772c43a2d602" path="/var/lib/kubelet/pods/69ff7954-464a-4c36-9996-772c43a2d602/volumes" Oct 09 08:03:32 crc kubenswrapper[4715]: I1009 08:03:32.153948 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="992626c1-d8c1-41f1-a0b2-60f72f8b182a" path="/var/lib/kubelet/pods/992626c1-d8c1-41f1-a0b2-60f72f8b182a/volumes" Oct 09 08:03:32 crc kubenswrapper[4715]: I1009 08:03:32.154708 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea235d88-0fcd-44e4-ae70-59936a38c6a8" path="/var/lib/kubelet/pods/ea235d88-0fcd-44e4-ae70-59936a38c6a8/volumes" Oct 09 08:03:32 crc kubenswrapper[4715]: I1009 08:03:32.155131 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 08:03:32 crc kubenswrapper[4715]: I1009 08:03:32.516846 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-l7r5w"] Oct 09 08:03:32 crc kubenswrapper[4715]: I1009 08:03:32.518489 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-l7r5w" Oct 09 08:03:32 crc kubenswrapper[4715]: I1009 08:03:32.524223 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-ffhx7" Oct 09 08:03:32 crc kubenswrapper[4715]: I1009 08:03:32.524576 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 09 08:03:32 crc kubenswrapper[4715]: I1009 08:03:32.527379 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-l7r5w"] Oct 09 08:03:32 crc kubenswrapper[4715]: I1009 08:03:32.586128 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/027df64c-f87d-401f-965c-88c874a854f8-combined-ca-bundle\") pod \"barbican-db-sync-l7r5w\" (UID: \"027df64c-f87d-401f-965c-88c874a854f8\") " pod="openstack/barbican-db-sync-l7r5w" Oct 09 08:03:32 crc kubenswrapper[4715]: I1009 08:03:32.586190 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcjd4\" (UniqueName: \"kubernetes.io/projected/027df64c-f87d-401f-965c-88c874a854f8-kube-api-access-kcjd4\") pod \"barbican-db-sync-l7r5w\" (UID: \"027df64c-f87d-401f-965c-88c874a854f8\") " pod="openstack/barbican-db-sync-l7r5w" Oct 09 08:03:32 crc kubenswrapper[4715]: I1009 08:03:32.586345 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/027df64c-f87d-401f-965c-88c874a854f8-db-sync-config-data\") pod \"barbican-db-sync-l7r5w\" (UID: \"027df64c-f87d-401f-965c-88c874a854f8\") " pod="openstack/barbican-db-sync-l7r5w" Oct 09 08:03:32 crc kubenswrapper[4715]: I1009 08:03:32.687572 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/027df64c-f87d-401f-965c-88c874a854f8-db-sync-config-data\") pod \"barbican-db-sync-l7r5w\" (UID: \"027df64c-f87d-401f-965c-88c874a854f8\") " pod="openstack/barbican-db-sync-l7r5w" Oct 09 08:03:32 crc kubenswrapper[4715]: I1009 08:03:32.687627 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/027df64c-f87d-401f-965c-88c874a854f8-combined-ca-bundle\") pod \"barbican-db-sync-l7r5w\" (UID: \"027df64c-f87d-401f-965c-88c874a854f8\") " pod="openstack/barbican-db-sync-l7r5w" Oct 09 08:03:32 crc kubenswrapper[4715]: I1009 08:03:32.687657 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcjd4\" (UniqueName: \"kubernetes.io/projected/027df64c-f87d-401f-965c-88c874a854f8-kube-api-access-kcjd4\") pod \"barbican-db-sync-l7r5w\" (UID: \"027df64c-f87d-401f-965c-88c874a854f8\") " pod="openstack/barbican-db-sync-l7r5w" Oct 09 08:03:32 crc kubenswrapper[4715]: I1009 08:03:32.703767 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/027df64c-f87d-401f-965c-88c874a854f8-db-sync-config-data\") pod \"barbican-db-sync-l7r5w\" (UID: \"027df64c-f87d-401f-965c-88c874a854f8\") " pod="openstack/barbican-db-sync-l7r5w" Oct 09 08:03:32 crc kubenswrapper[4715]: I1009 08:03:32.705062 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/027df64c-f87d-401f-965c-88c874a854f8-combined-ca-bundle\") pod \"barbican-db-sync-l7r5w\" (UID: \"027df64c-f87d-401f-965c-88c874a854f8\") " pod="openstack/barbican-db-sync-l7r5w" Oct 09 08:03:32 crc kubenswrapper[4715]: I1009 08:03:32.709809 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcjd4\" (UniqueName: \"kubernetes.io/projected/027df64c-f87d-401f-965c-88c874a854f8-kube-api-access-kcjd4\") pod 
\"barbican-db-sync-l7r5w\" (UID: \"027df64c-f87d-401f-965c-88c874a854f8\") " pod="openstack/barbican-db-sync-l7r5w" Oct 09 08:03:32 crc kubenswrapper[4715]: I1009 08:03:32.851268 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-l7r5w" Oct 09 08:03:32 crc kubenswrapper[4715]: I1009 08:03:32.852043 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-cf8st"] Oct 09 08:03:32 crc kubenswrapper[4715]: I1009 08:03:32.853112 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-cf8st" Oct 09 08:03:32 crc kubenswrapper[4715]: I1009 08:03:32.857664 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-wfq5s" Oct 09 08:03:32 crc kubenswrapper[4715]: I1009 08:03:32.857800 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 09 08:03:32 crc kubenswrapper[4715]: I1009 08:03:32.857907 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 09 08:03:32 crc kubenswrapper[4715]: I1009 08:03:32.880449 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-cf8st"] Oct 09 08:03:32 crc kubenswrapper[4715]: I1009 08:03:32.891627 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f4e65240-0972-4024-a140-425dda8cfa12-config\") pod \"neutron-db-sync-cf8st\" (UID: \"f4e65240-0972-4024-a140-425dda8cfa12\") " pod="openstack/neutron-db-sync-cf8st" Oct 09 08:03:32 crc kubenswrapper[4715]: I1009 08:03:32.891694 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4e65240-0972-4024-a140-425dda8cfa12-combined-ca-bundle\") pod \"neutron-db-sync-cf8st\" (UID: 
\"f4e65240-0972-4024-a140-425dda8cfa12\") " pod="openstack/neutron-db-sync-cf8st" Oct 09 08:03:32 crc kubenswrapper[4715]: I1009 08:03:32.891740 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9vxh\" (UniqueName: \"kubernetes.io/projected/f4e65240-0972-4024-a140-425dda8cfa12-kube-api-access-c9vxh\") pod \"neutron-db-sync-cf8st\" (UID: \"f4e65240-0972-4024-a140-425dda8cfa12\") " pod="openstack/neutron-db-sync-cf8st" Oct 09 08:03:32 crc kubenswrapper[4715]: I1009 08:03:32.940968 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d9885b95b-r2cb2" event={"ID":"ada9982a-fc5f-4c93-bfa3-3401c0824c2e","Type":"ContainerStarted","Data":"a38b8e8bdafd561f301d7de3849df03039ec334d2e9911426537e38b58e93a3e"} Oct 09 08:03:32 crc kubenswrapper[4715]: I1009 08:03:32.949739 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nf22z" event={"ID":"189bfb70-4185-415b-a3d9-5d0ed1a76cb0","Type":"ContainerStarted","Data":"5da445758a62da0974a02ce3720b58b9a61359d0cec40acebf20e3f8fac21a3f"} Oct 09 08:03:32 crc kubenswrapper[4715]: I1009 08:03:32.953702 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fa8c6cfe-dae5-455a-a3f4-83608f3064b5","Type":"ContainerStarted","Data":"29c69c6bfea7f324ed2cb89b3d6fdc974a76c48f88d4ce8852af81d57091e21f"} Oct 09 08:03:32 crc kubenswrapper[4715]: I1009 08:03:32.953753 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fa8c6cfe-dae5-455a-a3f4-83608f3064b5","Type":"ContainerStarted","Data":"baf39a9f3334dc00fb6af8d58712a5b108840a8b359d9a6ba9ceedba94d50dba"} Oct 09 08:03:32 crc kubenswrapper[4715]: I1009 08:03:32.957535 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-d788d6d48-5nczq" 
event={"ID":"7b8b0665-2ab8-4fb9-93ff-6405324f24d5","Type":"ContainerStarted","Data":"0d0c26571764fb6ee3f502d10d516c92098abd9dcfea2db48f1fd11475281e15"} Oct 09 08:03:32 crc kubenswrapper[4715]: I1009 08:03:32.969781 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5d9885b95b-r2cb2" podStartSLOduration=13.841275584 podStartE2EDuration="14.969764206s" podCreationTimestamp="2025-10-09 08:03:18 +0000 UTC" firstStartedPulling="2025-10-09 08:03:30.159064534 +0000 UTC m=+1040.851868552" lastFinishedPulling="2025-10-09 08:03:31.287553166 +0000 UTC m=+1041.980357174" observedRunningTime="2025-10-09 08:03:32.965676627 +0000 UTC m=+1043.658480635" watchObservedRunningTime="2025-10-09 08:03:32.969764206 +0000 UTC m=+1043.662568214" Oct 09 08:03:32 crc kubenswrapper[4715]: I1009 08:03:32.995728 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9vxh\" (UniqueName: \"kubernetes.io/projected/f4e65240-0972-4024-a140-425dda8cfa12-kube-api-access-c9vxh\") pod \"neutron-db-sync-cf8st\" (UID: \"f4e65240-0972-4024-a140-425dda8cfa12\") " pod="openstack/neutron-db-sync-cf8st" Oct 09 08:03:32 crc kubenswrapper[4715]: I1009 08:03:32.995966 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f4e65240-0972-4024-a140-425dda8cfa12-config\") pod \"neutron-db-sync-cf8st\" (UID: \"f4e65240-0972-4024-a140-425dda8cfa12\") " pod="openstack/neutron-db-sync-cf8st" Oct 09 08:03:32 crc kubenswrapper[4715]: I1009 08:03:32.996288 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4e65240-0972-4024-a140-425dda8cfa12-combined-ca-bundle\") pod \"neutron-db-sync-cf8st\" (UID: \"f4e65240-0972-4024-a140-425dda8cfa12\") " pod="openstack/neutron-db-sync-cf8st" Oct 09 08:03:33 crc kubenswrapper[4715]: I1009 08:03:33.009063 4715 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f4e65240-0972-4024-a140-425dda8cfa12-config\") pod \"neutron-db-sync-cf8st\" (UID: \"f4e65240-0972-4024-a140-425dda8cfa12\") " pod="openstack/neutron-db-sync-cf8st" Oct 09 08:03:33 crc kubenswrapper[4715]: I1009 08:03:33.013930 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4e65240-0972-4024-a140-425dda8cfa12-combined-ca-bundle\") pod \"neutron-db-sync-cf8st\" (UID: \"f4e65240-0972-4024-a140-425dda8cfa12\") " pod="openstack/neutron-db-sync-cf8st" Oct 09 08:03:33 crc kubenswrapper[4715]: I1009 08:03:33.016683 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9f4678dcc-bh6m5" event={"ID":"347422fd-611a-4d6e-bccb-031c6f308b5f","Type":"ContainerStarted","Data":"d060216fc887a09febcae2fc6caf87f7b8c0148fb07de6cbaf52d75ba0cb9504"} Oct 09 08:03:33 crc kubenswrapper[4715]: I1009 08:03:33.016880 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-9f4678dcc-bh6m5" podUID="347422fd-611a-4d6e-bccb-031c6f308b5f" containerName="horizon-log" containerID="cri-o://ddab14ccb5f324df95c91360731e39410a5dd9dbe1153aa8f7dc965ce6c09a86" gracePeriod=30 Oct 09 08:03:33 crc kubenswrapper[4715]: I1009 08:03:33.017150 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-9f4678dcc-bh6m5" podUID="347422fd-611a-4d6e-bccb-031c6f308b5f" containerName="horizon" containerID="cri-o://d060216fc887a09febcae2fc6caf87f7b8c0148fb07de6cbaf52d75ba0cb9504" gracePeriod=30 Oct 09 08:03:33 crc kubenswrapper[4715]: I1009 08:03:33.022854 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9vxh\" (UniqueName: \"kubernetes.io/projected/f4e65240-0972-4024-a140-425dda8cfa12-kube-api-access-c9vxh\") pod \"neutron-db-sync-cf8st\" (UID: \"f4e65240-0972-4024-a140-425dda8cfa12\") " 
pod="openstack/neutron-db-sync-cf8st" Oct 09 08:03:33 crc kubenswrapper[4715]: I1009 08:03:33.028535 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1","Type":"ContainerStarted","Data":"cc8a7e84d62e1dce8a72340a833b10ee0a2911df43f37ffeef86f9b28eb53970"} Oct 09 08:03:33 crc kubenswrapper[4715]: I1009 08:03:33.044903 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-d788d6d48-5nczq" podStartSLOduration=13.931547197 podStartE2EDuration="15.044884787s" podCreationTimestamp="2025-10-09 08:03:18 +0000 UTC" firstStartedPulling="2025-10-09 08:03:29.990270072 +0000 UTC m=+1040.683074080" lastFinishedPulling="2025-10-09 08:03:31.103607662 +0000 UTC m=+1041.796411670" observedRunningTime="2025-10-09 08:03:33.010746871 +0000 UTC m=+1043.703550879" watchObservedRunningTime="2025-10-09 08:03:33.044884787 +0000 UTC m=+1043.737688795" Oct 09 08:03:33 crc kubenswrapper[4715]: I1009 08:03:33.094731 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-nf22z" podStartSLOduration=3.09470516 podStartE2EDuration="3.09470516s" podCreationTimestamp="2025-10-09 08:03:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:03:33.044245548 +0000 UTC m=+1043.737049556" watchObservedRunningTime="2025-10-09 08:03:33.09470516 +0000 UTC m=+1043.787509168" Oct 09 08:03:33 crc kubenswrapper[4715]: I1009 08:03:33.184106 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-cf8st" Oct 09 08:03:33 crc kubenswrapper[4715]: I1009 08:03:33.449285 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-9f4678dcc-bh6m5" podStartSLOduration=21.332649736 podStartE2EDuration="22.449264761s" podCreationTimestamp="2025-10-09 08:03:11 +0000 UTC" firstStartedPulling="2025-10-09 08:03:30.148858297 +0000 UTC m=+1040.841662305" lastFinishedPulling="2025-10-09 08:03:31.265473322 +0000 UTC m=+1041.958277330" observedRunningTime="2025-10-09 08:03:33.110323926 +0000 UTC m=+1043.803127954" watchObservedRunningTime="2025-10-09 08:03:33.449264761 +0000 UTC m=+1044.142068769" Oct 09 08:03:33 crc kubenswrapper[4715]: I1009 08:03:33.455627 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-l7r5w"] Oct 09 08:03:33 crc kubenswrapper[4715]: W1009 08:03:33.470123 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod027df64c_f87d_401f_965c_88c874a854f8.slice/crio-a98fe117ed6bbf1e633017e7b9dfff0e2865365b0d98c8774243153238fc8771 WatchSource:0}: Error finding container a98fe117ed6bbf1e633017e7b9dfff0e2865365b0d98c8774243153238fc8771: Status 404 returned error can't find the container with id a98fe117ed6bbf1e633017e7b9dfff0e2865365b0d98c8774243153238fc8771 Oct 09 08:03:33 crc kubenswrapper[4715]: I1009 08:03:33.819062 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-cf8st"] Oct 09 08:03:33 crc kubenswrapper[4715]: W1009 08:03:33.867188 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4e65240_0972_4024_a140_425dda8cfa12.slice/crio-2d4eb8dec5ec669db6ac1e0c765363404c10b8bc7d3cd17c96cef1388189802b WatchSource:0}: Error finding container 2d4eb8dec5ec669db6ac1e0c765363404c10b8bc7d3cd17c96cef1388189802b: Status 404 returned error can't find the 
container with id 2d4eb8dec5ec669db6ac1e0c765363404c10b8bc7d3cd17c96cef1388189802b Oct 09 08:03:34 crc kubenswrapper[4715]: I1009 08:03:34.061002 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-l7r5w" event={"ID":"027df64c-f87d-401f-965c-88c874a854f8","Type":"ContainerStarted","Data":"a98fe117ed6bbf1e633017e7b9dfff0e2865365b0d98c8774243153238fc8771"} Oct 09 08:03:34 crc kubenswrapper[4715]: I1009 08:03:34.063561 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-cf8st" event={"ID":"f4e65240-0972-4024-a140-425dda8cfa12","Type":"ContainerStarted","Data":"2d4eb8dec5ec669db6ac1e0c765363404c10b8bc7d3cd17c96cef1388189802b"} Oct 09 08:03:34 crc kubenswrapper[4715]: I1009 08:03:34.071367 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1","Type":"ContainerStarted","Data":"2e03df95b6bb57fb4fb7f4d90cc8e45fe2c8a0dcc0049252dcba3306899bb1cf"} Oct 09 08:03:35 crc kubenswrapper[4715]: I1009 08:03:35.113773 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1","Type":"ContainerStarted","Data":"f66ee5e8144456200dbf81ee9e79d51c129e85e278fa1cd8441aebdf53debc5b"} Oct 09 08:03:35 crc kubenswrapper[4715]: I1009 08:03:35.124215 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fa8c6cfe-dae5-455a-a3f4-83608f3064b5","Type":"ContainerStarted","Data":"1abb0d014478a89f86f7edba03186c78afcc89f7c2def49c541347d933b384c9"} Oct 09 08:03:35 crc kubenswrapper[4715]: I1009 08:03:35.127538 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-cf8st" event={"ID":"f4e65240-0972-4024-a140-425dda8cfa12","Type":"ContainerStarted","Data":"4ac55fb52f747d08dc4e6bc380b797c8d2df593c94212a82f0b90bdf9650ecc9"} Oct 09 08:03:35 crc kubenswrapper[4715]: 
I1009 08:03:35.146915 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.14689645 podStartE2EDuration="5.14689645s" podCreationTimestamp="2025-10-09 08:03:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:03:35.143481411 +0000 UTC m=+1045.836285429" watchObservedRunningTime="2025-10-09 08:03:35.14689645 +0000 UTC m=+1045.839700458" Oct 09 08:03:35 crc kubenswrapper[4715]: I1009 08:03:35.172827 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.172806016 podStartE2EDuration="5.172806016s" podCreationTimestamp="2025-10-09 08:03:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:03:35.169554401 +0000 UTC m=+1045.862358409" watchObservedRunningTime="2025-10-09 08:03:35.172806016 +0000 UTC m=+1045.865610024" Oct 09 08:03:35 crc kubenswrapper[4715]: I1009 08:03:35.196370 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-cf8st" podStartSLOduration=3.196353803 podStartE2EDuration="3.196353803s" podCreationTimestamp="2025-10-09 08:03:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:03:35.189445371 +0000 UTC m=+1045.882249379" watchObservedRunningTime="2025-10-09 08:03:35.196353803 +0000 UTC m=+1045.889157811" Oct 09 08:03:37 crc kubenswrapper[4715]: I1009 08:03:37.147774 4715 generic.go:334] "Generic (PLEG): container finished" podID="189bfb70-4185-415b-a3d9-5d0ed1a76cb0" containerID="5da445758a62da0974a02ce3720b58b9a61359d0cec40acebf20e3f8fac21a3f" exitCode=0 Oct 09 08:03:37 crc kubenswrapper[4715]: I1009 08:03:37.148315 4715 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nf22z" event={"ID":"189bfb70-4185-415b-a3d9-5d0ed1a76cb0","Type":"ContainerDied","Data":"5da445758a62da0974a02ce3720b58b9a61359d0cec40acebf20e3f8fac21a3f"} Oct 09 08:03:38 crc kubenswrapper[4715]: I1009 08:03:38.897126 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5d9885b95b-r2cb2" Oct 09 08:03:38 crc kubenswrapper[4715]: I1009 08:03:38.898594 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5d9885b95b-r2cb2" Oct 09 08:03:39 crc kubenswrapper[4715]: I1009 08:03:39.026046 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-d788d6d48-5nczq" Oct 09 08:03:39 crc kubenswrapper[4715]: I1009 08:03:39.026668 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-d788d6d48-5nczq" Oct 09 08:03:40 crc kubenswrapper[4715]: I1009 08:03:40.640203 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-nf22z" Oct 09 08:03:40 crc kubenswrapper[4715]: I1009 08:03:40.764126 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jw5hf\" (UniqueName: \"kubernetes.io/projected/189bfb70-4185-415b-a3d9-5d0ed1a76cb0-kube-api-access-jw5hf\") pod \"189bfb70-4185-415b-a3d9-5d0ed1a76cb0\" (UID: \"189bfb70-4185-415b-a3d9-5d0ed1a76cb0\") " Oct 09 08:03:40 crc kubenswrapper[4715]: I1009 08:03:40.764271 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189bfb70-4185-415b-a3d9-5d0ed1a76cb0-combined-ca-bundle\") pod \"189bfb70-4185-415b-a3d9-5d0ed1a76cb0\" (UID: \"189bfb70-4185-415b-a3d9-5d0ed1a76cb0\") " Oct 09 08:03:40 crc kubenswrapper[4715]: I1009 08:03:40.764355 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/189bfb70-4185-415b-a3d9-5d0ed1a76cb0-credential-keys\") pod \"189bfb70-4185-415b-a3d9-5d0ed1a76cb0\" (UID: \"189bfb70-4185-415b-a3d9-5d0ed1a76cb0\") " Oct 09 08:03:40 crc kubenswrapper[4715]: I1009 08:03:40.764379 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/189bfb70-4185-415b-a3d9-5d0ed1a76cb0-fernet-keys\") pod \"189bfb70-4185-415b-a3d9-5d0ed1a76cb0\" (UID: \"189bfb70-4185-415b-a3d9-5d0ed1a76cb0\") " Oct 09 08:03:40 crc kubenswrapper[4715]: I1009 08:03:40.764427 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/189bfb70-4185-415b-a3d9-5d0ed1a76cb0-scripts\") pod \"189bfb70-4185-415b-a3d9-5d0ed1a76cb0\" (UID: \"189bfb70-4185-415b-a3d9-5d0ed1a76cb0\") " Oct 09 08:03:40 crc kubenswrapper[4715]: I1009 08:03:40.764470 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/189bfb70-4185-415b-a3d9-5d0ed1a76cb0-config-data\") pod \"189bfb70-4185-415b-a3d9-5d0ed1a76cb0\" (UID: \"189bfb70-4185-415b-a3d9-5d0ed1a76cb0\") " Oct 09 08:03:40 crc kubenswrapper[4715]: I1009 08:03:40.770722 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/189bfb70-4185-415b-a3d9-5d0ed1a76cb0-scripts" (OuterVolumeSpecName: "scripts") pod "189bfb70-4185-415b-a3d9-5d0ed1a76cb0" (UID: "189bfb70-4185-415b-a3d9-5d0ed1a76cb0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:03:40 crc kubenswrapper[4715]: I1009 08:03:40.770771 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/189bfb70-4185-415b-a3d9-5d0ed1a76cb0-kube-api-access-jw5hf" (OuterVolumeSpecName: "kube-api-access-jw5hf") pod "189bfb70-4185-415b-a3d9-5d0ed1a76cb0" (UID: "189bfb70-4185-415b-a3d9-5d0ed1a76cb0"). InnerVolumeSpecName "kube-api-access-jw5hf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:03:40 crc kubenswrapper[4715]: I1009 08:03:40.772564 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/189bfb70-4185-415b-a3d9-5d0ed1a76cb0-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "189bfb70-4185-415b-a3d9-5d0ed1a76cb0" (UID: "189bfb70-4185-415b-a3d9-5d0ed1a76cb0"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:03:40 crc kubenswrapper[4715]: I1009 08:03:40.772760 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/189bfb70-4185-415b-a3d9-5d0ed1a76cb0-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "189bfb70-4185-415b-a3d9-5d0ed1a76cb0" (UID: "189bfb70-4185-415b-a3d9-5d0ed1a76cb0"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:03:40 crc kubenswrapper[4715]: I1009 08:03:40.802470 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/189bfb70-4185-415b-a3d9-5d0ed1a76cb0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "189bfb70-4185-415b-a3d9-5d0ed1a76cb0" (UID: "189bfb70-4185-415b-a3d9-5d0ed1a76cb0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:03:40 crc kubenswrapper[4715]: I1009 08:03:40.803856 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/189bfb70-4185-415b-a3d9-5d0ed1a76cb0-config-data" (OuterVolumeSpecName: "config-data") pod "189bfb70-4185-415b-a3d9-5d0ed1a76cb0" (UID: "189bfb70-4185-415b-a3d9-5d0ed1a76cb0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:03:40 crc kubenswrapper[4715]: I1009 08:03:40.867143 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jw5hf\" (UniqueName: \"kubernetes.io/projected/189bfb70-4185-415b-a3d9-5d0ed1a76cb0-kube-api-access-jw5hf\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:40 crc kubenswrapper[4715]: I1009 08:03:40.867202 4715 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189bfb70-4185-415b-a3d9-5d0ed1a76cb0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:40 crc kubenswrapper[4715]: I1009 08:03:40.867212 4715 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/189bfb70-4185-415b-a3d9-5d0ed1a76cb0-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:40 crc kubenswrapper[4715]: I1009 08:03:40.867221 4715 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/189bfb70-4185-415b-a3d9-5d0ed1a76cb0-fernet-keys\") on node \"crc\" 
DevicePath \"\"" Oct 09 08:03:40 crc kubenswrapper[4715]: I1009 08:03:40.867230 4715 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/189bfb70-4185-415b-a3d9-5d0ed1a76cb0-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:40 crc kubenswrapper[4715]: I1009 08:03:40.867238 4715 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/189bfb70-4185-415b-a3d9-5d0ed1a76cb0-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 08:03:41 crc kubenswrapper[4715]: I1009 08:03:41.210540 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nf22z" event={"ID":"189bfb70-4185-415b-a3d9-5d0ed1a76cb0","Type":"ContainerDied","Data":"ec7154bdb5d0a26f771789462a79931958f89d189b80796b1d86042e00a9c196"} Oct 09 08:03:41 crc kubenswrapper[4715]: I1009 08:03:41.210805 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec7154bdb5d0a26f771789462a79931958f89d189b80796b1d86042e00a9c196" Oct 09 08:03:41 crc kubenswrapper[4715]: I1009 08:03:41.210606 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-nf22z" Oct 09 08:03:41 crc kubenswrapper[4715]: I1009 08:03:41.332805 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 09 08:03:41 crc kubenswrapper[4715]: I1009 08:03:41.332882 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 09 08:03:41 crc kubenswrapper[4715]: I1009 08:03:41.374411 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 09 08:03:41 crc kubenswrapper[4715]: I1009 08:03:41.374491 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 09 08:03:41 crc kubenswrapper[4715]: I1009 08:03:41.379583 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 09 08:03:41 crc kubenswrapper[4715]: I1009 08:03:41.391625 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 09 08:03:41 crc kubenswrapper[4715]: I1009 08:03:41.409233 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 09 08:03:41 crc kubenswrapper[4715]: I1009 08:03:41.430201 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 09 08:03:41 crc kubenswrapper[4715]: I1009 08:03:41.752798 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-76cc5dfd5-dcg58"] Oct 09 08:03:41 crc kubenswrapper[4715]: E1009 08:03:41.753425 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="189bfb70-4185-415b-a3d9-5d0ed1a76cb0" containerName="keystone-bootstrap" Oct 09 08:03:41 crc kubenswrapper[4715]: I1009 08:03:41.753465 4715 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="189bfb70-4185-415b-a3d9-5d0ed1a76cb0" containerName="keystone-bootstrap" Oct 09 08:03:41 crc kubenswrapper[4715]: I1009 08:03:41.753644 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="189bfb70-4185-415b-a3d9-5d0ed1a76cb0" containerName="keystone-bootstrap" Oct 09 08:03:41 crc kubenswrapper[4715]: I1009 08:03:41.754289 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-76cc5dfd5-dcg58" Oct 09 08:03:41 crc kubenswrapper[4715]: I1009 08:03:41.757114 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 09 08:03:41 crc kubenswrapper[4715]: I1009 08:03:41.759783 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 09 08:03:41 crc kubenswrapper[4715]: I1009 08:03:41.759854 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 09 08:03:41 crc kubenswrapper[4715]: I1009 08:03:41.759876 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 09 08:03:41 crc kubenswrapper[4715]: I1009 08:03:41.759978 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-ht6l7" Oct 09 08:03:41 crc kubenswrapper[4715]: I1009 08:03:41.760089 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 09 08:03:41 crc kubenswrapper[4715]: I1009 08:03:41.760212 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-76cc5dfd5-dcg58"] Oct 09 08:03:41 crc kubenswrapper[4715]: I1009 08:03:41.884519 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6qf2\" (UniqueName: \"kubernetes.io/projected/914f4753-0cbe-4496-b703-8dd106c06db2-kube-api-access-q6qf2\") pod \"keystone-76cc5dfd5-dcg58\" (UID: 
\"914f4753-0cbe-4496-b703-8dd106c06db2\") " pod="openstack/keystone-76cc5dfd5-dcg58" Oct 09 08:03:41 crc kubenswrapper[4715]: I1009 08:03:41.884579 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/914f4753-0cbe-4496-b703-8dd106c06db2-scripts\") pod \"keystone-76cc5dfd5-dcg58\" (UID: \"914f4753-0cbe-4496-b703-8dd106c06db2\") " pod="openstack/keystone-76cc5dfd5-dcg58" Oct 09 08:03:41 crc kubenswrapper[4715]: I1009 08:03:41.884609 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/914f4753-0cbe-4496-b703-8dd106c06db2-config-data\") pod \"keystone-76cc5dfd5-dcg58\" (UID: \"914f4753-0cbe-4496-b703-8dd106c06db2\") " pod="openstack/keystone-76cc5dfd5-dcg58" Oct 09 08:03:41 crc kubenswrapper[4715]: I1009 08:03:41.884643 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/914f4753-0cbe-4496-b703-8dd106c06db2-combined-ca-bundle\") pod \"keystone-76cc5dfd5-dcg58\" (UID: \"914f4753-0cbe-4496-b703-8dd106c06db2\") " pod="openstack/keystone-76cc5dfd5-dcg58" Oct 09 08:03:41 crc kubenswrapper[4715]: I1009 08:03:41.884662 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/914f4753-0cbe-4496-b703-8dd106c06db2-public-tls-certs\") pod \"keystone-76cc5dfd5-dcg58\" (UID: \"914f4753-0cbe-4496-b703-8dd106c06db2\") " pod="openstack/keystone-76cc5dfd5-dcg58" Oct 09 08:03:41 crc kubenswrapper[4715]: I1009 08:03:41.884680 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/914f4753-0cbe-4496-b703-8dd106c06db2-fernet-keys\") pod \"keystone-76cc5dfd5-dcg58\" (UID: 
\"914f4753-0cbe-4496-b703-8dd106c06db2\") " pod="openstack/keystone-76cc5dfd5-dcg58" Oct 09 08:03:41 crc kubenswrapper[4715]: I1009 08:03:41.884714 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/914f4753-0cbe-4496-b703-8dd106c06db2-internal-tls-certs\") pod \"keystone-76cc5dfd5-dcg58\" (UID: \"914f4753-0cbe-4496-b703-8dd106c06db2\") " pod="openstack/keystone-76cc5dfd5-dcg58" Oct 09 08:03:41 crc kubenswrapper[4715]: I1009 08:03:41.884743 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/914f4753-0cbe-4496-b703-8dd106c06db2-credential-keys\") pod \"keystone-76cc5dfd5-dcg58\" (UID: \"914f4753-0cbe-4496-b703-8dd106c06db2\") " pod="openstack/keystone-76cc5dfd5-dcg58" Oct 09 08:03:41 crc kubenswrapper[4715]: I1009 08:03:41.986557 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6qf2\" (UniqueName: \"kubernetes.io/projected/914f4753-0cbe-4496-b703-8dd106c06db2-kube-api-access-q6qf2\") pod \"keystone-76cc5dfd5-dcg58\" (UID: \"914f4753-0cbe-4496-b703-8dd106c06db2\") " pod="openstack/keystone-76cc5dfd5-dcg58" Oct 09 08:03:41 crc kubenswrapper[4715]: I1009 08:03:41.986614 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/914f4753-0cbe-4496-b703-8dd106c06db2-scripts\") pod \"keystone-76cc5dfd5-dcg58\" (UID: \"914f4753-0cbe-4496-b703-8dd106c06db2\") " pod="openstack/keystone-76cc5dfd5-dcg58" Oct 09 08:03:41 crc kubenswrapper[4715]: I1009 08:03:41.986640 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/914f4753-0cbe-4496-b703-8dd106c06db2-config-data\") pod \"keystone-76cc5dfd5-dcg58\" (UID: \"914f4753-0cbe-4496-b703-8dd106c06db2\") " 
pod="openstack/keystone-76cc5dfd5-dcg58" Oct 09 08:03:41 crc kubenswrapper[4715]: I1009 08:03:41.986680 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/914f4753-0cbe-4496-b703-8dd106c06db2-combined-ca-bundle\") pod \"keystone-76cc5dfd5-dcg58\" (UID: \"914f4753-0cbe-4496-b703-8dd106c06db2\") " pod="openstack/keystone-76cc5dfd5-dcg58" Oct 09 08:03:41 crc kubenswrapper[4715]: I1009 08:03:41.986712 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/914f4753-0cbe-4496-b703-8dd106c06db2-public-tls-certs\") pod \"keystone-76cc5dfd5-dcg58\" (UID: \"914f4753-0cbe-4496-b703-8dd106c06db2\") " pod="openstack/keystone-76cc5dfd5-dcg58" Oct 09 08:03:41 crc kubenswrapper[4715]: I1009 08:03:41.986729 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/914f4753-0cbe-4496-b703-8dd106c06db2-fernet-keys\") pod \"keystone-76cc5dfd5-dcg58\" (UID: \"914f4753-0cbe-4496-b703-8dd106c06db2\") " pod="openstack/keystone-76cc5dfd5-dcg58" Oct 09 08:03:41 crc kubenswrapper[4715]: I1009 08:03:41.986775 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/914f4753-0cbe-4496-b703-8dd106c06db2-internal-tls-certs\") pod \"keystone-76cc5dfd5-dcg58\" (UID: \"914f4753-0cbe-4496-b703-8dd106c06db2\") " pod="openstack/keystone-76cc5dfd5-dcg58" Oct 09 08:03:41 crc kubenswrapper[4715]: I1009 08:03:41.986803 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/914f4753-0cbe-4496-b703-8dd106c06db2-credential-keys\") pod \"keystone-76cc5dfd5-dcg58\" (UID: \"914f4753-0cbe-4496-b703-8dd106c06db2\") " pod="openstack/keystone-76cc5dfd5-dcg58" Oct 09 08:03:41 crc kubenswrapper[4715]: I1009 
08:03:41.993448 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/914f4753-0cbe-4496-b703-8dd106c06db2-internal-tls-certs\") pod \"keystone-76cc5dfd5-dcg58\" (UID: \"914f4753-0cbe-4496-b703-8dd106c06db2\") " pod="openstack/keystone-76cc5dfd5-dcg58" Oct 09 08:03:41 crc kubenswrapper[4715]: I1009 08:03:41.994305 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/914f4753-0cbe-4496-b703-8dd106c06db2-combined-ca-bundle\") pod \"keystone-76cc5dfd5-dcg58\" (UID: \"914f4753-0cbe-4496-b703-8dd106c06db2\") " pod="openstack/keystone-76cc5dfd5-dcg58" Oct 09 08:03:41 crc kubenswrapper[4715]: I1009 08:03:41.994921 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/914f4753-0cbe-4496-b703-8dd106c06db2-public-tls-certs\") pod \"keystone-76cc5dfd5-dcg58\" (UID: \"914f4753-0cbe-4496-b703-8dd106c06db2\") " pod="openstack/keystone-76cc5dfd5-dcg58" Oct 09 08:03:41 crc kubenswrapper[4715]: I1009 08:03:41.995795 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/914f4753-0cbe-4496-b703-8dd106c06db2-scripts\") pod \"keystone-76cc5dfd5-dcg58\" (UID: \"914f4753-0cbe-4496-b703-8dd106c06db2\") " pod="openstack/keystone-76cc5dfd5-dcg58" Oct 09 08:03:41 crc kubenswrapper[4715]: I1009 08:03:41.999797 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/914f4753-0cbe-4496-b703-8dd106c06db2-credential-keys\") pod \"keystone-76cc5dfd5-dcg58\" (UID: \"914f4753-0cbe-4496-b703-8dd106c06db2\") " pod="openstack/keystone-76cc5dfd5-dcg58" Oct 09 08:03:42 crc kubenswrapper[4715]: I1009 08:03:42.000543 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/914f4753-0cbe-4496-b703-8dd106c06db2-config-data\") pod \"keystone-76cc5dfd5-dcg58\" (UID: \"914f4753-0cbe-4496-b703-8dd106c06db2\") " pod="openstack/keystone-76cc5dfd5-dcg58" Oct 09 08:03:42 crc kubenswrapper[4715]: I1009 08:03:42.007106 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6qf2\" (UniqueName: \"kubernetes.io/projected/914f4753-0cbe-4496-b703-8dd106c06db2-kube-api-access-q6qf2\") pod \"keystone-76cc5dfd5-dcg58\" (UID: \"914f4753-0cbe-4496-b703-8dd106c06db2\") " pod="openstack/keystone-76cc5dfd5-dcg58" Oct 09 08:03:42 crc kubenswrapper[4715]: I1009 08:03:42.007282 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/914f4753-0cbe-4496-b703-8dd106c06db2-fernet-keys\") pod \"keystone-76cc5dfd5-dcg58\" (UID: \"914f4753-0cbe-4496-b703-8dd106c06db2\") " pod="openstack/keystone-76cc5dfd5-dcg58" Oct 09 08:03:42 crc kubenswrapper[4715]: I1009 08:03:42.084954 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-76cc5dfd5-dcg58" Oct 09 08:03:42 crc kubenswrapper[4715]: I1009 08:03:42.231538 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 09 08:03:42 crc kubenswrapper[4715]: I1009 08:03:42.231893 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 09 08:03:42 crc kubenswrapper[4715]: I1009 08:03:42.232719 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-9f4678dcc-bh6m5" Oct 09 08:03:42 crc kubenswrapper[4715]: I1009 08:03:42.232744 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 09 08:03:42 crc kubenswrapper[4715]: I1009 08:03:42.233125 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 09 08:03:44 crc kubenswrapper[4715]: I1009 08:03:44.249270 4715 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 08:03:44 crc kubenswrapper[4715]: I1009 08:03:44.249306 4715 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 08:03:44 crc kubenswrapper[4715]: I1009 08:03:44.249297 4715 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 08:03:44 crc kubenswrapper[4715]: I1009 08:03:44.249571 4715 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 09 08:03:44 crc kubenswrapper[4715]: I1009 08:03:44.351646 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 09 08:03:44 crc kubenswrapper[4715]: I1009 08:03:44.353136 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 09 08:03:44 crc kubenswrapper[4715]: I1009 08:03:44.528036 4715 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 09 08:03:44 crc kubenswrapper[4715]: I1009 08:03:44.648616 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 09 08:03:48 crc kubenswrapper[4715]: I1009 08:03:48.898973 4715 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5d9885b95b-r2cb2" podUID="ada9982a-fc5f-4c93-bfa3-3401c0824c2e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Oct 09 08:03:49 crc kubenswrapper[4715]: I1009 08:03:49.028120 4715 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-d788d6d48-5nczq" podUID="7b8b0665-2ab8-4fb9-93ff-6405324f24d5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Oct 09 08:03:54 crc kubenswrapper[4715]: E1009 08:03:54.717862 4715 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Oct 09 08:03:54 crc kubenswrapper[4715]: E1009 08:03:54.718346 4715 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kq68s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-jgbfr_openstack(2f4dee6e-f935-4bdd-9138-d414e86c0fa2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 09 08:03:54 crc kubenswrapper[4715]: E1009 08:03:54.720006 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-jgbfr" podUID="2f4dee6e-f935-4bdd-9138-d414e86c0fa2" Oct 09 08:03:55 crc kubenswrapper[4715]: E1009 08:03:55.208693 4715 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Oct 09 08:03:55 crc kubenswrapper[4715]: E1009 08:03:55.208980 4715 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kcjd4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-l7r5w_openstack(027df64c-f87d-401f-965c-88c874a854f8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 09 08:03:55 crc kubenswrapper[4715]: E1009 08:03:55.210324 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-l7r5w" 
podUID="027df64c-f87d-401f-965c-88c874a854f8" Oct 09 08:03:55 crc kubenswrapper[4715]: E1009 08:03:55.348956 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-l7r5w" podUID="027df64c-f87d-401f-965c-88c874a854f8" Oct 09 08:03:55 crc kubenswrapper[4715]: E1009 08:03:55.349322 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-jgbfr" podUID="2f4dee6e-f935-4bdd-9138-d414e86c0fa2" Oct 09 08:03:55 crc kubenswrapper[4715]: W1009 08:03:55.636090 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod914f4753_0cbe_4496_b703_8dd106c06db2.slice/crio-81cb087592b57c7fee072b3de900868e86b12ba0787eca61dbc41fc9cece8533 WatchSource:0}: Error finding container 81cb087592b57c7fee072b3de900868e86b12ba0787eca61dbc41fc9cece8533: Status 404 returned error can't find the container with id 81cb087592b57c7fee072b3de900868e86b12ba0787eca61dbc41fc9cece8533 Oct 09 08:03:55 crc kubenswrapper[4715]: I1009 08:03:55.636530 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-76cc5dfd5-dcg58"] Oct 09 08:03:56 crc kubenswrapper[4715]: I1009 08:03:56.356737 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-76cc5dfd5-dcg58" event={"ID":"914f4753-0cbe-4496-b703-8dd106c06db2","Type":"ContainerStarted","Data":"63230d1dd661b33b42ef1fdefc5e1ed901a6cc9642b34c8d624bd5eb902bbab7"} Oct 09 08:03:56 crc kubenswrapper[4715]: I1009 08:03:56.357230 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-76cc5dfd5-dcg58" event={"ID":"914f4753-0cbe-4496-b703-8dd106c06db2","Type":"ContainerStarted","Data":"81cb087592b57c7fee072b3de900868e86b12ba0787eca61dbc41fc9cece8533"} Oct 09 08:03:56 crc kubenswrapper[4715]: I1009 08:03:56.357361 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-76cc5dfd5-dcg58" Oct 09 08:03:56 crc kubenswrapper[4715]: I1009 08:03:56.358123 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4tnj8" event={"ID":"80ba490d-aaff-4579-bc8a-ffaa4924c7b7","Type":"ContainerStarted","Data":"e2be2a811cd6f5a203c67031da13097a046626e4d8037f8db0c8d330f3721c3b"} Oct 09 08:03:56 crc kubenswrapper[4715]: I1009 08:03:56.360390 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad","Type":"ContainerStarted","Data":"5ee7727a23a73319d21bce8841e18f00ef454dd6da5b2c75a2a1efbae66ea2f9"} Oct 09 08:03:56 crc kubenswrapper[4715]: I1009 08:03:56.382626 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-76cc5dfd5-dcg58" podStartSLOduration=15.382608254 podStartE2EDuration="15.382608254s" podCreationTimestamp="2025-10-09 08:03:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:03:56.377447544 +0000 UTC m=+1067.070251562" watchObservedRunningTime="2025-10-09 08:03:56.382608254 +0000 UTC m=+1067.075412262" Oct 09 08:03:56 crc kubenswrapper[4715]: I1009 08:03:56.397144 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-4tnj8" podStartSLOduration=2.687203839 podStartE2EDuration="49.397125568s" podCreationTimestamp="2025-10-09 08:03:07 +0000 UTC" firstStartedPulling="2025-10-09 08:03:08.546556982 +0000 UTC m=+1019.239360990" lastFinishedPulling="2025-10-09 08:03:55.256478711 +0000 UTC m=+1065.949282719" 
observedRunningTime="2025-10-09 08:03:56.393483172 +0000 UTC m=+1067.086287180" watchObservedRunningTime="2025-10-09 08:03:56.397125568 +0000 UTC m=+1067.089929576" Oct 09 08:03:57 crc kubenswrapper[4715]: I1009 08:03:57.371312 4715 generic.go:334] "Generic (PLEG): container finished" podID="80ba490d-aaff-4579-bc8a-ffaa4924c7b7" containerID="e2be2a811cd6f5a203c67031da13097a046626e4d8037f8db0c8d330f3721c3b" exitCode=0 Oct 09 08:03:57 crc kubenswrapper[4715]: I1009 08:03:57.371392 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4tnj8" event={"ID":"80ba490d-aaff-4579-bc8a-ffaa4924c7b7","Type":"ContainerDied","Data":"e2be2a811cd6f5a203c67031da13097a046626e4d8037f8db0c8d330f3721c3b"} Oct 09 08:04:00 crc kubenswrapper[4715]: I1009 08:04:00.718924 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5d9885b95b-r2cb2" Oct 09 08:04:00 crc kubenswrapper[4715]: I1009 08:04:00.788851 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-4tnj8" Oct 09 08:04:00 crc kubenswrapper[4715]: I1009 08:04:00.867121 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80ba490d-aaff-4579-bc8a-ffaa4924c7b7-scripts\") pod \"80ba490d-aaff-4579-bc8a-ffaa4924c7b7\" (UID: \"80ba490d-aaff-4579-bc8a-ffaa4924c7b7\") " Oct 09 08:04:00 crc kubenswrapper[4715]: I1009 08:04:00.867187 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80ba490d-aaff-4579-bc8a-ffaa4924c7b7-config-data\") pod \"80ba490d-aaff-4579-bc8a-ffaa4924c7b7\" (UID: \"80ba490d-aaff-4579-bc8a-ffaa4924c7b7\") " Oct 09 08:04:00 crc kubenswrapper[4715]: I1009 08:04:00.867265 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80ba490d-aaff-4579-bc8a-ffaa4924c7b7-logs\") pod \"80ba490d-aaff-4579-bc8a-ffaa4924c7b7\" (UID: \"80ba490d-aaff-4579-bc8a-ffaa4924c7b7\") " Oct 09 08:04:00 crc kubenswrapper[4715]: I1009 08:04:00.867342 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80ba490d-aaff-4579-bc8a-ffaa4924c7b7-combined-ca-bundle\") pod \"80ba490d-aaff-4579-bc8a-ffaa4924c7b7\" (UID: \"80ba490d-aaff-4579-bc8a-ffaa4924c7b7\") " Oct 09 08:04:00 crc kubenswrapper[4715]: I1009 08:04:00.867399 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkwgr\" (UniqueName: \"kubernetes.io/projected/80ba490d-aaff-4579-bc8a-ffaa4924c7b7-kube-api-access-nkwgr\") pod \"80ba490d-aaff-4579-bc8a-ffaa4924c7b7\" (UID: \"80ba490d-aaff-4579-bc8a-ffaa4924c7b7\") " Oct 09 08:04:00 crc kubenswrapper[4715]: I1009 08:04:00.867863 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/80ba490d-aaff-4579-bc8a-ffaa4924c7b7-logs" (OuterVolumeSpecName: "logs") pod "80ba490d-aaff-4579-bc8a-ffaa4924c7b7" (UID: "80ba490d-aaff-4579-bc8a-ffaa4924c7b7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:04:00 crc kubenswrapper[4715]: I1009 08:04:00.876840 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80ba490d-aaff-4579-bc8a-ffaa4924c7b7-scripts" (OuterVolumeSpecName: "scripts") pod "80ba490d-aaff-4579-bc8a-ffaa4924c7b7" (UID: "80ba490d-aaff-4579-bc8a-ffaa4924c7b7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:00 crc kubenswrapper[4715]: I1009 08:04:00.884696 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80ba490d-aaff-4579-bc8a-ffaa4924c7b7-kube-api-access-nkwgr" (OuterVolumeSpecName: "kube-api-access-nkwgr") pod "80ba490d-aaff-4579-bc8a-ffaa4924c7b7" (UID: "80ba490d-aaff-4579-bc8a-ffaa4924c7b7"). InnerVolumeSpecName "kube-api-access-nkwgr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:04:00 crc kubenswrapper[4715]: I1009 08:04:00.894356 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80ba490d-aaff-4579-bc8a-ffaa4924c7b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "80ba490d-aaff-4579-bc8a-ffaa4924c7b7" (UID: "80ba490d-aaff-4579-bc8a-ffaa4924c7b7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:00 crc kubenswrapper[4715]: I1009 08:04:00.899566 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80ba490d-aaff-4579-bc8a-ffaa4924c7b7-config-data" (OuterVolumeSpecName: "config-data") pod "80ba490d-aaff-4579-bc8a-ffaa4924c7b7" (UID: "80ba490d-aaff-4579-bc8a-ffaa4924c7b7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:00 crc kubenswrapper[4715]: I1009 08:04:00.969538 4715 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80ba490d-aaff-4579-bc8a-ffaa4924c7b7-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:00 crc kubenswrapper[4715]: I1009 08:04:00.969573 4715 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80ba490d-aaff-4579-bc8a-ffaa4924c7b7-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:00 crc kubenswrapper[4715]: I1009 08:04:00.969585 4715 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80ba490d-aaff-4579-bc8a-ffaa4924c7b7-logs\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:00 crc kubenswrapper[4715]: I1009 08:04:00.969594 4715 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80ba490d-aaff-4579-bc8a-ffaa4924c7b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:00 crc kubenswrapper[4715]: I1009 08:04:00.969605 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkwgr\" (UniqueName: \"kubernetes.io/projected/80ba490d-aaff-4579-bc8a-ffaa4924c7b7-kube-api-access-nkwgr\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:01 crc kubenswrapper[4715]: I1009 08:04:01.073650 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-d788d6d48-5nczq" Oct 09 08:04:01 crc kubenswrapper[4715]: I1009 08:04:01.411743 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4tnj8" event={"ID":"80ba490d-aaff-4579-bc8a-ffaa4924c7b7","Type":"ContainerDied","Data":"07c89b039feb8a8c7c145de8622119db7701e69b90a72e72f5e211fef56b57e6"} Oct 09 08:04:01 crc kubenswrapper[4715]: I1009 08:04:01.411824 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-4tnj8" Oct 09 08:04:01 crc kubenswrapper[4715]: I1009 08:04:01.411841 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07c89b039feb8a8c7c145de8622119db7701e69b90a72e72f5e211fef56b57e6" Oct 09 08:04:01 crc kubenswrapper[4715]: I1009 08:04:01.932973 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6985f6958d-wwgg5"] Oct 09 08:04:01 crc kubenswrapper[4715]: E1009 08:04:01.933896 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80ba490d-aaff-4579-bc8a-ffaa4924c7b7" containerName="placement-db-sync" Oct 09 08:04:01 crc kubenswrapper[4715]: I1009 08:04:01.933926 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="80ba490d-aaff-4579-bc8a-ffaa4924c7b7" containerName="placement-db-sync" Oct 09 08:04:01 crc kubenswrapper[4715]: I1009 08:04:01.934335 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="80ba490d-aaff-4579-bc8a-ffaa4924c7b7" containerName="placement-db-sync" Oct 09 08:04:01 crc kubenswrapper[4715]: I1009 08:04:01.937739 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6985f6958d-wwgg5" Oct 09 08:04:01 crc kubenswrapper[4715]: I1009 08:04:01.945127 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 09 08:04:01 crc kubenswrapper[4715]: I1009 08:04:01.945373 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 09 08:04:01 crc kubenswrapper[4715]: I1009 08:04:01.945562 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-gr7d6" Oct 09 08:04:01 crc kubenswrapper[4715]: I1009 08:04:01.946439 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 09 08:04:01 crc kubenswrapper[4715]: I1009 08:04:01.947184 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 09 08:04:01 crc kubenswrapper[4715]: I1009 08:04:01.951830 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6985f6958d-wwgg5"] Oct 09 08:04:01 crc kubenswrapper[4715]: I1009 08:04:01.987861 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45254b90-e09e-425e-b7c2-123813b82b37-combined-ca-bundle\") pod \"placement-6985f6958d-wwgg5\" (UID: \"45254b90-e09e-425e-b7c2-123813b82b37\") " pod="openstack/placement-6985f6958d-wwgg5" Oct 09 08:04:01 crc kubenswrapper[4715]: I1009 08:04:01.987910 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/45254b90-e09e-425e-b7c2-123813b82b37-public-tls-certs\") pod \"placement-6985f6958d-wwgg5\" (UID: \"45254b90-e09e-425e-b7c2-123813b82b37\") " pod="openstack/placement-6985f6958d-wwgg5" Oct 09 08:04:01 crc kubenswrapper[4715]: I1009 08:04:01.987943 4715 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45254b90-e09e-425e-b7c2-123813b82b37-internal-tls-certs\") pod \"placement-6985f6958d-wwgg5\" (UID: \"45254b90-e09e-425e-b7c2-123813b82b37\") " pod="openstack/placement-6985f6958d-wwgg5" Oct 09 08:04:01 crc kubenswrapper[4715]: I1009 08:04:01.988010 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqhhl\" (UniqueName: \"kubernetes.io/projected/45254b90-e09e-425e-b7c2-123813b82b37-kube-api-access-zqhhl\") pod \"placement-6985f6958d-wwgg5\" (UID: \"45254b90-e09e-425e-b7c2-123813b82b37\") " pod="openstack/placement-6985f6958d-wwgg5" Oct 09 08:04:01 crc kubenswrapper[4715]: I1009 08:04:01.988032 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45254b90-e09e-425e-b7c2-123813b82b37-scripts\") pod \"placement-6985f6958d-wwgg5\" (UID: \"45254b90-e09e-425e-b7c2-123813b82b37\") " pod="openstack/placement-6985f6958d-wwgg5" Oct 09 08:04:01 crc kubenswrapper[4715]: I1009 08:04:01.988070 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45254b90-e09e-425e-b7c2-123813b82b37-logs\") pod \"placement-6985f6958d-wwgg5\" (UID: \"45254b90-e09e-425e-b7c2-123813b82b37\") " pod="openstack/placement-6985f6958d-wwgg5" Oct 09 08:04:01 crc kubenswrapper[4715]: I1009 08:04:01.988090 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45254b90-e09e-425e-b7c2-123813b82b37-config-data\") pod \"placement-6985f6958d-wwgg5\" (UID: \"45254b90-e09e-425e-b7c2-123813b82b37\") " pod="openstack/placement-6985f6958d-wwgg5" Oct 09 08:04:02 crc kubenswrapper[4715]: I1009 08:04:02.089381 4715 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zqhhl\" (UniqueName: \"kubernetes.io/projected/45254b90-e09e-425e-b7c2-123813b82b37-kube-api-access-zqhhl\") pod \"placement-6985f6958d-wwgg5\" (UID: \"45254b90-e09e-425e-b7c2-123813b82b37\") " pod="openstack/placement-6985f6958d-wwgg5" Oct 09 08:04:02 crc kubenswrapper[4715]: I1009 08:04:02.089450 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45254b90-e09e-425e-b7c2-123813b82b37-scripts\") pod \"placement-6985f6958d-wwgg5\" (UID: \"45254b90-e09e-425e-b7c2-123813b82b37\") " pod="openstack/placement-6985f6958d-wwgg5" Oct 09 08:04:02 crc kubenswrapper[4715]: I1009 08:04:02.089501 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45254b90-e09e-425e-b7c2-123813b82b37-logs\") pod \"placement-6985f6958d-wwgg5\" (UID: \"45254b90-e09e-425e-b7c2-123813b82b37\") " pod="openstack/placement-6985f6958d-wwgg5" Oct 09 08:04:02 crc kubenswrapper[4715]: I1009 08:04:02.089524 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45254b90-e09e-425e-b7c2-123813b82b37-config-data\") pod \"placement-6985f6958d-wwgg5\" (UID: \"45254b90-e09e-425e-b7c2-123813b82b37\") " pod="openstack/placement-6985f6958d-wwgg5" Oct 09 08:04:02 crc kubenswrapper[4715]: I1009 08:04:02.089570 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45254b90-e09e-425e-b7c2-123813b82b37-combined-ca-bundle\") pod \"placement-6985f6958d-wwgg5\" (UID: \"45254b90-e09e-425e-b7c2-123813b82b37\") " pod="openstack/placement-6985f6958d-wwgg5" Oct 09 08:04:02 crc kubenswrapper[4715]: I1009 08:04:02.089592 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/45254b90-e09e-425e-b7c2-123813b82b37-public-tls-certs\") pod \"placement-6985f6958d-wwgg5\" (UID: \"45254b90-e09e-425e-b7c2-123813b82b37\") " pod="openstack/placement-6985f6958d-wwgg5" Oct 09 08:04:02 crc kubenswrapper[4715]: I1009 08:04:02.089627 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45254b90-e09e-425e-b7c2-123813b82b37-internal-tls-certs\") pod \"placement-6985f6958d-wwgg5\" (UID: \"45254b90-e09e-425e-b7c2-123813b82b37\") " pod="openstack/placement-6985f6958d-wwgg5" Oct 09 08:04:02 crc kubenswrapper[4715]: I1009 08:04:02.094604 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45254b90-e09e-425e-b7c2-123813b82b37-logs\") pod \"placement-6985f6958d-wwgg5\" (UID: \"45254b90-e09e-425e-b7c2-123813b82b37\") " pod="openstack/placement-6985f6958d-wwgg5" Oct 09 08:04:02 crc kubenswrapper[4715]: I1009 08:04:02.095882 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45254b90-e09e-425e-b7c2-123813b82b37-combined-ca-bundle\") pod \"placement-6985f6958d-wwgg5\" (UID: \"45254b90-e09e-425e-b7c2-123813b82b37\") " pod="openstack/placement-6985f6958d-wwgg5" Oct 09 08:04:02 crc kubenswrapper[4715]: I1009 08:04:02.097944 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/45254b90-e09e-425e-b7c2-123813b82b37-public-tls-certs\") pod \"placement-6985f6958d-wwgg5\" (UID: \"45254b90-e09e-425e-b7c2-123813b82b37\") " pod="openstack/placement-6985f6958d-wwgg5" Oct 09 08:04:02 crc kubenswrapper[4715]: I1009 08:04:02.099204 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45254b90-e09e-425e-b7c2-123813b82b37-internal-tls-certs\") pod 
\"placement-6985f6958d-wwgg5\" (UID: \"45254b90-e09e-425e-b7c2-123813b82b37\") " pod="openstack/placement-6985f6958d-wwgg5" Oct 09 08:04:02 crc kubenswrapper[4715]: I1009 08:04:02.107195 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45254b90-e09e-425e-b7c2-123813b82b37-config-data\") pod \"placement-6985f6958d-wwgg5\" (UID: \"45254b90-e09e-425e-b7c2-123813b82b37\") " pod="openstack/placement-6985f6958d-wwgg5" Oct 09 08:04:02 crc kubenswrapper[4715]: I1009 08:04:02.118007 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45254b90-e09e-425e-b7c2-123813b82b37-scripts\") pod \"placement-6985f6958d-wwgg5\" (UID: \"45254b90-e09e-425e-b7c2-123813b82b37\") " pod="openstack/placement-6985f6958d-wwgg5" Oct 09 08:04:02 crc kubenswrapper[4715]: I1009 08:04:02.130763 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqhhl\" (UniqueName: \"kubernetes.io/projected/45254b90-e09e-425e-b7c2-123813b82b37-kube-api-access-zqhhl\") pod \"placement-6985f6958d-wwgg5\" (UID: \"45254b90-e09e-425e-b7c2-123813b82b37\") " pod="openstack/placement-6985f6958d-wwgg5" Oct 09 08:04:02 crc kubenswrapper[4715]: I1009 08:04:02.278902 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6985f6958d-wwgg5" Oct 09 08:04:02 crc kubenswrapper[4715]: I1009 08:04:02.461002 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5d9885b95b-r2cb2" Oct 09 08:04:03 crc kubenswrapper[4715]: I1009 08:04:03.068123 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-d788d6d48-5nczq" Oct 09 08:04:03 crc kubenswrapper[4715]: I1009 08:04:03.142613 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5d9885b95b-r2cb2"] Oct 09 08:04:03 crc kubenswrapper[4715]: I1009 08:04:03.449480 4715 generic.go:334] "Generic (PLEG): container finished" podID="347422fd-611a-4d6e-bccb-031c6f308b5f" containerID="d060216fc887a09febcae2fc6caf87f7b8c0148fb07de6cbaf52d75ba0cb9504" exitCode=137 Oct 09 08:04:03 crc kubenswrapper[4715]: I1009 08:04:03.450039 4715 generic.go:334] "Generic (PLEG): container finished" podID="347422fd-611a-4d6e-bccb-031c6f308b5f" containerID="ddab14ccb5f324df95c91360731e39410a5dd9dbe1153aa8f7dc965ce6c09a86" exitCode=137 Oct 09 08:04:03 crc kubenswrapper[4715]: I1009 08:04:03.449983 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9f4678dcc-bh6m5" event={"ID":"347422fd-611a-4d6e-bccb-031c6f308b5f","Type":"ContainerDied","Data":"d060216fc887a09febcae2fc6caf87f7b8c0148fb07de6cbaf52d75ba0cb9504"} Oct 09 08:04:03 crc kubenswrapper[4715]: I1009 08:04:03.450134 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9f4678dcc-bh6m5" event={"ID":"347422fd-611a-4d6e-bccb-031c6f308b5f","Type":"ContainerDied","Data":"ddab14ccb5f324df95c91360731e39410a5dd9dbe1153aa8f7dc965ce6c09a86"} Oct 09 08:04:03 crc kubenswrapper[4715]: I1009 08:04:03.450253 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5d9885b95b-r2cb2" podUID="ada9982a-fc5f-4c93-bfa3-3401c0824c2e" containerName="horizon-log" 
containerID="cri-o://30df9f4b9017305ddf0ba538bc4401ae44d2c84af182eb159fea1745dd9773d7" gracePeriod=30 Oct 09 08:04:03 crc kubenswrapper[4715]: I1009 08:04:03.450579 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5d9885b95b-r2cb2" podUID="ada9982a-fc5f-4c93-bfa3-3401c0824c2e" containerName="horizon" containerID="cri-o://a38b8e8bdafd561f301d7de3849df03039ec334d2e9911426537e38b58e93a3e" gracePeriod=30 Oct 09 08:04:03 crc kubenswrapper[4715]: I1009 08:04:03.541180 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9f4678dcc-bh6m5" Oct 09 08:04:03 crc kubenswrapper[4715]: I1009 08:04:03.722034 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/347422fd-611a-4d6e-bccb-031c6f308b5f-horizon-secret-key\") pod \"347422fd-611a-4d6e-bccb-031c6f308b5f\" (UID: \"347422fd-611a-4d6e-bccb-031c6f308b5f\") " Oct 09 08:04:03 crc kubenswrapper[4715]: I1009 08:04:03.722100 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/347422fd-611a-4d6e-bccb-031c6f308b5f-config-data\") pod \"347422fd-611a-4d6e-bccb-031c6f308b5f\" (UID: \"347422fd-611a-4d6e-bccb-031c6f308b5f\") " Oct 09 08:04:03 crc kubenswrapper[4715]: I1009 08:04:03.722123 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/347422fd-611a-4d6e-bccb-031c6f308b5f-scripts\") pod \"347422fd-611a-4d6e-bccb-031c6f308b5f\" (UID: \"347422fd-611a-4d6e-bccb-031c6f308b5f\") " Oct 09 08:04:03 crc kubenswrapper[4715]: I1009 08:04:03.722230 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlhw2\" (UniqueName: \"kubernetes.io/projected/347422fd-611a-4d6e-bccb-031c6f308b5f-kube-api-access-nlhw2\") pod 
\"347422fd-611a-4d6e-bccb-031c6f308b5f\" (UID: \"347422fd-611a-4d6e-bccb-031c6f308b5f\") " Oct 09 08:04:03 crc kubenswrapper[4715]: I1009 08:04:03.722340 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/347422fd-611a-4d6e-bccb-031c6f308b5f-logs\") pod \"347422fd-611a-4d6e-bccb-031c6f308b5f\" (UID: \"347422fd-611a-4d6e-bccb-031c6f308b5f\") " Oct 09 08:04:03 crc kubenswrapper[4715]: I1009 08:04:03.723196 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/347422fd-611a-4d6e-bccb-031c6f308b5f-logs" (OuterVolumeSpecName: "logs") pod "347422fd-611a-4d6e-bccb-031c6f308b5f" (UID: "347422fd-611a-4d6e-bccb-031c6f308b5f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:04:03 crc kubenswrapper[4715]: I1009 08:04:03.728059 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/347422fd-611a-4d6e-bccb-031c6f308b5f-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "347422fd-611a-4d6e-bccb-031c6f308b5f" (UID: "347422fd-611a-4d6e-bccb-031c6f308b5f"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:03 crc kubenswrapper[4715]: I1009 08:04:03.728114 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/347422fd-611a-4d6e-bccb-031c6f308b5f-kube-api-access-nlhw2" (OuterVolumeSpecName: "kube-api-access-nlhw2") pod "347422fd-611a-4d6e-bccb-031c6f308b5f" (UID: "347422fd-611a-4d6e-bccb-031c6f308b5f"). InnerVolumeSpecName "kube-api-access-nlhw2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:04:03 crc kubenswrapper[4715]: I1009 08:04:03.750453 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/347422fd-611a-4d6e-bccb-031c6f308b5f-config-data" (OuterVolumeSpecName: "config-data") pod "347422fd-611a-4d6e-bccb-031c6f308b5f" (UID: "347422fd-611a-4d6e-bccb-031c6f308b5f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:04:03 crc kubenswrapper[4715]: I1009 08:04:03.751853 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/347422fd-611a-4d6e-bccb-031c6f308b5f-scripts" (OuterVolumeSpecName: "scripts") pod "347422fd-611a-4d6e-bccb-031c6f308b5f" (UID: "347422fd-611a-4d6e-bccb-031c6f308b5f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:04:03 crc kubenswrapper[4715]: I1009 08:04:03.776053 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6985f6958d-wwgg5"] Oct 09 08:04:03 crc kubenswrapper[4715]: W1009 08:04:03.786055 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45254b90_e09e_425e_b7c2_123813b82b37.slice/crio-6ac4f905e246008ca6385977d00fe28b85a7f63e8e3b546c7da6f0f594998de3 WatchSource:0}: Error finding container 6ac4f905e246008ca6385977d00fe28b85a7f63e8e3b546c7da6f0f594998de3: Status 404 returned error can't find the container with id 6ac4f905e246008ca6385977d00fe28b85a7f63e8e3b546c7da6f0f594998de3 Oct 09 08:04:03 crc kubenswrapper[4715]: I1009 08:04:03.824017 4715 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/347422fd-611a-4d6e-bccb-031c6f308b5f-logs\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:03 crc kubenswrapper[4715]: I1009 08:04:03.824060 4715 reconciler_common.go:293] "Volume detached for volume 
\"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/347422fd-611a-4d6e-bccb-031c6f308b5f-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:03 crc kubenswrapper[4715]: I1009 08:04:03.824075 4715 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/347422fd-611a-4d6e-bccb-031c6f308b5f-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:03 crc kubenswrapper[4715]: I1009 08:04:03.824086 4715 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/347422fd-611a-4d6e-bccb-031c6f308b5f-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:03 crc kubenswrapper[4715]: I1009 08:04:03.824096 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlhw2\" (UniqueName: \"kubernetes.io/projected/347422fd-611a-4d6e-bccb-031c6f308b5f-kube-api-access-nlhw2\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:04 crc kubenswrapper[4715]: I1009 08:04:04.463617 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad" containerName="ceilometer-central-agent" containerID="cri-o://00d85478a6b43d631daeadc593fcad13de7b583f5346a00bccae21ba3746ee0a" gracePeriod=30 Oct 09 08:04:04 crc kubenswrapper[4715]: I1009 08:04:04.463652 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad" containerName="proxy-httpd" containerID="cri-o://cc0326aae48e70309fd15d662feb316608ef3353477ed329b10b722b153e7561" gracePeriod=30 Oct 09 08:04:04 crc kubenswrapper[4715]: I1009 08:04:04.463667 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad" containerName="sg-core" containerID="cri-o://5ee7727a23a73319d21bce8841e18f00ef454dd6da5b2c75a2a1efbae66ea2f9" gracePeriod=30 Oct 09 
08:04:04 crc kubenswrapper[4715]: I1009 08:04:04.463546 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad","Type":"ContainerStarted","Data":"cc0326aae48e70309fd15d662feb316608ef3353477ed329b10b722b153e7561"} Oct 09 08:04:04 crc kubenswrapper[4715]: I1009 08:04:04.464213 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 09 08:04:04 crc kubenswrapper[4715]: I1009 08:04:04.463715 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad" containerName="ceilometer-notification-agent" containerID="cri-o://afec23b6411a3a1f8e57fba677626d2a491380ce461e4852aa47d9ac29d7a958" gracePeriod=30 Oct 09 08:04:04 crc kubenswrapper[4715]: I1009 08:04:04.468739 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9f4678dcc-bh6m5" Oct 09 08:04:04 crc kubenswrapper[4715]: I1009 08:04:04.469609 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9f4678dcc-bh6m5" event={"ID":"347422fd-611a-4d6e-bccb-031c6f308b5f","Type":"ContainerDied","Data":"515932b11c9ed1c5d9b48a75f3c8e0059ba021512a4d817938ce824b32d16656"} Oct 09 08:04:04 crc kubenswrapper[4715]: I1009 08:04:04.469645 4715 scope.go:117] "RemoveContainer" containerID="d060216fc887a09febcae2fc6caf87f7b8c0148fb07de6cbaf52d75ba0cb9504" Oct 09 08:04:04 crc kubenswrapper[4715]: I1009 08:04:04.474487 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6985f6958d-wwgg5" event={"ID":"45254b90-e09e-425e-b7c2-123813b82b37","Type":"ContainerStarted","Data":"d77fe783c20c8735177e49c16ec54c57d39e53b4369bb1fd98957816823ab36a"} Oct 09 08:04:04 crc kubenswrapper[4715]: I1009 08:04:04.474535 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6985f6958d-wwgg5" 
event={"ID":"45254b90-e09e-425e-b7c2-123813b82b37","Type":"ContainerStarted","Data":"8e7b41fb7750d5fbc64df791a739836a262c7bede411d66f7148657c959fda35"} Oct 09 08:04:04 crc kubenswrapper[4715]: I1009 08:04:04.474550 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6985f6958d-wwgg5" event={"ID":"45254b90-e09e-425e-b7c2-123813b82b37","Type":"ContainerStarted","Data":"6ac4f905e246008ca6385977d00fe28b85a7f63e8e3b546c7da6f0f594998de3"} Oct 09 08:04:04 crc kubenswrapper[4715]: I1009 08:04:04.474709 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6985f6958d-wwgg5" Oct 09 08:04:04 crc kubenswrapper[4715]: I1009 08:04:04.474748 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6985f6958d-wwgg5" Oct 09 08:04:04 crc kubenswrapper[4715]: I1009 08:04:04.485447 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.475136482 podStartE2EDuration="58.485432246s" podCreationTimestamp="2025-10-09 08:03:06 +0000 UTC" firstStartedPulling="2025-10-09 08:03:08.275073184 +0000 UTC m=+1018.967877192" lastFinishedPulling="2025-10-09 08:04:03.285368948 +0000 UTC m=+1073.978172956" observedRunningTime="2025-10-09 08:04:04.483763488 +0000 UTC m=+1075.176567496" watchObservedRunningTime="2025-10-09 08:04:04.485432246 +0000 UTC m=+1075.178236254" Oct 09 08:04:04 crc kubenswrapper[4715]: I1009 08:04:04.508180 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-9f4678dcc-bh6m5"] Oct 09 08:04:04 crc kubenswrapper[4715]: I1009 08:04:04.518140 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-9f4678dcc-bh6m5"] Oct 09 08:04:04 crc kubenswrapper[4715]: I1009 08:04:04.525755 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6985f6958d-wwgg5" podStartSLOduration=3.525731112 podStartE2EDuration="3.525731112s" 
podCreationTimestamp="2025-10-09 08:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:04:04.518530312 +0000 UTC m=+1075.211334340" watchObservedRunningTime="2025-10-09 08:04:04.525731112 +0000 UTC m=+1075.218535120" Oct 09 08:04:04 crc kubenswrapper[4715]: I1009 08:04:04.639564 4715 scope.go:117] "RemoveContainer" containerID="ddab14ccb5f324df95c91360731e39410a5dd9dbe1153aa8f7dc965ce6c09a86" Oct 09 08:04:05 crc kubenswrapper[4715]: I1009 08:04:05.486977 4715 generic.go:334] "Generic (PLEG): container finished" podID="9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad" containerID="cc0326aae48e70309fd15d662feb316608ef3353477ed329b10b722b153e7561" exitCode=0 Oct 09 08:04:05 crc kubenswrapper[4715]: I1009 08:04:05.487014 4715 generic.go:334] "Generic (PLEG): container finished" podID="9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad" containerID="5ee7727a23a73319d21bce8841e18f00ef454dd6da5b2c75a2a1efbae66ea2f9" exitCode=2 Oct 09 08:04:05 crc kubenswrapper[4715]: I1009 08:04:05.487026 4715 generic.go:334] "Generic (PLEG): container finished" podID="9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad" containerID="00d85478a6b43d631daeadc593fcad13de7b583f5346a00bccae21ba3746ee0a" exitCode=0 Oct 09 08:04:05 crc kubenswrapper[4715]: I1009 08:04:05.487082 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad","Type":"ContainerDied","Data":"cc0326aae48e70309fd15d662feb316608ef3353477ed329b10b722b153e7561"} Oct 09 08:04:05 crc kubenswrapper[4715]: I1009 08:04:05.487138 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad","Type":"ContainerDied","Data":"5ee7727a23a73319d21bce8841e18f00ef454dd6da5b2c75a2a1efbae66ea2f9"} Oct 09 08:04:05 crc kubenswrapper[4715]: I1009 08:04:05.487155 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad","Type":"ContainerDied","Data":"00d85478a6b43d631daeadc593fcad13de7b583f5346a00bccae21ba3746ee0a"} Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.058588 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.151273 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="347422fd-611a-4d6e-bccb-031c6f308b5f" path="/var/lib/kubelet/pods/347422fd-611a-4d6e-bccb-031c6f308b5f/volumes" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.159443 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7b4l\" (UniqueName: \"kubernetes.io/projected/9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad-kube-api-access-g7b4l\") pod \"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad\" (UID: \"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad\") " Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.159523 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad-combined-ca-bundle\") pod \"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad\" (UID: \"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad\") " Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.159594 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad-run-httpd\") pod \"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad\" (UID: \"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad\") " Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.159624 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad-sg-core-conf-yaml\") pod \"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad\" (UID: 
\"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad\") " Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.159817 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad-log-httpd\") pod \"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad\" (UID: \"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad\") " Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.159865 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad-scripts\") pod \"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad\" (UID: \"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad\") " Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.159908 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad-config-data\") pod \"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad\" (UID: \"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad\") " Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.160578 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad" (UID: "9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.160851 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad" (UID: "9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.161228 4715 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.161249 4715 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.164993 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad-kube-api-access-g7b4l" (OuterVolumeSpecName: "kube-api-access-g7b4l") pod "9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad" (UID: "9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad"). InnerVolumeSpecName "kube-api-access-g7b4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.174684 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad-scripts" (OuterVolumeSpecName: "scripts") pod "9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad" (UID: "9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.185226 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad" (UID: "9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.224237 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad" (UID: "9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.248107 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad-config-data" (OuterVolumeSpecName: "config-data") pod "9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad" (UID: "9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.263487 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7b4l\" (UniqueName: \"kubernetes.io/projected/9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad-kube-api-access-g7b4l\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.263527 4715 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.263540 4715 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.263553 4715 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad-scripts\") on node \"crc\" DevicePath 
\"\"" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.263563 4715 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.504100 4715 generic.go:334] "Generic (PLEG): container finished" podID="9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad" containerID="afec23b6411a3a1f8e57fba677626d2a491380ce461e4852aa47d9ac29d7a958" exitCode=0 Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.504149 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad","Type":"ContainerDied","Data":"afec23b6411a3a1f8e57fba677626d2a491380ce461e4852aa47d9ac29d7a958"} Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.504179 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad","Type":"ContainerDied","Data":"405c0a6fcd0c6ba5702e86c8bbb90ab12e824d86a7b1bf82d07dec80003ffbfd"} Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.504201 4715 scope.go:117] "RemoveContainer" containerID="cc0326aae48e70309fd15d662feb316608ef3353477ed329b10b722b153e7561" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.504202 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.522277 4715 scope.go:117] "RemoveContainer" containerID="5ee7727a23a73319d21bce8841e18f00ef454dd6da5b2c75a2a1efbae66ea2f9" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.546749 4715 scope.go:117] "RemoveContainer" containerID="afec23b6411a3a1f8e57fba677626d2a491380ce461e4852aa47d9ac29d7a958" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.555510 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.572654 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.591993 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:04:06 crc kubenswrapper[4715]: E1009 08:04:06.592443 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="347422fd-611a-4d6e-bccb-031c6f308b5f" containerName="horizon" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.592463 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="347422fd-611a-4d6e-bccb-031c6f308b5f" containerName="horizon" Oct 09 08:04:06 crc kubenswrapper[4715]: E1009 08:04:06.592484 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad" containerName="proxy-httpd" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.592492 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad" containerName="proxy-httpd" Oct 09 08:04:06 crc kubenswrapper[4715]: E1009 08:04:06.592512 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="347422fd-611a-4d6e-bccb-031c6f308b5f" containerName="horizon-log" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.592520 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="347422fd-611a-4d6e-bccb-031c6f308b5f" 
containerName="horizon-log" Oct 09 08:04:06 crc kubenswrapper[4715]: E1009 08:04:06.592541 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad" containerName="sg-core" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.592549 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad" containerName="sg-core" Oct 09 08:04:06 crc kubenswrapper[4715]: E1009 08:04:06.592569 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad" containerName="ceilometer-central-agent" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.592577 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad" containerName="ceilometer-central-agent" Oct 09 08:04:06 crc kubenswrapper[4715]: E1009 08:04:06.592596 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad" containerName="ceilometer-notification-agent" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.592604 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad" containerName="ceilometer-notification-agent" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.592799 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad" containerName="proxy-httpd" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.592812 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad" containerName="ceilometer-notification-agent" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.592825 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="347422fd-611a-4d6e-bccb-031c6f308b5f" containerName="horizon-log" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.592840 4715 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad" containerName="sg-core" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.592855 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="347422fd-611a-4d6e-bccb-031c6f308b5f" containerName="horizon" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.592868 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad" containerName="ceilometer-central-agent" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.594891 4715 scope.go:117] "RemoveContainer" containerID="00d85478a6b43d631daeadc593fcad13de7b583f5346a00bccae21ba3746ee0a" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.595320 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.598989 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.599148 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.601794 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.626345 4715 scope.go:117] "RemoveContainer" containerID="cc0326aae48e70309fd15d662feb316608ef3353477ed329b10b722b153e7561" Oct 09 08:04:06 crc kubenswrapper[4715]: E1009 08:04:06.628506 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc0326aae48e70309fd15d662feb316608ef3353477ed329b10b722b153e7561\": container with ID starting with cc0326aae48e70309fd15d662feb316608ef3353477ed329b10b722b153e7561 not found: ID does not exist" containerID="cc0326aae48e70309fd15d662feb316608ef3353477ed329b10b722b153e7561" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 
08:04:06.628555 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc0326aae48e70309fd15d662feb316608ef3353477ed329b10b722b153e7561"} err="failed to get container status \"cc0326aae48e70309fd15d662feb316608ef3353477ed329b10b722b153e7561\": rpc error: code = NotFound desc = could not find container \"cc0326aae48e70309fd15d662feb316608ef3353477ed329b10b722b153e7561\": container with ID starting with cc0326aae48e70309fd15d662feb316608ef3353477ed329b10b722b153e7561 not found: ID does not exist" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.628582 4715 scope.go:117] "RemoveContainer" containerID="5ee7727a23a73319d21bce8841e18f00ef454dd6da5b2c75a2a1efbae66ea2f9" Oct 09 08:04:06 crc kubenswrapper[4715]: E1009 08:04:06.628885 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ee7727a23a73319d21bce8841e18f00ef454dd6da5b2c75a2a1efbae66ea2f9\": container with ID starting with 5ee7727a23a73319d21bce8841e18f00ef454dd6da5b2c75a2a1efbae66ea2f9 not found: ID does not exist" containerID="5ee7727a23a73319d21bce8841e18f00ef454dd6da5b2c75a2a1efbae66ea2f9" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.628910 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ee7727a23a73319d21bce8841e18f00ef454dd6da5b2c75a2a1efbae66ea2f9"} err="failed to get container status \"5ee7727a23a73319d21bce8841e18f00ef454dd6da5b2c75a2a1efbae66ea2f9\": rpc error: code = NotFound desc = could not find container \"5ee7727a23a73319d21bce8841e18f00ef454dd6da5b2c75a2a1efbae66ea2f9\": container with ID starting with 5ee7727a23a73319d21bce8841e18f00ef454dd6da5b2c75a2a1efbae66ea2f9 not found: ID does not exist" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.628924 4715 scope.go:117] "RemoveContainer" containerID="afec23b6411a3a1f8e57fba677626d2a491380ce461e4852aa47d9ac29d7a958" Oct 09 08:04:06 crc 
kubenswrapper[4715]: E1009 08:04:06.629127 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afec23b6411a3a1f8e57fba677626d2a491380ce461e4852aa47d9ac29d7a958\": container with ID starting with afec23b6411a3a1f8e57fba677626d2a491380ce461e4852aa47d9ac29d7a958 not found: ID does not exist" containerID="afec23b6411a3a1f8e57fba677626d2a491380ce461e4852aa47d9ac29d7a958" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.629149 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afec23b6411a3a1f8e57fba677626d2a491380ce461e4852aa47d9ac29d7a958"} err="failed to get container status \"afec23b6411a3a1f8e57fba677626d2a491380ce461e4852aa47d9ac29d7a958\": rpc error: code = NotFound desc = could not find container \"afec23b6411a3a1f8e57fba677626d2a491380ce461e4852aa47d9ac29d7a958\": container with ID starting with afec23b6411a3a1f8e57fba677626d2a491380ce461e4852aa47d9ac29d7a958 not found: ID does not exist" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.629164 4715 scope.go:117] "RemoveContainer" containerID="00d85478a6b43d631daeadc593fcad13de7b583f5346a00bccae21ba3746ee0a" Oct 09 08:04:06 crc kubenswrapper[4715]: E1009 08:04:06.629454 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00d85478a6b43d631daeadc593fcad13de7b583f5346a00bccae21ba3746ee0a\": container with ID starting with 00d85478a6b43d631daeadc593fcad13de7b583f5346a00bccae21ba3746ee0a not found: ID does not exist" containerID="00d85478a6b43d631daeadc593fcad13de7b583f5346a00bccae21ba3746ee0a" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.629481 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00d85478a6b43d631daeadc593fcad13de7b583f5346a00bccae21ba3746ee0a"} err="failed to get container status 
\"00d85478a6b43d631daeadc593fcad13de7b583f5346a00bccae21ba3746ee0a\": rpc error: code = NotFound desc = could not find container \"00d85478a6b43d631daeadc593fcad13de7b583f5346a00bccae21ba3746ee0a\": container with ID starting with 00d85478a6b43d631daeadc593fcad13de7b583f5346a00bccae21ba3746ee0a not found: ID does not exist" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.780285 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9c0be8b-0711-488f-9bb1-edee8e92e527-run-httpd\") pod \"ceilometer-0\" (UID: \"f9c0be8b-0711-488f-9bb1-edee8e92e527\") " pod="openstack/ceilometer-0" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.780367 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9c0be8b-0711-488f-9bb1-edee8e92e527-log-httpd\") pod \"ceilometer-0\" (UID: \"f9c0be8b-0711-488f-9bb1-edee8e92e527\") " pod="openstack/ceilometer-0" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.780435 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9c0be8b-0711-488f-9bb1-edee8e92e527-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f9c0be8b-0711-488f-9bb1-edee8e92e527\") " pod="openstack/ceilometer-0" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.780519 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9c0be8b-0711-488f-9bb1-edee8e92e527-scripts\") pod \"ceilometer-0\" (UID: \"f9c0be8b-0711-488f-9bb1-edee8e92e527\") " pod="openstack/ceilometer-0" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.780556 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f9c0be8b-0711-488f-9bb1-edee8e92e527-config-data\") pod \"ceilometer-0\" (UID: \"f9c0be8b-0711-488f-9bb1-edee8e92e527\") " pod="openstack/ceilometer-0" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.780598 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn5bn\" (UniqueName: \"kubernetes.io/projected/f9c0be8b-0711-488f-9bb1-edee8e92e527-kube-api-access-jn5bn\") pod \"ceilometer-0\" (UID: \"f9c0be8b-0711-488f-9bb1-edee8e92e527\") " pod="openstack/ceilometer-0" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.780639 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f9c0be8b-0711-488f-9bb1-edee8e92e527-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f9c0be8b-0711-488f-9bb1-edee8e92e527\") " pod="openstack/ceilometer-0" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.882291 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn5bn\" (UniqueName: \"kubernetes.io/projected/f9c0be8b-0711-488f-9bb1-edee8e92e527-kube-api-access-jn5bn\") pod \"ceilometer-0\" (UID: \"f9c0be8b-0711-488f-9bb1-edee8e92e527\") " pod="openstack/ceilometer-0" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.882352 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f9c0be8b-0711-488f-9bb1-edee8e92e527-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f9c0be8b-0711-488f-9bb1-edee8e92e527\") " pod="openstack/ceilometer-0" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.882406 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9c0be8b-0711-488f-9bb1-edee8e92e527-run-httpd\") pod \"ceilometer-0\" (UID: \"f9c0be8b-0711-488f-9bb1-edee8e92e527\") " 
pod="openstack/ceilometer-0" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.882465 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9c0be8b-0711-488f-9bb1-edee8e92e527-log-httpd\") pod \"ceilometer-0\" (UID: \"f9c0be8b-0711-488f-9bb1-edee8e92e527\") " pod="openstack/ceilometer-0" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.882502 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9c0be8b-0711-488f-9bb1-edee8e92e527-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f9c0be8b-0711-488f-9bb1-edee8e92e527\") " pod="openstack/ceilometer-0" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.882554 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9c0be8b-0711-488f-9bb1-edee8e92e527-scripts\") pod \"ceilometer-0\" (UID: \"f9c0be8b-0711-488f-9bb1-edee8e92e527\") " pod="openstack/ceilometer-0" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.882588 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9c0be8b-0711-488f-9bb1-edee8e92e527-config-data\") pod \"ceilometer-0\" (UID: \"f9c0be8b-0711-488f-9bb1-edee8e92e527\") " pod="openstack/ceilometer-0" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.883394 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9c0be8b-0711-488f-9bb1-edee8e92e527-run-httpd\") pod \"ceilometer-0\" (UID: \"f9c0be8b-0711-488f-9bb1-edee8e92e527\") " pod="openstack/ceilometer-0" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.884389 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/f9c0be8b-0711-488f-9bb1-edee8e92e527-log-httpd\") pod \"ceilometer-0\" (UID: \"f9c0be8b-0711-488f-9bb1-edee8e92e527\") " pod="openstack/ceilometer-0" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.886774 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f9c0be8b-0711-488f-9bb1-edee8e92e527-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f9c0be8b-0711-488f-9bb1-edee8e92e527\") " pod="openstack/ceilometer-0" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.887152 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9c0be8b-0711-488f-9bb1-edee8e92e527-config-data\") pod \"ceilometer-0\" (UID: \"f9c0be8b-0711-488f-9bb1-edee8e92e527\") " pod="openstack/ceilometer-0" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.887685 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9c0be8b-0711-488f-9bb1-edee8e92e527-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f9c0be8b-0711-488f-9bb1-edee8e92e527\") " pod="openstack/ceilometer-0" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.888412 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9c0be8b-0711-488f-9bb1-edee8e92e527-scripts\") pod \"ceilometer-0\" (UID: \"f9c0be8b-0711-488f-9bb1-edee8e92e527\") " pod="openstack/ceilometer-0" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.899293 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn5bn\" (UniqueName: \"kubernetes.io/projected/f9c0be8b-0711-488f-9bb1-edee8e92e527-kube-api-access-jn5bn\") pod \"ceilometer-0\" (UID: \"f9c0be8b-0711-488f-9bb1-edee8e92e527\") " pod="openstack/ceilometer-0" Oct 09 08:04:06 crc kubenswrapper[4715]: I1009 08:04:06.928616 4715 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 08:04:07 crc kubenswrapper[4715]: I1009 08:04:07.354763 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:04:07 crc kubenswrapper[4715]: W1009 08:04:07.358832 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9c0be8b_0711_488f_9bb1_edee8e92e527.slice/crio-94fda7de3823f725a433b7648fc555ae46934b8733034688c13fdfe015b84798 WatchSource:0}: Error finding container 94fda7de3823f725a433b7648fc555ae46934b8733034688c13fdfe015b84798: Status 404 returned error can't find the container with id 94fda7de3823f725a433b7648fc555ae46934b8733034688c13fdfe015b84798 Oct 09 08:04:07 crc kubenswrapper[4715]: I1009 08:04:07.516553 4715 generic.go:334] "Generic (PLEG): container finished" podID="f4e65240-0972-4024-a140-425dda8cfa12" containerID="4ac55fb52f747d08dc4e6bc380b797c8d2df593c94212a82f0b90bdf9650ecc9" exitCode=0 Oct 09 08:04:07 crc kubenswrapper[4715]: I1009 08:04:07.516621 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-cf8st" event={"ID":"f4e65240-0972-4024-a140-425dda8cfa12","Type":"ContainerDied","Data":"4ac55fb52f747d08dc4e6bc380b797c8d2df593c94212a82f0b90bdf9650ecc9"} Oct 09 08:04:07 crc kubenswrapper[4715]: I1009 08:04:07.520549 4715 generic.go:334] "Generic (PLEG): container finished" podID="ada9982a-fc5f-4c93-bfa3-3401c0824c2e" containerID="a38b8e8bdafd561f301d7de3849df03039ec334d2e9911426537e38b58e93a3e" exitCode=0 Oct 09 08:04:07 crc kubenswrapper[4715]: I1009 08:04:07.520611 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d9885b95b-r2cb2" event={"ID":"ada9982a-fc5f-4c93-bfa3-3401c0824c2e","Type":"ContainerDied","Data":"a38b8e8bdafd561f301d7de3849df03039ec334d2e9911426537e38b58e93a3e"} Oct 09 08:04:07 crc kubenswrapper[4715]: I1009 08:04:07.522261 4715 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/cinder-db-sync-jgbfr" event={"ID":"2f4dee6e-f935-4bdd-9138-d414e86c0fa2","Type":"ContainerStarted","Data":"b8ed99a425aa9c369f9b95b06ae7d8299eff1ee29789b000aa1a34966e3d750b"} Oct 09 08:04:07 crc kubenswrapper[4715]: I1009 08:04:07.523788 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9c0be8b-0711-488f-9bb1-edee8e92e527","Type":"ContainerStarted","Data":"94fda7de3823f725a433b7648fc555ae46934b8733034688c13fdfe015b84798"} Oct 09 08:04:07 crc kubenswrapper[4715]: I1009 08:04:07.550241 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-jgbfr" podStartSLOduration=19.042793557 podStartE2EDuration="55.550224409s" podCreationTimestamp="2025-10-09 08:03:12 +0000 UTC" firstStartedPulling="2025-10-09 08:03:30.320872093 +0000 UTC m=+1041.013676101" lastFinishedPulling="2025-10-09 08:04:06.828302945 +0000 UTC m=+1077.521106953" observedRunningTime="2025-10-09 08:04:07.54682579 +0000 UTC m=+1078.239629798" watchObservedRunningTime="2025-10-09 08:04:07.550224409 +0000 UTC m=+1078.243028417" Oct 09 08:04:08 crc kubenswrapper[4715]: I1009 08:04:08.149302 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad" path="/var/lib/kubelet/pods/9cd73b9e-b01b-4148-8a2b-6bc9c89cf0ad/volumes" Oct 09 08:04:08 crc kubenswrapper[4715]: I1009 08:04:08.533581 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9c0be8b-0711-488f-9bb1-edee8e92e527","Type":"ContainerStarted","Data":"06e3a016ccf1feffe4e8bbc3185ff5bb6f65ac453d4e58b04b827106491f1919"} Oct 09 08:04:08 crc kubenswrapper[4715]: I1009 08:04:08.898038 4715 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5d9885b95b-r2cb2" podUID="ada9982a-fc5f-4c93-bfa3-3401c0824c2e" containerName="horizon" probeResult="failure" output="Get 
\"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Oct 09 08:04:08 crc kubenswrapper[4715]: I1009 08:04:08.949834 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-cf8st" Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.118839 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f4e65240-0972-4024-a140-425dda8cfa12-config\") pod \"f4e65240-0972-4024-a140-425dda8cfa12\" (UID: \"f4e65240-0972-4024-a140-425dda8cfa12\") " Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.119043 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9vxh\" (UniqueName: \"kubernetes.io/projected/f4e65240-0972-4024-a140-425dda8cfa12-kube-api-access-c9vxh\") pod \"f4e65240-0972-4024-a140-425dda8cfa12\" (UID: \"f4e65240-0972-4024-a140-425dda8cfa12\") " Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.119110 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4e65240-0972-4024-a140-425dda8cfa12-combined-ca-bundle\") pod \"f4e65240-0972-4024-a140-425dda8cfa12\" (UID: \"f4e65240-0972-4024-a140-425dda8cfa12\") " Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.124052 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4e65240-0972-4024-a140-425dda8cfa12-kube-api-access-c9vxh" (OuterVolumeSpecName: "kube-api-access-c9vxh") pod "f4e65240-0972-4024-a140-425dda8cfa12" (UID: "f4e65240-0972-4024-a140-425dda8cfa12"). InnerVolumeSpecName "kube-api-access-c9vxh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.151253 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4e65240-0972-4024-a140-425dda8cfa12-config" (OuterVolumeSpecName: "config") pod "f4e65240-0972-4024-a140-425dda8cfa12" (UID: "f4e65240-0972-4024-a140-425dda8cfa12"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.152311 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4e65240-0972-4024-a140-425dda8cfa12-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4e65240-0972-4024-a140-425dda8cfa12" (UID: "f4e65240-0972-4024-a140-425dda8cfa12"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.220645 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9vxh\" (UniqueName: \"kubernetes.io/projected/f4e65240-0972-4024-a140-425dda8cfa12-kube-api-access-c9vxh\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.220681 4715 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4e65240-0972-4024-a140-425dda8cfa12-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.220692 4715 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f4e65240-0972-4024-a140-425dda8cfa12-config\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.548830 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"f9c0be8b-0711-488f-9bb1-edee8e92e527","Type":"ContainerStarted","Data":"179f31ef5d83b939a24a6248b3f608ec2de7bb69229edbfd596d0fe2ec54c2e1"} Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.553641 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-cf8st" event={"ID":"f4e65240-0972-4024-a140-425dda8cfa12","Type":"ContainerDied","Data":"2d4eb8dec5ec669db6ac1e0c765363404c10b8bc7d3cd17c96cef1388189802b"} Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.553679 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d4eb8dec5ec669db6ac1e0c765363404c10b8bc7d3cd17c96cef1388189802b" Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.553745 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-cf8st" Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.692789 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-ppstn"] Oct 09 08:04:09 crc kubenswrapper[4715]: E1009 08:04:09.696080 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4e65240-0972-4024-a140-425dda8cfa12" containerName="neutron-db-sync" Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.696126 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4e65240-0972-4024-a140-425dda8cfa12" containerName="neutron-db-sync" Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.696645 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4e65240-0972-4024-a140-425dda8cfa12" containerName="neutron-db-sync" Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.697896 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-ppstn" Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.729305 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-ppstn"] Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.780034 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5d4d4f746b-b9w44"] Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.790980 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5d4d4f746b-b9w44" Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.793438 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-wfq5s" Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.793800 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.795376 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.796226 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.800622 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5d4d4f746b-b9w44"] Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.850378 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5984d04d-832a-4767-8966-96b3edc181e9-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-ppstn\" (UID: \"5984d04d-832a-4767-8966-96b3edc181e9\") " pod="openstack/dnsmasq-dns-84b966f6c9-ppstn" Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.850478 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-qwxb9\" (UniqueName: \"kubernetes.io/projected/5984d04d-832a-4767-8966-96b3edc181e9-kube-api-access-qwxb9\") pod \"dnsmasq-dns-84b966f6c9-ppstn\" (UID: \"5984d04d-832a-4767-8966-96b3edc181e9\") " pod="openstack/dnsmasq-dns-84b966f6c9-ppstn" Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.850523 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5984d04d-832a-4767-8966-96b3edc181e9-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-ppstn\" (UID: \"5984d04d-832a-4767-8966-96b3edc181e9\") " pod="openstack/dnsmasq-dns-84b966f6c9-ppstn" Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.850578 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5984d04d-832a-4767-8966-96b3edc181e9-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-ppstn\" (UID: \"5984d04d-832a-4767-8966-96b3edc181e9\") " pod="openstack/dnsmasq-dns-84b966f6c9-ppstn" Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.850623 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5984d04d-832a-4767-8966-96b3edc181e9-config\") pod \"dnsmasq-dns-84b966f6c9-ppstn\" (UID: \"5984d04d-832a-4767-8966-96b3edc181e9\") " pod="openstack/dnsmasq-dns-84b966f6c9-ppstn" Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.850661 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5984d04d-832a-4767-8966-96b3edc181e9-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-ppstn\" (UID: \"5984d04d-832a-4767-8966-96b3edc181e9\") " pod="openstack/dnsmasq-dns-84b966f6c9-ppstn" Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.951959 4715 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qsbs\" (UniqueName: \"kubernetes.io/projected/ca660681-8c14-4632-91ee-9abeeb3ef48e-kube-api-access-5qsbs\") pod \"neutron-5d4d4f746b-b9w44\" (UID: \"ca660681-8c14-4632-91ee-9abeeb3ef48e\") " pod="openstack/neutron-5d4d4f746b-b9w44" Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.952031 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca660681-8c14-4632-91ee-9abeeb3ef48e-ovndb-tls-certs\") pod \"neutron-5d4d4f746b-b9w44\" (UID: \"ca660681-8c14-4632-91ee-9abeeb3ef48e\") " pod="openstack/neutron-5d4d4f746b-b9w44" Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.952060 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca660681-8c14-4632-91ee-9abeeb3ef48e-combined-ca-bundle\") pod \"neutron-5d4d4f746b-b9w44\" (UID: \"ca660681-8c14-4632-91ee-9abeeb3ef48e\") " pod="openstack/neutron-5d4d4f746b-b9w44" Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.952092 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5984d04d-832a-4767-8966-96b3edc181e9-config\") pod \"dnsmasq-dns-84b966f6c9-ppstn\" (UID: \"5984d04d-832a-4767-8966-96b3edc181e9\") " pod="openstack/dnsmasq-dns-84b966f6c9-ppstn" Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.952275 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5984d04d-832a-4767-8966-96b3edc181e9-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-ppstn\" (UID: \"5984d04d-832a-4767-8966-96b3edc181e9\") " pod="openstack/dnsmasq-dns-84b966f6c9-ppstn" Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.952346 4715 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ca660681-8c14-4632-91ee-9abeeb3ef48e-httpd-config\") pod \"neutron-5d4d4f746b-b9w44\" (UID: \"ca660681-8c14-4632-91ee-9abeeb3ef48e\") " pod="openstack/neutron-5d4d4f746b-b9w44" Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.952434 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ca660681-8c14-4632-91ee-9abeeb3ef48e-config\") pod \"neutron-5d4d4f746b-b9w44\" (UID: \"ca660681-8c14-4632-91ee-9abeeb3ef48e\") " pod="openstack/neutron-5d4d4f746b-b9w44" Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.952515 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5984d04d-832a-4767-8966-96b3edc181e9-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-ppstn\" (UID: \"5984d04d-832a-4767-8966-96b3edc181e9\") " pod="openstack/dnsmasq-dns-84b966f6c9-ppstn" Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.952624 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwxb9\" (UniqueName: \"kubernetes.io/projected/5984d04d-832a-4767-8966-96b3edc181e9-kube-api-access-qwxb9\") pod \"dnsmasq-dns-84b966f6c9-ppstn\" (UID: \"5984d04d-832a-4767-8966-96b3edc181e9\") " pod="openstack/dnsmasq-dns-84b966f6c9-ppstn" Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.952685 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5984d04d-832a-4767-8966-96b3edc181e9-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-ppstn\" (UID: \"5984d04d-832a-4767-8966-96b3edc181e9\") " pod="openstack/dnsmasq-dns-84b966f6c9-ppstn" Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.952811 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5984d04d-832a-4767-8966-96b3edc181e9-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-ppstn\" (UID: \"5984d04d-832a-4767-8966-96b3edc181e9\") " pod="openstack/dnsmasq-dns-84b966f6c9-ppstn" Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.953444 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5984d04d-832a-4767-8966-96b3edc181e9-dns-swift-storage-0\") pod \"dnsmasq-dns-84b966f6c9-ppstn\" (UID: \"5984d04d-832a-4767-8966-96b3edc181e9\") " pod="openstack/dnsmasq-dns-84b966f6c9-ppstn" Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.953447 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5984d04d-832a-4767-8966-96b3edc181e9-config\") pod \"dnsmasq-dns-84b966f6c9-ppstn\" (UID: \"5984d04d-832a-4767-8966-96b3edc181e9\") " pod="openstack/dnsmasq-dns-84b966f6c9-ppstn" Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.953546 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5984d04d-832a-4767-8966-96b3edc181e9-ovsdbserver-sb\") pod \"dnsmasq-dns-84b966f6c9-ppstn\" (UID: \"5984d04d-832a-4767-8966-96b3edc181e9\") " pod="openstack/dnsmasq-dns-84b966f6c9-ppstn" Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.953864 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5984d04d-832a-4767-8966-96b3edc181e9-ovsdbserver-nb\") pod \"dnsmasq-dns-84b966f6c9-ppstn\" (UID: \"5984d04d-832a-4767-8966-96b3edc181e9\") " pod="openstack/dnsmasq-dns-84b966f6c9-ppstn" Oct 09 08:04:09 crc kubenswrapper[4715]: I1009 08:04:09.954081 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/5984d04d-832a-4767-8966-96b3edc181e9-dns-svc\") pod \"dnsmasq-dns-84b966f6c9-ppstn\" (UID: \"5984d04d-832a-4767-8966-96b3edc181e9\") " pod="openstack/dnsmasq-dns-84b966f6c9-ppstn" Oct 09 08:04:10 crc kubenswrapper[4715]: I1009 08:04:10.001836 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwxb9\" (UniqueName: \"kubernetes.io/projected/5984d04d-832a-4767-8966-96b3edc181e9-kube-api-access-qwxb9\") pod \"dnsmasq-dns-84b966f6c9-ppstn\" (UID: \"5984d04d-832a-4767-8966-96b3edc181e9\") " pod="openstack/dnsmasq-dns-84b966f6c9-ppstn" Oct 09 08:04:10 crc kubenswrapper[4715]: I1009 08:04:10.032534 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-ppstn" Oct 09 08:04:10 crc kubenswrapper[4715]: I1009 08:04:10.054254 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qsbs\" (UniqueName: \"kubernetes.io/projected/ca660681-8c14-4632-91ee-9abeeb3ef48e-kube-api-access-5qsbs\") pod \"neutron-5d4d4f746b-b9w44\" (UID: \"ca660681-8c14-4632-91ee-9abeeb3ef48e\") " pod="openstack/neutron-5d4d4f746b-b9w44" Oct 09 08:04:10 crc kubenswrapper[4715]: I1009 08:04:10.054314 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca660681-8c14-4632-91ee-9abeeb3ef48e-ovndb-tls-certs\") pod \"neutron-5d4d4f746b-b9w44\" (UID: \"ca660681-8c14-4632-91ee-9abeeb3ef48e\") " pod="openstack/neutron-5d4d4f746b-b9w44" Oct 09 08:04:10 crc kubenswrapper[4715]: I1009 08:04:10.054340 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca660681-8c14-4632-91ee-9abeeb3ef48e-combined-ca-bundle\") pod \"neutron-5d4d4f746b-b9w44\" (UID: \"ca660681-8c14-4632-91ee-9abeeb3ef48e\") " pod="openstack/neutron-5d4d4f746b-b9w44" Oct 09 08:04:10 crc kubenswrapper[4715]: I1009 
08:04:10.054402 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ca660681-8c14-4632-91ee-9abeeb3ef48e-httpd-config\") pod \"neutron-5d4d4f746b-b9w44\" (UID: \"ca660681-8c14-4632-91ee-9abeeb3ef48e\") " pod="openstack/neutron-5d4d4f746b-b9w44" Oct 09 08:04:10 crc kubenswrapper[4715]: I1009 08:04:10.054461 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ca660681-8c14-4632-91ee-9abeeb3ef48e-config\") pod \"neutron-5d4d4f746b-b9w44\" (UID: \"ca660681-8c14-4632-91ee-9abeeb3ef48e\") " pod="openstack/neutron-5d4d4f746b-b9w44" Oct 09 08:04:10 crc kubenswrapper[4715]: I1009 08:04:10.057743 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 09 08:04:10 crc kubenswrapper[4715]: I1009 08:04:10.056055 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 09 08:04:10 crc kubenswrapper[4715]: I1009 08:04:10.058087 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 09 08:04:10 crc kubenswrapper[4715]: I1009 08:04:10.075256 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca660681-8c14-4632-91ee-9abeeb3ef48e-combined-ca-bundle\") pod \"neutron-5d4d4f746b-b9w44\" (UID: \"ca660681-8c14-4632-91ee-9abeeb3ef48e\") " pod="openstack/neutron-5d4d4f746b-b9w44" Oct 09 08:04:10 crc kubenswrapper[4715]: I1009 08:04:10.084314 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca660681-8c14-4632-91ee-9abeeb3ef48e-ovndb-tls-certs\") pod \"neutron-5d4d4f746b-b9w44\" (UID: \"ca660681-8c14-4632-91ee-9abeeb3ef48e\") " pod="openstack/neutron-5d4d4f746b-b9w44" Oct 09 08:04:10 crc kubenswrapper[4715]: I1009 08:04:10.085714 4715 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ca660681-8c14-4632-91ee-9abeeb3ef48e-config\") pod \"neutron-5d4d4f746b-b9w44\" (UID: \"ca660681-8c14-4632-91ee-9abeeb3ef48e\") " pod="openstack/neutron-5d4d4f746b-b9w44" Oct 09 08:04:10 crc kubenswrapper[4715]: I1009 08:04:10.089403 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qsbs\" (UniqueName: \"kubernetes.io/projected/ca660681-8c14-4632-91ee-9abeeb3ef48e-kube-api-access-5qsbs\") pod \"neutron-5d4d4f746b-b9w44\" (UID: \"ca660681-8c14-4632-91ee-9abeeb3ef48e\") " pod="openstack/neutron-5d4d4f746b-b9w44" Oct 09 08:04:10 crc kubenswrapper[4715]: I1009 08:04:10.096372 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ca660681-8c14-4632-91ee-9abeeb3ef48e-httpd-config\") pod \"neutron-5d4d4f746b-b9w44\" (UID: \"ca660681-8c14-4632-91ee-9abeeb3ef48e\") " pod="openstack/neutron-5d4d4f746b-b9w44" Oct 09 08:04:10 crc kubenswrapper[4715]: I1009 08:04:10.120138 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-wfq5s" Oct 09 08:04:10 crc kubenswrapper[4715]: I1009 08:04:10.133584 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5d4d4f746b-b9w44" Oct 09 08:04:10 crc kubenswrapper[4715]: I1009 08:04:10.553328 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-ppstn"] Oct 09 08:04:10 crc kubenswrapper[4715]: I1009 08:04:10.572118 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9c0be8b-0711-488f-9bb1-edee8e92e527","Type":"ContainerStarted","Data":"a412f5011889d1602ee9c6e8425ae5cfc6ec79df9d2ceeb5e45194df1b5ca559"} Oct 09 08:04:11 crc kubenswrapper[4715]: I1009 08:04:11.110810 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5d4d4f746b-b9w44"] Oct 09 08:04:11 crc kubenswrapper[4715]: I1009 08:04:11.595934 4715 generic.go:334] "Generic (PLEG): container finished" podID="5984d04d-832a-4767-8966-96b3edc181e9" containerID="203bb0639160619326bf2a1463ea16d0e262f215c0bd217c742e41fc1341a8b9" exitCode=0 Oct 09 08:04:11 crc kubenswrapper[4715]: I1009 08:04:11.596313 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-ppstn" event={"ID":"5984d04d-832a-4767-8966-96b3edc181e9","Type":"ContainerDied","Data":"203bb0639160619326bf2a1463ea16d0e262f215c0bd217c742e41fc1341a8b9"} Oct 09 08:04:11 crc kubenswrapper[4715]: I1009 08:04:11.596361 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-ppstn" event={"ID":"5984d04d-832a-4767-8966-96b3edc181e9","Type":"ContainerStarted","Data":"4431672a85c02f9340afdfb5e61ecea8f68bb7f028fb30d17171ad3b60cadd89"} Oct 09 08:04:11 crc kubenswrapper[4715]: I1009 08:04:11.604309 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d4d4f746b-b9w44" event={"ID":"ca660681-8c14-4632-91ee-9abeeb3ef48e","Type":"ContainerStarted","Data":"4b65a435d5ac444283309c5da8927cdab72bf5f361f77d8bad5ba625090e6807"} Oct 09 08:04:11 crc kubenswrapper[4715]: I1009 08:04:11.604385 4715 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/neutron-5d4d4f746b-b9w44" event={"ID":"ca660681-8c14-4632-91ee-9abeeb3ef48e","Type":"ContainerStarted","Data":"9644c01f63899ca6a6da09769f0fd0993a46a4f4c7e0cefbf3b111e16303bcaa"} Oct 09 08:04:11 crc kubenswrapper[4715]: I1009 08:04:11.633530 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9c0be8b-0711-488f-9bb1-edee8e92e527","Type":"ContainerStarted","Data":"e0875b29b7c74d81e763ff018a1895e8fd2652900b8e9a903413e7c40d8540b5"} Oct 09 08:04:11 crc kubenswrapper[4715]: I1009 08:04:11.633745 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 09 08:04:11 crc kubenswrapper[4715]: I1009 08:04:11.656152 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.689306074 podStartE2EDuration="5.656125834s" podCreationTimestamp="2025-10-09 08:04:06 +0000 UTC" firstStartedPulling="2025-10-09 08:04:07.361654389 +0000 UTC m=+1078.054458397" lastFinishedPulling="2025-10-09 08:04:11.328474149 +0000 UTC m=+1082.021278157" observedRunningTime="2025-10-09 08:04:11.654583239 +0000 UTC m=+1082.347387268" watchObservedRunningTime="2025-10-09 08:04:11.656125834 +0000 UTC m=+1082.348929842" Oct 09 08:04:12 crc kubenswrapper[4715]: I1009 08:04:12.288156 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-85df7c4d7c-7ktz2"] Oct 09 08:04:12 crc kubenswrapper[4715]: I1009 08:04:12.289761 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-85df7c4d7c-7ktz2" Oct 09 08:04:12 crc kubenswrapper[4715]: I1009 08:04:12.292844 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 09 08:04:12 crc kubenswrapper[4715]: I1009 08:04:12.293007 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 09 08:04:12 crc kubenswrapper[4715]: I1009 08:04:12.299393 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-85df7c4d7c-7ktz2"] Oct 09 08:04:12 crc kubenswrapper[4715]: I1009 08:04:12.332116 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/03dac8b3-a92c-49b7-94cd-f7ab774b7e65-httpd-config\") pod \"neutron-85df7c4d7c-7ktz2\" (UID: \"03dac8b3-a92c-49b7-94cd-f7ab774b7e65\") " pod="openstack/neutron-85df7c4d7c-7ktz2" Oct 09 08:04:12 crc kubenswrapper[4715]: I1009 08:04:12.332184 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crcrx\" (UniqueName: \"kubernetes.io/projected/03dac8b3-a92c-49b7-94cd-f7ab774b7e65-kube-api-access-crcrx\") pod \"neutron-85df7c4d7c-7ktz2\" (UID: \"03dac8b3-a92c-49b7-94cd-f7ab774b7e65\") " pod="openstack/neutron-85df7c4d7c-7ktz2" Oct 09 08:04:12 crc kubenswrapper[4715]: I1009 08:04:12.332329 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03dac8b3-a92c-49b7-94cd-f7ab774b7e65-combined-ca-bundle\") pod \"neutron-85df7c4d7c-7ktz2\" (UID: \"03dac8b3-a92c-49b7-94cd-f7ab774b7e65\") " pod="openstack/neutron-85df7c4d7c-7ktz2" Oct 09 08:04:12 crc kubenswrapper[4715]: I1009 08:04:12.332386 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/03dac8b3-a92c-49b7-94cd-f7ab774b7e65-internal-tls-certs\") pod \"neutron-85df7c4d7c-7ktz2\" (UID: \"03dac8b3-a92c-49b7-94cd-f7ab774b7e65\") " pod="openstack/neutron-85df7c4d7c-7ktz2" Oct 09 08:04:12 crc kubenswrapper[4715]: I1009 08:04:12.332841 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/03dac8b3-a92c-49b7-94cd-f7ab774b7e65-ovndb-tls-certs\") pod \"neutron-85df7c4d7c-7ktz2\" (UID: \"03dac8b3-a92c-49b7-94cd-f7ab774b7e65\") " pod="openstack/neutron-85df7c4d7c-7ktz2" Oct 09 08:04:12 crc kubenswrapper[4715]: I1009 08:04:12.332976 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/03dac8b3-a92c-49b7-94cd-f7ab774b7e65-config\") pod \"neutron-85df7c4d7c-7ktz2\" (UID: \"03dac8b3-a92c-49b7-94cd-f7ab774b7e65\") " pod="openstack/neutron-85df7c4d7c-7ktz2" Oct 09 08:04:12 crc kubenswrapper[4715]: I1009 08:04:12.333010 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03dac8b3-a92c-49b7-94cd-f7ab774b7e65-public-tls-certs\") pod \"neutron-85df7c4d7c-7ktz2\" (UID: \"03dac8b3-a92c-49b7-94cd-f7ab774b7e65\") " pod="openstack/neutron-85df7c4d7c-7ktz2" Oct 09 08:04:12 crc kubenswrapper[4715]: I1009 08:04:12.435896 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/03dac8b3-a92c-49b7-94cd-f7ab774b7e65-ovndb-tls-certs\") pod \"neutron-85df7c4d7c-7ktz2\" (UID: \"03dac8b3-a92c-49b7-94cd-f7ab774b7e65\") " pod="openstack/neutron-85df7c4d7c-7ktz2" Oct 09 08:04:12 crc kubenswrapper[4715]: I1009 08:04:12.437795 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/03dac8b3-a92c-49b7-94cd-f7ab774b7e65-config\") pod \"neutron-85df7c4d7c-7ktz2\" (UID: \"03dac8b3-a92c-49b7-94cd-f7ab774b7e65\") " pod="openstack/neutron-85df7c4d7c-7ktz2" Oct 09 08:04:12 crc kubenswrapper[4715]: I1009 08:04:12.437898 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03dac8b3-a92c-49b7-94cd-f7ab774b7e65-public-tls-certs\") pod \"neutron-85df7c4d7c-7ktz2\" (UID: \"03dac8b3-a92c-49b7-94cd-f7ab774b7e65\") " pod="openstack/neutron-85df7c4d7c-7ktz2" Oct 09 08:04:12 crc kubenswrapper[4715]: I1009 08:04:12.437971 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/03dac8b3-a92c-49b7-94cd-f7ab774b7e65-httpd-config\") pod \"neutron-85df7c4d7c-7ktz2\" (UID: \"03dac8b3-a92c-49b7-94cd-f7ab774b7e65\") " pod="openstack/neutron-85df7c4d7c-7ktz2" Oct 09 08:04:12 crc kubenswrapper[4715]: I1009 08:04:12.438007 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crcrx\" (UniqueName: \"kubernetes.io/projected/03dac8b3-a92c-49b7-94cd-f7ab774b7e65-kube-api-access-crcrx\") pod \"neutron-85df7c4d7c-7ktz2\" (UID: \"03dac8b3-a92c-49b7-94cd-f7ab774b7e65\") " pod="openstack/neutron-85df7c4d7c-7ktz2" Oct 09 08:04:12 crc kubenswrapper[4715]: I1009 08:04:12.438131 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03dac8b3-a92c-49b7-94cd-f7ab774b7e65-combined-ca-bundle\") pod \"neutron-85df7c4d7c-7ktz2\" (UID: \"03dac8b3-a92c-49b7-94cd-f7ab774b7e65\") " pod="openstack/neutron-85df7c4d7c-7ktz2" Oct 09 08:04:12 crc kubenswrapper[4715]: I1009 08:04:12.438187 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03dac8b3-a92c-49b7-94cd-f7ab774b7e65-internal-tls-certs\") pod 
\"neutron-85df7c4d7c-7ktz2\" (UID: \"03dac8b3-a92c-49b7-94cd-f7ab774b7e65\") " pod="openstack/neutron-85df7c4d7c-7ktz2" Oct 09 08:04:12 crc kubenswrapper[4715]: I1009 08:04:12.441249 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/03dac8b3-a92c-49b7-94cd-f7ab774b7e65-ovndb-tls-certs\") pod \"neutron-85df7c4d7c-7ktz2\" (UID: \"03dac8b3-a92c-49b7-94cd-f7ab774b7e65\") " pod="openstack/neutron-85df7c4d7c-7ktz2" Oct 09 08:04:12 crc kubenswrapper[4715]: I1009 08:04:12.444498 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/03dac8b3-a92c-49b7-94cd-f7ab774b7e65-config\") pod \"neutron-85df7c4d7c-7ktz2\" (UID: \"03dac8b3-a92c-49b7-94cd-f7ab774b7e65\") " pod="openstack/neutron-85df7c4d7c-7ktz2" Oct 09 08:04:12 crc kubenswrapper[4715]: I1009 08:04:12.445486 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/03dac8b3-a92c-49b7-94cd-f7ab774b7e65-httpd-config\") pod \"neutron-85df7c4d7c-7ktz2\" (UID: \"03dac8b3-a92c-49b7-94cd-f7ab774b7e65\") " pod="openstack/neutron-85df7c4d7c-7ktz2" Oct 09 08:04:12 crc kubenswrapper[4715]: I1009 08:04:12.446123 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03dac8b3-a92c-49b7-94cd-f7ab774b7e65-public-tls-certs\") pod \"neutron-85df7c4d7c-7ktz2\" (UID: \"03dac8b3-a92c-49b7-94cd-f7ab774b7e65\") " pod="openstack/neutron-85df7c4d7c-7ktz2" Oct 09 08:04:12 crc kubenswrapper[4715]: I1009 08:04:12.448277 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03dac8b3-a92c-49b7-94cd-f7ab774b7e65-internal-tls-certs\") pod \"neutron-85df7c4d7c-7ktz2\" (UID: \"03dac8b3-a92c-49b7-94cd-f7ab774b7e65\") " pod="openstack/neutron-85df7c4d7c-7ktz2" Oct 09 08:04:12 crc 
kubenswrapper[4715]: I1009 08:04:12.451112 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03dac8b3-a92c-49b7-94cd-f7ab774b7e65-combined-ca-bundle\") pod \"neutron-85df7c4d7c-7ktz2\" (UID: \"03dac8b3-a92c-49b7-94cd-f7ab774b7e65\") " pod="openstack/neutron-85df7c4d7c-7ktz2" Oct 09 08:04:12 crc kubenswrapper[4715]: I1009 08:04:12.460947 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crcrx\" (UniqueName: \"kubernetes.io/projected/03dac8b3-a92c-49b7-94cd-f7ab774b7e65-kube-api-access-crcrx\") pod \"neutron-85df7c4d7c-7ktz2\" (UID: \"03dac8b3-a92c-49b7-94cd-f7ab774b7e65\") " pod="openstack/neutron-85df7c4d7c-7ktz2" Oct 09 08:04:12 crc kubenswrapper[4715]: I1009 08:04:12.629714 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-85df7c4d7c-7ktz2" Oct 09 08:04:12 crc kubenswrapper[4715]: I1009 08:04:12.645333 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-ppstn" event={"ID":"5984d04d-832a-4767-8966-96b3edc181e9","Type":"ContainerStarted","Data":"e82ec37ec5d083dd33eedb0e1531f34e5552e22ebd11241861ea95aa708e6d41"} Oct 09 08:04:12 crc kubenswrapper[4715]: I1009 08:04:12.645698 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84b966f6c9-ppstn" Oct 09 08:04:12 crc kubenswrapper[4715]: I1009 08:04:12.647110 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d4d4f746b-b9w44" event={"ID":"ca660681-8c14-4632-91ee-9abeeb3ef48e","Type":"ContainerStarted","Data":"23ae14f1919334d23f4719037ffbfbd5461a8a98b74faff42ad4870bfc84a838"} Oct 09 08:04:12 crc kubenswrapper[4715]: I1009 08:04:12.647716 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5d4d4f746b-b9w44" Oct 09 08:04:12 crc kubenswrapper[4715]: I1009 08:04:12.657141 4715 generic.go:334] "Generic 
(PLEG): container finished" podID="2f4dee6e-f935-4bdd-9138-d414e86c0fa2" containerID="b8ed99a425aa9c369f9b95b06ae7d8299eff1ee29789b000aa1a34966e3d750b" exitCode=0 Oct 09 08:04:12 crc kubenswrapper[4715]: I1009 08:04:12.657247 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jgbfr" event={"ID":"2f4dee6e-f935-4bdd-9138-d414e86c0fa2","Type":"ContainerDied","Data":"b8ed99a425aa9c369f9b95b06ae7d8299eff1ee29789b000aa1a34966e3d750b"} Oct 09 08:04:12 crc kubenswrapper[4715]: I1009 08:04:12.660484 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-l7r5w" event={"ID":"027df64c-f87d-401f-965c-88c874a854f8","Type":"ContainerStarted","Data":"371a8fa3d04735ed76bce22928354da20b59bb0f2ca96fad013668dab63567fd"} Oct 09 08:04:12 crc kubenswrapper[4715]: I1009 08:04:12.679381 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84b966f6c9-ppstn" podStartSLOduration=3.679361607 podStartE2EDuration="3.679361607s" podCreationTimestamp="2025-10-09 08:04:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:04:12.675831524 +0000 UTC m=+1083.368635532" watchObservedRunningTime="2025-10-09 08:04:12.679361607 +0000 UTC m=+1083.372165635" Oct 09 08:04:12 crc kubenswrapper[4715]: I1009 08:04:12.701463 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5d4d4f746b-b9w44" podStartSLOduration=3.70143697 podStartE2EDuration="3.70143697s" podCreationTimestamp="2025-10-09 08:04:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:04:12.693432097 +0000 UTC m=+1083.386236115" watchObservedRunningTime="2025-10-09 08:04:12.70143697 +0000 UTC m=+1083.394240978" Oct 09 08:04:12 crc kubenswrapper[4715]: I1009 08:04:12.737538 4715 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-l7r5w" podStartSLOduration=2.457728753 podStartE2EDuration="40.737514443s" podCreationTimestamp="2025-10-09 08:03:32 +0000 UTC" firstStartedPulling="2025-10-09 08:03:33.476255218 +0000 UTC m=+1044.169059226" lastFinishedPulling="2025-10-09 08:04:11.756040908 +0000 UTC m=+1082.448844916" observedRunningTime="2025-10-09 08:04:12.726092879 +0000 UTC m=+1083.418896897" watchObservedRunningTime="2025-10-09 08:04:12.737514443 +0000 UTC m=+1083.430318471" Oct 09 08:04:13 crc kubenswrapper[4715]: I1009 08:04:13.258594 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-85df7c4d7c-7ktz2"] Oct 09 08:04:13 crc kubenswrapper[4715]: W1009 08:04:13.266367 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03dac8b3_a92c_49b7_94cd_f7ab774b7e65.slice/crio-6bcef67021c35bb6e50ee44cb356f75b15467af1ce572851a535830d9066296d WatchSource:0}: Error finding container 6bcef67021c35bb6e50ee44cb356f75b15467af1ce572851a535830d9066296d: Status 404 returned error can't find the container with id 6bcef67021c35bb6e50ee44cb356f75b15467af1ce572851a535830d9066296d Oct 09 08:04:13 crc kubenswrapper[4715]: I1009 08:04:13.671015 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-85df7c4d7c-7ktz2" event={"ID":"03dac8b3-a92c-49b7-94cd-f7ab774b7e65","Type":"ContainerStarted","Data":"5d8a797ce2ae6333ffec1f1afd85f91925b6a7d22fd20236813fab015230e3f1"} Oct 09 08:04:13 crc kubenswrapper[4715]: I1009 08:04:13.671311 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-85df7c4d7c-7ktz2" event={"ID":"03dac8b3-a92c-49b7-94cd-f7ab774b7e65","Type":"ContainerStarted","Data":"6bcef67021c35bb6e50ee44cb356f75b15467af1ce572851a535830d9066296d"} Oct 09 08:04:13 crc kubenswrapper[4715]: I1009 08:04:13.934539 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-jgbfr" Oct 09 08:04:13 crc kubenswrapper[4715]: I1009 08:04:13.963412 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f4dee6e-f935-4bdd-9138-d414e86c0fa2-scripts\") pod \"2f4dee6e-f935-4bdd-9138-d414e86c0fa2\" (UID: \"2f4dee6e-f935-4bdd-9138-d414e86c0fa2\") " Oct 09 08:04:13 crc kubenswrapper[4715]: I1009 08:04:13.963519 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kq68s\" (UniqueName: \"kubernetes.io/projected/2f4dee6e-f935-4bdd-9138-d414e86c0fa2-kube-api-access-kq68s\") pod \"2f4dee6e-f935-4bdd-9138-d414e86c0fa2\" (UID: \"2f4dee6e-f935-4bdd-9138-d414e86c0fa2\") " Oct 09 08:04:13 crc kubenswrapper[4715]: I1009 08:04:13.963583 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f4dee6e-f935-4bdd-9138-d414e86c0fa2-combined-ca-bundle\") pod \"2f4dee6e-f935-4bdd-9138-d414e86c0fa2\" (UID: \"2f4dee6e-f935-4bdd-9138-d414e86c0fa2\") " Oct 09 08:04:13 crc kubenswrapper[4715]: I1009 08:04:13.963610 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2f4dee6e-f935-4bdd-9138-d414e86c0fa2-etc-machine-id\") pod \"2f4dee6e-f935-4bdd-9138-d414e86c0fa2\" (UID: \"2f4dee6e-f935-4bdd-9138-d414e86c0fa2\") " Oct 09 08:04:13 crc kubenswrapper[4715]: I1009 08:04:13.963781 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2f4dee6e-f935-4bdd-9138-d414e86c0fa2-db-sync-config-data\") pod \"2f4dee6e-f935-4bdd-9138-d414e86c0fa2\" (UID: \"2f4dee6e-f935-4bdd-9138-d414e86c0fa2\") " Oct 09 08:04:13 crc kubenswrapper[4715]: I1009 08:04:13.963871 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/2f4dee6e-f935-4bdd-9138-d414e86c0fa2-config-data\") pod \"2f4dee6e-f935-4bdd-9138-d414e86c0fa2\" (UID: \"2f4dee6e-f935-4bdd-9138-d414e86c0fa2\") " Oct 09 08:04:13 crc kubenswrapper[4715]: I1009 08:04:13.965624 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2f4dee6e-f935-4bdd-9138-d414e86c0fa2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2f4dee6e-f935-4bdd-9138-d414e86c0fa2" (UID: "2f4dee6e-f935-4bdd-9138-d414e86c0fa2"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 08:04:13 crc kubenswrapper[4715]: I1009 08:04:13.974231 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f4dee6e-f935-4bdd-9138-d414e86c0fa2-kube-api-access-kq68s" (OuterVolumeSpecName: "kube-api-access-kq68s") pod "2f4dee6e-f935-4bdd-9138-d414e86c0fa2" (UID: "2f4dee6e-f935-4bdd-9138-d414e86c0fa2"). InnerVolumeSpecName "kube-api-access-kq68s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:04:13 crc kubenswrapper[4715]: I1009 08:04:13.974645 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f4dee6e-f935-4bdd-9138-d414e86c0fa2-scripts" (OuterVolumeSpecName: "scripts") pod "2f4dee6e-f935-4bdd-9138-d414e86c0fa2" (UID: "2f4dee6e-f935-4bdd-9138-d414e86c0fa2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:13 crc kubenswrapper[4715]: I1009 08:04:13.975967 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f4dee6e-f935-4bdd-9138-d414e86c0fa2-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2f4dee6e-f935-4bdd-9138-d414e86c0fa2" (UID: "2f4dee6e-f935-4bdd-9138-d414e86c0fa2"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:14 crc kubenswrapper[4715]: I1009 08:04:14.005615 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f4dee6e-f935-4bdd-9138-d414e86c0fa2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f4dee6e-f935-4bdd-9138-d414e86c0fa2" (UID: "2f4dee6e-f935-4bdd-9138-d414e86c0fa2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:14 crc kubenswrapper[4715]: I1009 08:04:14.037579 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f4dee6e-f935-4bdd-9138-d414e86c0fa2-config-data" (OuterVolumeSpecName: "config-data") pod "2f4dee6e-f935-4bdd-9138-d414e86c0fa2" (UID: "2f4dee6e-f935-4bdd-9138-d414e86c0fa2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:14 crc kubenswrapper[4715]: I1009 08:04:14.065935 4715 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f4dee6e-f935-4bdd-9138-d414e86c0fa2-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:14 crc kubenswrapper[4715]: I1009 08:04:14.065973 4715 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f4dee6e-f935-4bdd-9138-d414e86c0fa2-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:14 crc kubenswrapper[4715]: I1009 08:04:14.065983 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kq68s\" (UniqueName: \"kubernetes.io/projected/2f4dee6e-f935-4bdd-9138-d414e86c0fa2-kube-api-access-kq68s\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:14 crc kubenswrapper[4715]: I1009 08:04:14.065993 4715 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f4dee6e-f935-4bdd-9138-d414e86c0fa2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 
08:04:14 crc kubenswrapper[4715]: I1009 08:04:14.066002 4715 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2f4dee6e-f935-4bdd-9138-d414e86c0fa2-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:14 crc kubenswrapper[4715]: I1009 08:04:14.066010 4715 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2f4dee6e-f935-4bdd-9138-d414e86c0fa2-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:14 crc kubenswrapper[4715]: I1009 08:04:14.317972 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-76cc5dfd5-dcg58" Oct 09 08:04:14 crc kubenswrapper[4715]: I1009 08:04:14.681325 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jgbfr" event={"ID":"2f4dee6e-f935-4bdd-9138-d414e86c0fa2","Type":"ContainerDied","Data":"16abcda551342f0e235ceaecbc302dfc8fc86dde87a0a6396a526a99c2ccb764"} Oct 09 08:04:14 crc kubenswrapper[4715]: I1009 08:04:14.681944 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16abcda551342f0e235ceaecbc302dfc8fc86dde87a0a6396a526a99c2ccb764" Oct 09 08:04:14 crc kubenswrapper[4715]: I1009 08:04:14.681387 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-jgbfr" Oct 09 08:04:14 crc kubenswrapper[4715]: I1009 08:04:14.683587 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-85df7c4d7c-7ktz2" event={"ID":"03dac8b3-a92c-49b7-94cd-f7ab774b7e65","Type":"ContainerStarted","Data":"f85ee68f7adc228663ce36cf64743c090ad939fd33ba00733a11fe35f3e59cf8"} Oct 09 08:04:14 crc kubenswrapper[4715]: I1009 08:04:14.684042 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-85df7c4d7c-7ktz2" Oct 09 08:04:14 crc kubenswrapper[4715]: I1009 08:04:14.755128 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-85df7c4d7c-7ktz2" podStartSLOduration=2.755108784 podStartE2EDuration="2.755108784s" podCreationTimestamp="2025-10-09 08:04:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:04:14.707222197 +0000 UTC m=+1085.400026225" watchObservedRunningTime="2025-10-09 08:04:14.755108784 +0000 UTC m=+1085.447912802" Oct 09 08:04:14 crc kubenswrapper[4715]: I1009 08:04:14.990178 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 08:04:14 crc kubenswrapper[4715]: E1009 08:04:14.991071 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f4dee6e-f935-4bdd-9138-d414e86c0fa2" containerName="cinder-db-sync" Oct 09 08:04:14 crc kubenswrapper[4715]: I1009 08:04:14.991091 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f4dee6e-f935-4bdd-9138-d414e86c0fa2" containerName="cinder-db-sync" Oct 09 08:04:14 crc kubenswrapper[4715]: I1009 08:04:14.991349 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f4dee6e-f935-4bdd-9138-d414e86c0fa2" containerName="cinder-db-sync" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:14.998803 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.004024 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-25xmn" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.004232 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.004404 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.004488 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.077019 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-ppstn"] Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.077699 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84b966f6c9-ppstn" podUID="5984d04d-832a-4767-8966-96b3edc181e9" containerName="dnsmasq-dns" containerID="cri-o://e82ec37ec5d083dd33eedb0e1531f34e5552e22ebd11241861ea95aa708e6d41" gracePeriod=10 Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.096722 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.129716 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d68b9cb4c-7pd5s"] Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.141335 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d68b9cb4c-7pd5s" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.152540 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d68b9cb4c-7pd5s"] Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.188008 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/08ca7802-fc81-4258-bfef-c598c9d65b2f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"08ca7802-fc81-4258-bfef-c598c9d65b2f\") " pod="openstack/cinder-scheduler-0" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.189398 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08ca7802-fc81-4258-bfef-c598c9d65b2f-scripts\") pod \"cinder-scheduler-0\" (UID: \"08ca7802-fc81-4258-bfef-c598c9d65b2f\") " pod="openstack/cinder-scheduler-0" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.190473 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08ca7802-fc81-4258-bfef-c598c9d65b2f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"08ca7802-fc81-4258-bfef-c598c9d65b2f\") " pod="openstack/cinder-scheduler-0" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.190496 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08ca7802-fc81-4258-bfef-c598c9d65b2f-config-data\") pod \"cinder-scheduler-0\" (UID: \"08ca7802-fc81-4258-bfef-c598c9d65b2f\") " pod="openstack/cinder-scheduler-0" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.190559 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxlfp\" (UniqueName: 
\"kubernetes.io/projected/08ca7802-fc81-4258-bfef-c598c9d65b2f-kube-api-access-dxlfp\") pod \"cinder-scheduler-0\" (UID: \"08ca7802-fc81-4258-bfef-c598c9d65b2f\") " pod="openstack/cinder-scheduler-0" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.190708 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08ca7802-fc81-4258-bfef-c598c9d65b2f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"08ca7802-fc81-4258-bfef-c598c9d65b2f\") " pod="openstack/cinder-scheduler-0" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.262167 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.264188 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.266244 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.273861 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.292161 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33dae8b1-9611-43f9-8dc3-5bccd2694cca-dns-swift-storage-0\") pod \"dnsmasq-dns-d68b9cb4c-7pd5s\" (UID: \"33dae8b1-9611-43f9-8dc3-5bccd2694cca\") " pod="openstack/dnsmasq-dns-d68b9cb4c-7pd5s" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.292294 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08ca7802-fc81-4258-bfef-c598c9d65b2f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"08ca7802-fc81-4258-bfef-c598c9d65b2f\") " 
pod="openstack/cinder-scheduler-0" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.292321 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08ca7802-fc81-4258-bfef-c598c9d65b2f-config-data\") pod \"cinder-scheduler-0\" (UID: \"08ca7802-fc81-4258-bfef-c598c9d65b2f\") " pod="openstack/cinder-scheduler-0" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.292353 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33dae8b1-9611-43f9-8dc3-5bccd2694cca-config\") pod \"dnsmasq-dns-d68b9cb4c-7pd5s\" (UID: \"33dae8b1-9611-43f9-8dc3-5bccd2694cca\") " pod="openstack/dnsmasq-dns-d68b9cb4c-7pd5s" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.292394 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxlfp\" (UniqueName: \"kubernetes.io/projected/08ca7802-fc81-4258-bfef-c598c9d65b2f-kube-api-access-dxlfp\") pod \"cinder-scheduler-0\" (UID: \"08ca7802-fc81-4258-bfef-c598c9d65b2f\") " pod="openstack/cinder-scheduler-0" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.292514 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4wwd\" (UniqueName: \"kubernetes.io/projected/33dae8b1-9611-43f9-8dc3-5bccd2694cca-kube-api-access-z4wwd\") pod \"dnsmasq-dns-d68b9cb4c-7pd5s\" (UID: \"33dae8b1-9611-43f9-8dc3-5bccd2694cca\") " pod="openstack/dnsmasq-dns-d68b9cb4c-7pd5s" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.292572 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33dae8b1-9611-43f9-8dc3-5bccd2694cca-ovsdbserver-nb\") pod \"dnsmasq-dns-d68b9cb4c-7pd5s\" (UID: \"33dae8b1-9611-43f9-8dc3-5bccd2694cca\") " pod="openstack/dnsmasq-dns-d68b9cb4c-7pd5s" Oct 
09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.292626 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33dae8b1-9611-43f9-8dc3-5bccd2694cca-ovsdbserver-sb\") pod \"dnsmasq-dns-d68b9cb4c-7pd5s\" (UID: \"33dae8b1-9611-43f9-8dc3-5bccd2694cca\") " pod="openstack/dnsmasq-dns-d68b9cb4c-7pd5s" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.292655 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08ca7802-fc81-4258-bfef-c598c9d65b2f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"08ca7802-fc81-4258-bfef-c598c9d65b2f\") " pod="openstack/cinder-scheduler-0" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.292986 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/08ca7802-fc81-4258-bfef-c598c9d65b2f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"08ca7802-fc81-4258-bfef-c598c9d65b2f\") " pod="openstack/cinder-scheduler-0" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.293081 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33dae8b1-9611-43f9-8dc3-5bccd2694cca-dns-svc\") pod \"dnsmasq-dns-d68b9cb4c-7pd5s\" (UID: \"33dae8b1-9611-43f9-8dc3-5bccd2694cca\") " pod="openstack/dnsmasq-dns-d68b9cb4c-7pd5s" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.293151 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08ca7802-fc81-4258-bfef-c598c9d65b2f-scripts\") pod \"cinder-scheduler-0\" (UID: \"08ca7802-fc81-4258-bfef-c598c9d65b2f\") " pod="openstack/cinder-scheduler-0" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.296841 4715 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/08ca7802-fc81-4258-bfef-c598c9d65b2f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"08ca7802-fc81-4258-bfef-c598c9d65b2f\") " pod="openstack/cinder-scheduler-0" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.301106 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08ca7802-fc81-4258-bfef-c598c9d65b2f-scripts\") pod \"cinder-scheduler-0\" (UID: \"08ca7802-fc81-4258-bfef-c598c9d65b2f\") " pod="openstack/cinder-scheduler-0" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.302056 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08ca7802-fc81-4258-bfef-c598c9d65b2f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"08ca7802-fc81-4258-bfef-c598c9d65b2f\") " pod="openstack/cinder-scheduler-0" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.322606 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08ca7802-fc81-4258-bfef-c598c9d65b2f-config-data\") pod \"cinder-scheduler-0\" (UID: \"08ca7802-fc81-4258-bfef-c598c9d65b2f\") " pod="openstack/cinder-scheduler-0" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.323255 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08ca7802-fc81-4258-bfef-c598c9d65b2f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"08ca7802-fc81-4258-bfef-c598c9d65b2f\") " pod="openstack/cinder-scheduler-0" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.325639 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxlfp\" (UniqueName: \"kubernetes.io/projected/08ca7802-fc81-4258-bfef-c598c9d65b2f-kube-api-access-dxlfp\") pod \"cinder-scheduler-0\" (UID: 
\"08ca7802-fc81-4258-bfef-c598c9d65b2f\") " pod="openstack/cinder-scheduler-0" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.331581 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.395506 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1\") " pod="openstack/cinder-api-0" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.395565 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4wwd\" (UniqueName: \"kubernetes.io/projected/33dae8b1-9611-43f9-8dc3-5bccd2694cca-kube-api-access-z4wwd\") pod \"dnsmasq-dns-d68b9cb4c-7pd5s\" (UID: \"33dae8b1-9611-43f9-8dc3-5bccd2694cca\") " pod="openstack/dnsmasq-dns-d68b9cb4c-7pd5s" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.395596 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4222\" (UniqueName: \"kubernetes.io/projected/bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1-kube-api-access-t4222\") pod \"cinder-api-0\" (UID: \"bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1\") " pod="openstack/cinder-api-0" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.395624 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1-config-data\") pod \"cinder-api-0\" (UID: \"bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1\") " pod="openstack/cinder-api-0" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.395663 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/33dae8b1-9611-43f9-8dc3-5bccd2694cca-ovsdbserver-nb\") pod \"dnsmasq-dns-d68b9cb4c-7pd5s\" (UID: \"33dae8b1-9611-43f9-8dc3-5bccd2694cca\") " pod="openstack/dnsmasq-dns-d68b9cb4c-7pd5s" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.395694 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33dae8b1-9611-43f9-8dc3-5bccd2694cca-ovsdbserver-sb\") pod \"dnsmasq-dns-d68b9cb4c-7pd5s\" (UID: \"33dae8b1-9611-43f9-8dc3-5bccd2694cca\") " pod="openstack/dnsmasq-dns-d68b9cb4c-7pd5s" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.395740 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1\") " pod="openstack/cinder-api-0" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.395769 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33dae8b1-9611-43f9-8dc3-5bccd2694cca-dns-svc\") pod \"dnsmasq-dns-d68b9cb4c-7pd5s\" (UID: \"33dae8b1-9611-43f9-8dc3-5bccd2694cca\") " pod="openstack/dnsmasq-dns-d68b9cb4c-7pd5s" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.395809 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33dae8b1-9611-43f9-8dc3-5bccd2694cca-dns-swift-storage-0\") pod \"dnsmasq-dns-d68b9cb4c-7pd5s\" (UID: \"33dae8b1-9611-43f9-8dc3-5bccd2694cca\") " pod="openstack/dnsmasq-dns-d68b9cb4c-7pd5s" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.395850 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1-scripts\") pod \"cinder-api-0\" (UID: \"bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1\") " pod="openstack/cinder-api-0" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.395880 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33dae8b1-9611-43f9-8dc3-5bccd2694cca-config\") pod \"dnsmasq-dns-d68b9cb4c-7pd5s\" (UID: \"33dae8b1-9611-43f9-8dc3-5bccd2694cca\") " pod="openstack/dnsmasq-dns-d68b9cb4c-7pd5s" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.395897 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1-config-data-custom\") pod \"cinder-api-0\" (UID: \"bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1\") " pod="openstack/cinder-api-0" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.395921 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1-logs\") pod \"cinder-api-0\" (UID: \"bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1\") " pod="openstack/cinder-api-0" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.396988 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33dae8b1-9611-43f9-8dc3-5bccd2694cca-ovsdbserver-sb\") pod \"dnsmasq-dns-d68b9cb4c-7pd5s\" (UID: \"33dae8b1-9611-43f9-8dc3-5bccd2694cca\") " pod="openstack/dnsmasq-dns-d68b9cb4c-7pd5s" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.397215 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33dae8b1-9611-43f9-8dc3-5bccd2694cca-dns-svc\") pod \"dnsmasq-dns-d68b9cb4c-7pd5s\" (UID: \"33dae8b1-9611-43f9-8dc3-5bccd2694cca\") " 
pod="openstack/dnsmasq-dns-d68b9cb4c-7pd5s" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.397899 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33dae8b1-9611-43f9-8dc3-5bccd2694cca-dns-swift-storage-0\") pod \"dnsmasq-dns-d68b9cb4c-7pd5s\" (UID: \"33dae8b1-9611-43f9-8dc3-5bccd2694cca\") " pod="openstack/dnsmasq-dns-d68b9cb4c-7pd5s" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.397381 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33dae8b1-9611-43f9-8dc3-5bccd2694cca-config\") pod \"dnsmasq-dns-d68b9cb4c-7pd5s\" (UID: \"33dae8b1-9611-43f9-8dc3-5bccd2694cca\") " pod="openstack/dnsmasq-dns-d68b9cb4c-7pd5s" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.398438 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33dae8b1-9611-43f9-8dc3-5bccd2694cca-ovsdbserver-nb\") pod \"dnsmasq-dns-d68b9cb4c-7pd5s\" (UID: \"33dae8b1-9611-43f9-8dc3-5bccd2694cca\") " pod="openstack/dnsmasq-dns-d68b9cb4c-7pd5s" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.422162 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4wwd\" (UniqueName: \"kubernetes.io/projected/33dae8b1-9611-43f9-8dc3-5bccd2694cca-kube-api-access-z4wwd\") pod \"dnsmasq-dns-d68b9cb4c-7pd5s\" (UID: \"33dae8b1-9611-43f9-8dc3-5bccd2694cca\") " pod="openstack/dnsmasq-dns-d68b9cb4c-7pd5s" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.472379 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d68b9cb4c-7pd5s" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.496904 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1\") " pod="openstack/cinder-api-0" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.496987 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1-scripts\") pod \"cinder-api-0\" (UID: \"bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1\") " pod="openstack/cinder-api-0" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.497023 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1-config-data-custom\") pod \"cinder-api-0\" (UID: \"bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1\") " pod="openstack/cinder-api-0" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.497051 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1-logs\") pod \"cinder-api-0\" (UID: \"bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1\") " pod="openstack/cinder-api-0" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.497105 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1\") " pod="openstack/cinder-api-0" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.497128 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4222\" 
(UniqueName: \"kubernetes.io/projected/bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1-kube-api-access-t4222\") pod \"cinder-api-0\" (UID: \"bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1\") " pod="openstack/cinder-api-0" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.497147 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1-config-data\") pod \"cinder-api-0\" (UID: \"bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1\") " pod="openstack/cinder-api-0" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.504302 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1-logs\") pod \"cinder-api-0\" (UID: \"bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1\") " pod="openstack/cinder-api-0" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.505110 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1-config-data\") pod \"cinder-api-0\" (UID: \"bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1\") " pod="openstack/cinder-api-0" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.505188 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1\") " pod="openstack/cinder-api-0" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.511949 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1-scripts\") pod \"cinder-api-0\" (UID: \"bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1\") " pod="openstack/cinder-api-0" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.512563 4715 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1-config-data-custom\") pod \"cinder-api-0\" (UID: \"bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1\") " pod="openstack/cinder-api-0" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.521262 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1\") " pod="openstack/cinder-api-0" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.530855 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4222\" (UniqueName: \"kubernetes.io/projected/bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1-kube-api-access-t4222\") pod \"cinder-api-0\" (UID: \"bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1\") " pod="openstack/cinder-api-0" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.588854 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.629586 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-ppstn" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.709467 4715 generic.go:334] "Generic (PLEG): container finished" podID="027df64c-f87d-401f-965c-88c874a854f8" containerID="371a8fa3d04735ed76bce22928354da20b59bb0f2ca96fad013668dab63567fd" exitCode=0 Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.709524 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-l7r5w" event={"ID":"027df64c-f87d-401f-965c-88c874a854f8","Type":"ContainerDied","Data":"371a8fa3d04735ed76bce22928354da20b59bb0f2ca96fad013668dab63567fd"} Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.716597 4715 generic.go:334] "Generic (PLEG): container finished" podID="5984d04d-832a-4767-8966-96b3edc181e9" containerID="e82ec37ec5d083dd33eedb0e1531f34e5552e22ebd11241861ea95aa708e6d41" exitCode=0 Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.716649 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-ppstn" event={"ID":"5984d04d-832a-4767-8966-96b3edc181e9","Type":"ContainerDied","Data":"e82ec37ec5d083dd33eedb0e1531f34e5552e22ebd11241861ea95aa708e6d41"} Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.716898 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84b966f6c9-ppstn" event={"ID":"5984d04d-832a-4767-8966-96b3edc181e9","Type":"ContainerDied","Data":"4431672a85c02f9340afdfb5e61ecea8f68bb7f028fb30d17171ad3b60cadd89"} Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.717020 4715 scope.go:117] "RemoveContainer" containerID="e82ec37ec5d083dd33eedb0e1531f34e5552e22ebd11241861ea95aa708e6d41" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.717301 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84b966f6c9-ppstn" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.813190 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5984d04d-832a-4767-8966-96b3edc181e9-ovsdbserver-sb\") pod \"5984d04d-832a-4767-8966-96b3edc181e9\" (UID: \"5984d04d-832a-4767-8966-96b3edc181e9\") " Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.813268 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5984d04d-832a-4767-8966-96b3edc181e9-ovsdbserver-nb\") pod \"5984d04d-832a-4767-8966-96b3edc181e9\" (UID: \"5984d04d-832a-4767-8966-96b3edc181e9\") " Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.813340 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5984d04d-832a-4767-8966-96b3edc181e9-dns-svc\") pod \"5984d04d-832a-4767-8966-96b3edc181e9\" (UID: \"5984d04d-832a-4767-8966-96b3edc181e9\") " Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.813368 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5984d04d-832a-4767-8966-96b3edc181e9-dns-swift-storage-0\") pod \"5984d04d-832a-4767-8966-96b3edc181e9\" (UID: \"5984d04d-832a-4767-8966-96b3edc181e9\") " Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.813393 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwxb9\" (UniqueName: \"kubernetes.io/projected/5984d04d-832a-4767-8966-96b3edc181e9-kube-api-access-qwxb9\") pod \"5984d04d-832a-4767-8966-96b3edc181e9\" (UID: \"5984d04d-832a-4767-8966-96b3edc181e9\") " Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.813531 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/5984d04d-832a-4767-8966-96b3edc181e9-config\") pod \"5984d04d-832a-4767-8966-96b3edc181e9\" (UID: \"5984d04d-832a-4767-8966-96b3edc181e9\") " Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.827109 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5984d04d-832a-4767-8966-96b3edc181e9-kube-api-access-qwxb9" (OuterVolumeSpecName: "kube-api-access-qwxb9") pod "5984d04d-832a-4767-8966-96b3edc181e9" (UID: "5984d04d-832a-4767-8966-96b3edc181e9"). InnerVolumeSpecName "kube-api-access-qwxb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.845989 4715 scope.go:117] "RemoveContainer" containerID="203bb0639160619326bf2a1463ea16d0e262f215c0bd217c742e41fc1341a8b9" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.889050 4715 scope.go:117] "RemoveContainer" containerID="e82ec37ec5d083dd33eedb0e1531f34e5552e22ebd11241861ea95aa708e6d41" Oct 09 08:04:15 crc kubenswrapper[4715]: E1009 08:04:15.890807 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e82ec37ec5d083dd33eedb0e1531f34e5552e22ebd11241861ea95aa708e6d41\": container with ID starting with e82ec37ec5d083dd33eedb0e1531f34e5552e22ebd11241861ea95aa708e6d41 not found: ID does not exist" containerID="e82ec37ec5d083dd33eedb0e1531f34e5552e22ebd11241861ea95aa708e6d41" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.890844 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e82ec37ec5d083dd33eedb0e1531f34e5552e22ebd11241861ea95aa708e6d41"} err="failed to get container status \"e82ec37ec5d083dd33eedb0e1531f34e5552e22ebd11241861ea95aa708e6d41\": rpc error: code = NotFound desc = could not find container \"e82ec37ec5d083dd33eedb0e1531f34e5552e22ebd11241861ea95aa708e6d41\": container with ID starting with 
e82ec37ec5d083dd33eedb0e1531f34e5552e22ebd11241861ea95aa708e6d41 not found: ID does not exist" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.890882 4715 scope.go:117] "RemoveContainer" containerID="203bb0639160619326bf2a1463ea16d0e262f215c0bd217c742e41fc1341a8b9" Oct 09 08:04:15 crc kubenswrapper[4715]: E1009 08:04:15.897103 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"203bb0639160619326bf2a1463ea16d0e262f215c0bd217c742e41fc1341a8b9\": container with ID starting with 203bb0639160619326bf2a1463ea16d0e262f215c0bd217c742e41fc1341a8b9 not found: ID does not exist" containerID="203bb0639160619326bf2a1463ea16d0e262f215c0bd217c742e41fc1341a8b9" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.897149 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"203bb0639160619326bf2a1463ea16d0e262f215c0bd217c742e41fc1341a8b9"} err="failed to get container status \"203bb0639160619326bf2a1463ea16d0e262f215c0bd217c742e41fc1341a8b9\": rpc error: code = NotFound desc = could not find container \"203bb0639160619326bf2a1463ea16d0e262f215c0bd217c742e41fc1341a8b9\": container with ID starting with 203bb0639160619326bf2a1463ea16d0e262f215c0bd217c742e41fc1341a8b9 not found: ID does not exist" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.914562 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5984d04d-832a-4767-8966-96b3edc181e9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5984d04d-832a-4767-8966-96b3edc181e9" (UID: "5984d04d-832a-4767-8966-96b3edc181e9"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.917023 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5984d04d-832a-4767-8966-96b3edc181e9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5984d04d-832a-4767-8966-96b3edc181e9" (UID: "5984d04d-832a-4767-8966-96b3edc181e9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.920705 4715 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5984d04d-832a-4767-8966-96b3edc181e9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.920732 4715 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5984d04d-832a-4767-8966-96b3edc181e9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.920741 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwxb9\" (UniqueName: \"kubernetes.io/projected/5984d04d-832a-4767-8966-96b3edc181e9-kube-api-access-qwxb9\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.928668 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5984d04d-832a-4767-8966-96b3edc181e9-config" (OuterVolumeSpecName: "config") pod "5984d04d-832a-4767-8966-96b3edc181e9" (UID: "5984d04d-832a-4767-8966-96b3edc181e9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.942323 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5984d04d-832a-4767-8966-96b3edc181e9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5984d04d-832a-4767-8966-96b3edc181e9" (UID: "5984d04d-832a-4767-8966-96b3edc181e9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.946593 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 08:04:15 crc kubenswrapper[4715]: I1009 08:04:15.950252 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5984d04d-832a-4767-8966-96b3edc181e9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5984d04d-832a-4767-8966-96b3edc181e9" (UID: "5984d04d-832a-4767-8966-96b3edc181e9"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:04:16 crc kubenswrapper[4715]: I1009 08:04:16.029158 4715 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5984d04d-832a-4767-8966-96b3edc181e9-config\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:16 crc kubenswrapper[4715]: I1009 08:04:16.029193 4715 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5984d04d-832a-4767-8966-96b3edc181e9-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:16 crc kubenswrapper[4715]: I1009 08:04:16.029206 4715 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5984d04d-832a-4767-8966-96b3edc181e9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:16 crc kubenswrapper[4715]: I1009 08:04:16.075495 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-ppstn"] Oct 09 08:04:16 crc kubenswrapper[4715]: I1009 08:04:16.086682 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84b966f6c9-ppstn"] Oct 09 08:04:16 crc kubenswrapper[4715]: I1009 08:04:16.104742 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d68b9cb4c-7pd5s"] Oct 09 08:04:16 crc kubenswrapper[4715]: W1009 08:04:16.112840 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33dae8b1_9611_43f9_8dc3_5bccd2694cca.slice/crio-8a3e0d7f43cf00898b5adf8efbd098e2769b0962688b84b9e6854d10c2a925db WatchSource:0}: Error finding container 8a3e0d7f43cf00898b5adf8efbd098e2769b0962688b84b9e6854d10c2a925db: Status 404 returned error can't find the container with id 8a3e0d7f43cf00898b5adf8efbd098e2769b0962688b84b9e6854d10c2a925db Oct 09 08:04:16 crc kubenswrapper[4715]: I1009 08:04:16.148242 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="5984d04d-832a-4767-8966-96b3edc181e9" path="/var/lib/kubelet/pods/5984d04d-832a-4767-8966-96b3edc181e9/volumes" Oct 09 08:04:16 crc kubenswrapper[4715]: I1009 08:04:16.179913 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 09 08:04:16 crc kubenswrapper[4715]: E1009 08:04:16.180364 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5984d04d-832a-4767-8966-96b3edc181e9" containerName="dnsmasq-dns" Oct 09 08:04:16 crc kubenswrapper[4715]: I1009 08:04:16.180386 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="5984d04d-832a-4767-8966-96b3edc181e9" containerName="dnsmasq-dns" Oct 09 08:04:16 crc kubenswrapper[4715]: E1009 08:04:16.180407 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5984d04d-832a-4767-8966-96b3edc181e9" containerName="init" Oct 09 08:04:16 crc kubenswrapper[4715]: I1009 08:04:16.180415 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="5984d04d-832a-4767-8966-96b3edc181e9" containerName="init" Oct 09 08:04:16 crc kubenswrapper[4715]: I1009 08:04:16.180595 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="5984d04d-832a-4767-8966-96b3edc181e9" containerName="dnsmasq-dns" Oct 09 08:04:16 crc kubenswrapper[4715]: I1009 08:04:16.181143 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 09 08:04:16 crc kubenswrapper[4715]: I1009 08:04:16.199165 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 09 08:04:16 crc kubenswrapper[4715]: I1009 08:04:16.199691 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-77csk" Oct 09 08:04:16 crc kubenswrapper[4715]: I1009 08:04:16.199912 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 09 08:04:16 crc kubenswrapper[4715]: I1009 08:04:16.210555 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 09 08:04:16 crc kubenswrapper[4715]: I1009 08:04:16.241010 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ea37012b-c593-4cd0-8501-121c791b2741-openstack-config\") pod \"openstackclient\" (UID: \"ea37012b-c593-4cd0-8501-121c791b2741\") " pod="openstack/openstackclient" Oct 09 08:04:16 crc kubenswrapper[4715]: I1009 08:04:16.241095 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ea37012b-c593-4cd0-8501-121c791b2741-openstack-config-secret\") pod \"openstackclient\" (UID: \"ea37012b-c593-4cd0-8501-121c791b2741\") " pod="openstack/openstackclient" Oct 09 08:04:16 crc kubenswrapper[4715]: I1009 08:04:16.242224 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnvcq\" (UniqueName: \"kubernetes.io/projected/ea37012b-c593-4cd0-8501-121c791b2741-kube-api-access-fnvcq\") pod \"openstackclient\" (UID: \"ea37012b-c593-4cd0-8501-121c791b2741\") " pod="openstack/openstackclient" Oct 09 08:04:16 crc kubenswrapper[4715]: I1009 08:04:16.242337 4715 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea37012b-c593-4cd0-8501-121c791b2741-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ea37012b-c593-4cd0-8501-121c791b2741\") " pod="openstack/openstackclient" Oct 09 08:04:16 crc kubenswrapper[4715]: I1009 08:04:16.293555 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 09 08:04:16 crc kubenswrapper[4715]: I1009 08:04:16.344009 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ea37012b-c593-4cd0-8501-121c791b2741-openstack-config-secret\") pod \"openstackclient\" (UID: \"ea37012b-c593-4cd0-8501-121c791b2741\") " pod="openstack/openstackclient" Oct 09 08:04:16 crc kubenswrapper[4715]: I1009 08:04:16.344112 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnvcq\" (UniqueName: \"kubernetes.io/projected/ea37012b-c593-4cd0-8501-121c791b2741-kube-api-access-fnvcq\") pod \"openstackclient\" (UID: \"ea37012b-c593-4cd0-8501-121c791b2741\") " pod="openstack/openstackclient" Oct 09 08:04:16 crc kubenswrapper[4715]: I1009 08:04:16.344135 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea37012b-c593-4cd0-8501-121c791b2741-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ea37012b-c593-4cd0-8501-121c791b2741\") " pod="openstack/openstackclient" Oct 09 08:04:16 crc kubenswrapper[4715]: I1009 08:04:16.344196 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ea37012b-c593-4cd0-8501-121c791b2741-openstack-config\") pod \"openstackclient\" (UID: \"ea37012b-c593-4cd0-8501-121c791b2741\") " pod="openstack/openstackclient" Oct 09 08:04:16 crc kubenswrapper[4715]: 
I1009 08:04:16.344935 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ea37012b-c593-4cd0-8501-121c791b2741-openstack-config\") pod \"openstackclient\" (UID: \"ea37012b-c593-4cd0-8501-121c791b2741\") " pod="openstack/openstackclient" Oct 09 08:04:16 crc kubenswrapper[4715]: I1009 08:04:16.351765 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea37012b-c593-4cd0-8501-121c791b2741-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ea37012b-c593-4cd0-8501-121c791b2741\") " pod="openstack/openstackclient" Oct 09 08:04:16 crc kubenswrapper[4715]: I1009 08:04:16.351822 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ea37012b-c593-4cd0-8501-121c791b2741-openstack-config-secret\") pod \"openstackclient\" (UID: \"ea37012b-c593-4cd0-8501-121c791b2741\") " pod="openstack/openstackclient" Oct 09 08:04:16 crc kubenswrapper[4715]: I1009 08:04:16.365631 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnvcq\" (UniqueName: \"kubernetes.io/projected/ea37012b-c593-4cd0-8501-121c791b2741-kube-api-access-fnvcq\") pod \"openstackclient\" (UID: \"ea37012b-c593-4cd0-8501-121c791b2741\") " pod="openstack/openstackclient" Oct 09 08:04:16 crc kubenswrapper[4715]: I1009 08:04:16.539891 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 09 08:04:16 crc kubenswrapper[4715]: I1009 08:04:16.738046 4715 generic.go:334] "Generic (PLEG): container finished" podID="33dae8b1-9611-43f9-8dc3-5bccd2694cca" containerID="446de50489e25be996e97280afffa0f71d04e7c14c0e8f53f3e27441a2a21134" exitCode=0 Oct 09 08:04:16 crc kubenswrapper[4715]: I1009 08:04:16.738231 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d68b9cb4c-7pd5s" event={"ID":"33dae8b1-9611-43f9-8dc3-5bccd2694cca","Type":"ContainerDied","Data":"446de50489e25be996e97280afffa0f71d04e7c14c0e8f53f3e27441a2a21134"} Oct 09 08:04:16 crc kubenswrapper[4715]: I1009 08:04:16.738330 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d68b9cb4c-7pd5s" event={"ID":"33dae8b1-9611-43f9-8dc3-5bccd2694cca","Type":"ContainerStarted","Data":"8a3e0d7f43cf00898b5adf8efbd098e2769b0962688b84b9e6854d10c2a925db"} Oct 09 08:04:16 crc kubenswrapper[4715]: I1009 08:04:16.751540 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1","Type":"ContainerStarted","Data":"3adafb442cff98b91fcb7ceca209213035502c2c300f876e71bfa496793fe201"} Oct 09 08:04:16 crc kubenswrapper[4715]: I1009 08:04:16.760582 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"08ca7802-fc81-4258-bfef-c598c9d65b2f","Type":"ContainerStarted","Data":"48b1600f361bb4d0b91a695104162349d982aea78c6e3b8e5d68fd50a8ef86fa"} Oct 09 08:04:17 crc kubenswrapper[4715]: I1009 08:04:17.099188 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 09 08:04:17 crc kubenswrapper[4715]: I1009 08:04:17.185601 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-l7r5w" Oct 09 08:04:17 crc kubenswrapper[4715]: I1009 08:04:17.263266 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/027df64c-f87d-401f-965c-88c874a854f8-db-sync-config-data\") pod \"027df64c-f87d-401f-965c-88c874a854f8\" (UID: \"027df64c-f87d-401f-965c-88c874a854f8\") " Oct 09 08:04:17 crc kubenswrapper[4715]: I1009 08:04:17.264564 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcjd4\" (UniqueName: \"kubernetes.io/projected/027df64c-f87d-401f-965c-88c874a854f8-kube-api-access-kcjd4\") pod \"027df64c-f87d-401f-965c-88c874a854f8\" (UID: \"027df64c-f87d-401f-965c-88c874a854f8\") " Oct 09 08:04:17 crc kubenswrapper[4715]: I1009 08:04:17.264586 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/027df64c-f87d-401f-965c-88c874a854f8-combined-ca-bundle\") pod \"027df64c-f87d-401f-965c-88c874a854f8\" (UID: \"027df64c-f87d-401f-965c-88c874a854f8\") " Oct 09 08:04:17 crc kubenswrapper[4715]: I1009 08:04:17.273931 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/027df64c-f87d-401f-965c-88c874a854f8-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "027df64c-f87d-401f-965c-88c874a854f8" (UID: "027df64c-f87d-401f-965c-88c874a854f8"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:17 crc kubenswrapper[4715]: I1009 08:04:17.291549 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 09 08:04:17 crc kubenswrapper[4715]: I1009 08:04:17.300949 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/027df64c-f87d-401f-965c-88c874a854f8-kube-api-access-kcjd4" (OuterVolumeSpecName: "kube-api-access-kcjd4") pod "027df64c-f87d-401f-965c-88c874a854f8" (UID: "027df64c-f87d-401f-965c-88c874a854f8"). InnerVolumeSpecName "kube-api-access-kcjd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:04:17 crc kubenswrapper[4715]: I1009 08:04:17.359754 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/027df64c-f87d-401f-965c-88c874a854f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "027df64c-f87d-401f-965c-88c874a854f8" (UID: "027df64c-f87d-401f-965c-88c874a854f8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:17 crc kubenswrapper[4715]: I1009 08:04:17.372190 4715 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/027df64c-f87d-401f-965c-88c874a854f8-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:17 crc kubenswrapper[4715]: I1009 08:04:17.372219 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcjd4\" (UniqueName: \"kubernetes.io/projected/027df64c-f87d-401f-965c-88c874a854f8-kube-api-access-kcjd4\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:17 crc kubenswrapper[4715]: I1009 08:04:17.372231 4715 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/027df64c-f87d-401f-965c-88c874a854f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:17 crc kubenswrapper[4715]: I1009 08:04:17.790307 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1","Type":"ContainerStarted","Data":"95c60e2321159c965ec846f2cada7cf74116865a2c66290d82da81be8d174949"} Oct 09 08:04:17 crc kubenswrapper[4715]: I1009 08:04:17.792003 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"ea37012b-c593-4cd0-8501-121c791b2741","Type":"ContainerStarted","Data":"0001d081869930b56c951f428c5d30d47601a02edfa8e12ffa3e4af21654ebb4"} Oct 09 08:04:17 crc kubenswrapper[4715]: I1009 08:04:17.795018 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-l7r5w" event={"ID":"027df64c-f87d-401f-965c-88c874a854f8","Type":"ContainerDied","Data":"a98fe117ed6bbf1e633017e7b9dfff0e2865365b0d98c8774243153238fc8771"} Oct 09 08:04:17 crc kubenswrapper[4715]: I1009 08:04:17.795061 4715 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="a98fe117ed6bbf1e633017e7b9dfff0e2865365b0d98c8774243153238fc8771" Oct 09 08:04:17 crc kubenswrapper[4715]: I1009 08:04:17.795124 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-l7r5w" Oct 09 08:04:17 crc kubenswrapper[4715]: I1009 08:04:17.805967 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d68b9cb4c-7pd5s" event={"ID":"33dae8b1-9611-43f9-8dc3-5bccd2694cca","Type":"ContainerStarted","Data":"684413e3233be95c46d8416055ffa05873471fa6d52480b9fc4e0d43c852c283"} Oct 09 08:04:17 crc kubenswrapper[4715]: I1009 08:04:17.806293 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d68b9cb4c-7pd5s" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.059862 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d68b9cb4c-7pd5s" podStartSLOduration=3.059829295 podStartE2EDuration="3.059829295s" podCreationTimestamp="2025-10-09 08:04:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:04:17.827582511 +0000 UTC m=+1088.520386539" watchObservedRunningTime="2025-10-09 08:04:18.059829295 +0000 UTC m=+1088.752633303" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.071373 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6ff4c65c6c-97dbp"] Oct 09 08:04:18 crc kubenswrapper[4715]: E1009 08:04:18.071791 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="027df64c-f87d-401f-965c-88c874a854f8" containerName="barbican-db-sync" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.071808 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="027df64c-f87d-401f-965c-88c874a854f8" containerName="barbican-db-sync" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.071967 4715 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="027df64c-f87d-401f-965c-88c874a854f8" containerName="barbican-db-sync" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.072948 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6ff4c65c6c-97dbp" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.088061 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.088533 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-ffhx7" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.088873 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.111562 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6ff4c65c6c-97dbp"] Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.177685 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-58cfd869c4-4djjf"] Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.178978 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-58cfd869c4-4djjf" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.182729 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.191738 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d84057b-c735-4df6-a20d-ef88cccb44fe-config-data-custom\") pod \"barbican-worker-6ff4c65c6c-97dbp\" (UID: \"6d84057b-c735-4df6-a20d-ef88cccb44fe\") " pod="openstack/barbican-worker-6ff4c65c6c-97dbp" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.191784 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d84057b-c735-4df6-a20d-ef88cccb44fe-logs\") pod \"barbican-worker-6ff4c65c6c-97dbp\" (UID: \"6d84057b-c735-4df6-a20d-ef88cccb44fe\") " pod="openstack/barbican-worker-6ff4c65c6c-97dbp" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.191849 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d84057b-c735-4df6-a20d-ef88cccb44fe-combined-ca-bundle\") pod \"barbican-worker-6ff4c65c6c-97dbp\" (UID: \"6d84057b-c735-4df6-a20d-ef88cccb44fe\") " pod="openstack/barbican-worker-6ff4c65c6c-97dbp" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.191870 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d84057b-c735-4df6-a20d-ef88cccb44fe-config-data\") pod \"barbican-worker-6ff4c65c6c-97dbp\" (UID: \"6d84057b-c735-4df6-a20d-ef88cccb44fe\") " pod="openstack/barbican-worker-6ff4c65c6c-97dbp" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.191924 4715 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtdzr\" (UniqueName: \"kubernetes.io/projected/6d84057b-c735-4df6-a20d-ef88cccb44fe-kube-api-access-rtdzr\") pod \"barbican-worker-6ff4c65c6c-97dbp\" (UID: \"6d84057b-c735-4df6-a20d-ef88cccb44fe\") " pod="openstack/barbican-worker-6ff4c65c6c-97dbp" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.211712 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-58cfd869c4-4djjf"] Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.244534 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d68b9cb4c-7pd5s"] Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.293333 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d84057b-c735-4df6-a20d-ef88cccb44fe-logs\") pod \"barbican-worker-6ff4c65c6c-97dbp\" (UID: \"6d84057b-c735-4df6-a20d-ef88cccb44fe\") " pod="openstack/barbican-worker-6ff4c65c6c-97dbp" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.293443 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50cf187d-781d-49b7-840b-8dfc3366135f-config-data-custom\") pod \"barbican-keystone-listener-58cfd869c4-4djjf\" (UID: \"50cf187d-781d-49b7-840b-8dfc3366135f\") " pod="openstack/barbican-keystone-listener-58cfd869c4-4djjf" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.293468 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50cf187d-781d-49b7-840b-8dfc3366135f-config-data\") pod \"barbican-keystone-listener-58cfd869c4-4djjf\" (UID: \"50cf187d-781d-49b7-840b-8dfc3366135f\") " pod="openstack/barbican-keystone-listener-58cfd869c4-4djjf" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 
08:04:18.293515 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d84057b-c735-4df6-a20d-ef88cccb44fe-combined-ca-bundle\") pod \"barbican-worker-6ff4c65c6c-97dbp\" (UID: \"6d84057b-c735-4df6-a20d-ef88cccb44fe\") " pod="openstack/barbican-worker-6ff4c65c6c-97dbp" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.293536 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d84057b-c735-4df6-a20d-ef88cccb44fe-config-data\") pod \"barbican-worker-6ff4c65c6c-97dbp\" (UID: \"6d84057b-c735-4df6-a20d-ef88cccb44fe\") " pod="openstack/barbican-worker-6ff4c65c6c-97dbp" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.293553 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z66w2\" (UniqueName: \"kubernetes.io/projected/50cf187d-781d-49b7-840b-8dfc3366135f-kube-api-access-z66w2\") pod \"barbican-keystone-listener-58cfd869c4-4djjf\" (UID: \"50cf187d-781d-49b7-840b-8dfc3366135f\") " pod="openstack/barbican-keystone-listener-58cfd869c4-4djjf" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.293635 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50cf187d-781d-49b7-840b-8dfc3366135f-logs\") pod \"barbican-keystone-listener-58cfd869c4-4djjf\" (UID: \"50cf187d-781d-49b7-840b-8dfc3366135f\") " pod="openstack/barbican-keystone-listener-58cfd869c4-4djjf" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.293672 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtdzr\" (UniqueName: \"kubernetes.io/projected/6d84057b-c735-4df6-a20d-ef88cccb44fe-kube-api-access-rtdzr\") pod \"barbican-worker-6ff4c65c6c-97dbp\" (UID: \"6d84057b-c735-4df6-a20d-ef88cccb44fe\") " 
pod="openstack/barbican-worker-6ff4c65c6c-97dbp" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.293713 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50cf187d-781d-49b7-840b-8dfc3366135f-combined-ca-bundle\") pod \"barbican-keystone-listener-58cfd869c4-4djjf\" (UID: \"50cf187d-781d-49b7-840b-8dfc3366135f\") " pod="openstack/barbican-keystone-listener-58cfd869c4-4djjf" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.293734 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d84057b-c735-4df6-a20d-ef88cccb44fe-config-data-custom\") pod \"barbican-worker-6ff4c65c6c-97dbp\" (UID: \"6d84057b-c735-4df6-a20d-ef88cccb44fe\") " pod="openstack/barbican-worker-6ff4c65c6c-97dbp" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.294370 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d84057b-c735-4df6-a20d-ef88cccb44fe-logs\") pod \"barbican-worker-6ff4c65c6c-97dbp\" (UID: \"6d84057b-c735-4df6-a20d-ef88cccb44fe\") " pod="openstack/barbican-worker-6ff4c65c6c-97dbp" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.304503 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-59756554bd-9q7xp"] Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.309995 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d84057b-c735-4df6-a20d-ef88cccb44fe-combined-ca-bundle\") pod \"barbican-worker-6ff4c65c6c-97dbp\" (UID: \"6d84057b-c735-4df6-a20d-ef88cccb44fe\") " pod="openstack/barbican-worker-6ff4c65c6c-97dbp" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.320079 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/6d84057b-c735-4df6-a20d-ef88cccb44fe-config-data-custom\") pod \"barbican-worker-6ff4c65c6c-97dbp\" (UID: \"6d84057b-c735-4df6-a20d-ef88cccb44fe\") " pod="openstack/barbican-worker-6ff4c65c6c-97dbp" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.327148 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-5kjqv"] Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.332588 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtdzr\" (UniqueName: \"kubernetes.io/projected/6d84057b-c735-4df6-a20d-ef88cccb44fe-kube-api-access-rtdzr\") pod \"barbican-worker-6ff4c65c6c-97dbp\" (UID: \"6d84057b-c735-4df6-a20d-ef88cccb44fe\") " pod="openstack/barbican-worker-6ff4c65c6c-97dbp" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.333684 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-59756554bd-9q7xp" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.336310 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.336818 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d84057b-c735-4df6-a20d-ef88cccb44fe-config-data\") pod \"barbican-worker-6ff4c65c6c-97dbp\" (UID: \"6d84057b-c735-4df6-a20d-ef88cccb44fe\") " pod="openstack/barbican-worker-6ff4c65c6c-97dbp" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.364761 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-59756554bd-9q7xp"] Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.364808 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-5kjqv"] Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.364939 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-5kjqv" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.401029 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z66w2\" (UniqueName: \"kubernetes.io/projected/50cf187d-781d-49b7-840b-8dfc3366135f-kube-api-access-z66w2\") pod \"barbican-keystone-listener-58cfd869c4-4djjf\" (UID: \"50cf187d-781d-49b7-840b-8dfc3366135f\") " pod="openstack/barbican-keystone-listener-58cfd869c4-4djjf" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.401097 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5823462b-a07e-4525-9be8-370dce870498-logs\") pod \"barbican-api-59756554bd-9q7xp\" (UID: \"5823462b-a07e-4525-9be8-370dce870498\") " pod="openstack/barbican-api-59756554bd-9q7xp" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.401130 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50cf187d-781d-49b7-840b-8dfc3366135f-logs\") pod \"barbican-keystone-listener-58cfd869c4-4djjf\" (UID: \"50cf187d-781d-49b7-840b-8dfc3366135f\") " pod="openstack/barbican-keystone-listener-58cfd869c4-4djjf" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.401164 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5823462b-a07e-4525-9be8-370dce870498-config-data-custom\") pod \"barbican-api-59756554bd-9q7xp\" (UID: \"5823462b-a07e-4525-9be8-370dce870498\") " pod="openstack/barbican-api-59756554bd-9q7xp" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.401212 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5823462b-a07e-4525-9be8-370dce870498-combined-ca-bundle\") pod 
\"barbican-api-59756554bd-9q7xp\" (UID: \"5823462b-a07e-4525-9be8-370dce870498\") " pod="openstack/barbican-api-59756554bd-9q7xp" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.401240 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50cf187d-781d-49b7-840b-8dfc3366135f-combined-ca-bundle\") pod \"barbican-keystone-listener-58cfd869c4-4djjf\" (UID: \"50cf187d-781d-49b7-840b-8dfc3366135f\") " pod="openstack/barbican-keystone-listener-58cfd869c4-4djjf" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.401298 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgpvz\" (UniqueName: \"kubernetes.io/projected/5823462b-a07e-4525-9be8-370dce870498-kube-api-access-pgpvz\") pod \"barbican-api-59756554bd-9q7xp\" (UID: \"5823462b-a07e-4525-9be8-370dce870498\") " pod="openstack/barbican-api-59756554bd-9q7xp" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.401330 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5823462b-a07e-4525-9be8-370dce870498-config-data\") pod \"barbican-api-59756554bd-9q7xp\" (UID: \"5823462b-a07e-4525-9be8-370dce870498\") " pod="openstack/barbican-api-59756554bd-9q7xp" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.401358 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50cf187d-781d-49b7-840b-8dfc3366135f-config-data-custom\") pod \"barbican-keystone-listener-58cfd869c4-4djjf\" (UID: \"50cf187d-781d-49b7-840b-8dfc3366135f\") " pod="openstack/barbican-keystone-listener-58cfd869c4-4djjf" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.401376 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/50cf187d-781d-49b7-840b-8dfc3366135f-config-data\") pod \"barbican-keystone-listener-58cfd869c4-4djjf\" (UID: \"50cf187d-781d-49b7-840b-8dfc3366135f\") " pod="openstack/barbican-keystone-listener-58cfd869c4-4djjf" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.408403 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50cf187d-781d-49b7-840b-8dfc3366135f-logs\") pod \"barbican-keystone-listener-58cfd869c4-4djjf\" (UID: \"50cf187d-781d-49b7-840b-8dfc3366135f\") " pod="openstack/barbican-keystone-listener-58cfd869c4-4djjf" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.416495 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50cf187d-781d-49b7-840b-8dfc3366135f-combined-ca-bundle\") pod \"barbican-keystone-listener-58cfd869c4-4djjf\" (UID: \"50cf187d-781d-49b7-840b-8dfc3366135f\") " pod="openstack/barbican-keystone-listener-58cfd869c4-4djjf" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.424821 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50cf187d-781d-49b7-840b-8dfc3366135f-config-data\") pod \"barbican-keystone-listener-58cfd869c4-4djjf\" (UID: \"50cf187d-781d-49b7-840b-8dfc3366135f\") " pod="openstack/barbican-keystone-listener-58cfd869c4-4djjf" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.424903 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50cf187d-781d-49b7-840b-8dfc3366135f-config-data-custom\") pod \"barbican-keystone-listener-58cfd869c4-4djjf\" (UID: \"50cf187d-781d-49b7-840b-8dfc3366135f\") " pod="openstack/barbican-keystone-listener-58cfd869c4-4djjf" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.434855 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-z66w2\" (UniqueName: \"kubernetes.io/projected/50cf187d-781d-49b7-840b-8dfc3366135f-kube-api-access-z66w2\") pod \"barbican-keystone-listener-58cfd869c4-4djjf\" (UID: \"50cf187d-781d-49b7-840b-8dfc3366135f\") " pod="openstack/barbican-keystone-listener-58cfd869c4-4djjf" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.457394 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6ff4c65c6c-97dbp" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.505565 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37af9a61-ef4d-476b-978d-ca780888d042-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-5kjqv\" (UID: \"37af9a61-ef4d-476b-978d-ca780888d042\") " pod="openstack/dnsmasq-dns-5784cf869f-5kjqv" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.505623 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5823462b-a07e-4525-9be8-370dce870498-config-data-custom\") pod \"barbican-api-59756554bd-9q7xp\" (UID: \"5823462b-a07e-4525-9be8-370dce870498\") " pod="openstack/barbican-api-59756554bd-9q7xp" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.505654 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37af9a61-ef4d-476b-978d-ca780888d042-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-5kjqv\" (UID: \"37af9a61-ef4d-476b-978d-ca780888d042\") " pod="openstack/dnsmasq-dns-5784cf869f-5kjqv" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.505682 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5823462b-a07e-4525-9be8-370dce870498-combined-ca-bundle\") pod \"barbican-api-59756554bd-9q7xp\" 
(UID: \"5823462b-a07e-4525-9be8-370dce870498\") " pod="openstack/barbican-api-59756554bd-9q7xp" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.505727 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37af9a61-ef4d-476b-978d-ca780888d042-dns-svc\") pod \"dnsmasq-dns-5784cf869f-5kjqv\" (UID: \"37af9a61-ef4d-476b-978d-ca780888d042\") " pod="openstack/dnsmasq-dns-5784cf869f-5kjqv" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.505766 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37af9a61-ef4d-476b-978d-ca780888d042-config\") pod \"dnsmasq-dns-5784cf869f-5kjqv\" (UID: \"37af9a61-ef4d-476b-978d-ca780888d042\") " pod="openstack/dnsmasq-dns-5784cf869f-5kjqv" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.505786 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgpvz\" (UniqueName: \"kubernetes.io/projected/5823462b-a07e-4525-9be8-370dce870498-kube-api-access-pgpvz\") pod \"barbican-api-59756554bd-9q7xp\" (UID: \"5823462b-a07e-4525-9be8-370dce870498\") " pod="openstack/barbican-api-59756554bd-9q7xp" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.505805 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5823462b-a07e-4525-9be8-370dce870498-config-data\") pod \"barbican-api-59756554bd-9q7xp\" (UID: \"5823462b-a07e-4525-9be8-370dce870498\") " pod="openstack/barbican-api-59756554bd-9q7xp" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.505848 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmvcn\" (UniqueName: \"kubernetes.io/projected/37af9a61-ef4d-476b-978d-ca780888d042-kube-api-access-fmvcn\") pod \"dnsmasq-dns-5784cf869f-5kjqv\" 
(UID: \"37af9a61-ef4d-476b-978d-ca780888d042\") " pod="openstack/dnsmasq-dns-5784cf869f-5kjqv" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.505935 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5823462b-a07e-4525-9be8-370dce870498-logs\") pod \"barbican-api-59756554bd-9q7xp\" (UID: \"5823462b-a07e-4525-9be8-370dce870498\") " pod="openstack/barbican-api-59756554bd-9q7xp" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.505954 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37af9a61-ef4d-476b-978d-ca780888d042-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-5kjqv\" (UID: \"37af9a61-ef4d-476b-978d-ca780888d042\") " pod="openstack/dnsmasq-dns-5784cf869f-5kjqv" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.510148 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5823462b-a07e-4525-9be8-370dce870498-logs\") pod \"barbican-api-59756554bd-9q7xp\" (UID: \"5823462b-a07e-4525-9be8-370dce870498\") " pod="openstack/barbican-api-59756554bd-9q7xp" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.517903 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5823462b-a07e-4525-9be8-370dce870498-config-data-custom\") pod \"barbican-api-59756554bd-9q7xp\" (UID: \"5823462b-a07e-4525-9be8-370dce870498\") " pod="openstack/barbican-api-59756554bd-9q7xp" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.523512 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5823462b-a07e-4525-9be8-370dce870498-combined-ca-bundle\") pod \"barbican-api-59756554bd-9q7xp\" (UID: \"5823462b-a07e-4525-9be8-370dce870498\") " 
pod="openstack/barbican-api-59756554bd-9q7xp" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.523887 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-58cfd869c4-4djjf" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.528314 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5823462b-a07e-4525-9be8-370dce870498-config-data\") pod \"barbican-api-59756554bd-9q7xp\" (UID: \"5823462b-a07e-4525-9be8-370dce870498\") " pod="openstack/barbican-api-59756554bd-9q7xp" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.533199 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgpvz\" (UniqueName: \"kubernetes.io/projected/5823462b-a07e-4525-9be8-370dce870498-kube-api-access-pgpvz\") pod \"barbican-api-59756554bd-9q7xp\" (UID: \"5823462b-a07e-4525-9be8-370dce870498\") " pod="openstack/barbican-api-59756554bd-9q7xp" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.559115 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-59756554bd-9q7xp" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.615996 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37af9a61-ef4d-476b-978d-ca780888d042-config\") pod \"dnsmasq-dns-5784cf869f-5kjqv\" (UID: \"37af9a61-ef4d-476b-978d-ca780888d042\") " pod="openstack/dnsmasq-dns-5784cf869f-5kjqv" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.616036 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmvcn\" (UniqueName: \"kubernetes.io/projected/37af9a61-ef4d-476b-978d-ca780888d042-kube-api-access-fmvcn\") pod \"dnsmasq-dns-5784cf869f-5kjqv\" (UID: \"37af9a61-ef4d-476b-978d-ca780888d042\") " pod="openstack/dnsmasq-dns-5784cf869f-5kjqv" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.616124 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37af9a61-ef4d-476b-978d-ca780888d042-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-5kjqv\" (UID: \"37af9a61-ef4d-476b-978d-ca780888d042\") " pod="openstack/dnsmasq-dns-5784cf869f-5kjqv" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.616160 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37af9a61-ef4d-476b-978d-ca780888d042-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-5kjqv\" (UID: \"37af9a61-ef4d-476b-978d-ca780888d042\") " pod="openstack/dnsmasq-dns-5784cf869f-5kjqv" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.616191 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37af9a61-ef4d-476b-978d-ca780888d042-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-5kjqv\" (UID: \"37af9a61-ef4d-476b-978d-ca780888d042\") " 
pod="openstack/dnsmasq-dns-5784cf869f-5kjqv" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.616232 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37af9a61-ef4d-476b-978d-ca780888d042-dns-svc\") pod \"dnsmasq-dns-5784cf869f-5kjqv\" (UID: \"37af9a61-ef4d-476b-978d-ca780888d042\") " pod="openstack/dnsmasq-dns-5784cf869f-5kjqv" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.617114 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37af9a61-ef4d-476b-978d-ca780888d042-config\") pod \"dnsmasq-dns-5784cf869f-5kjqv\" (UID: \"37af9a61-ef4d-476b-978d-ca780888d042\") " pod="openstack/dnsmasq-dns-5784cf869f-5kjqv" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.621184 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37af9a61-ef4d-476b-978d-ca780888d042-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-5kjqv\" (UID: \"37af9a61-ef4d-476b-978d-ca780888d042\") " pod="openstack/dnsmasq-dns-5784cf869f-5kjqv" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.622486 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37af9a61-ef4d-476b-978d-ca780888d042-dns-svc\") pod \"dnsmasq-dns-5784cf869f-5kjqv\" (UID: \"37af9a61-ef4d-476b-978d-ca780888d042\") " pod="openstack/dnsmasq-dns-5784cf869f-5kjqv" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.622572 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37af9a61-ef4d-476b-978d-ca780888d042-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-5kjqv\" (UID: \"37af9a61-ef4d-476b-978d-ca780888d042\") " pod="openstack/dnsmasq-dns-5784cf869f-5kjqv" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.623856 4715 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37af9a61-ef4d-476b-978d-ca780888d042-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-5kjqv\" (UID: \"37af9a61-ef4d-476b-978d-ca780888d042\") " pod="openstack/dnsmasq-dns-5784cf869f-5kjqv" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.636160 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmvcn\" (UniqueName: \"kubernetes.io/projected/37af9a61-ef4d-476b-978d-ca780888d042-kube-api-access-fmvcn\") pod \"dnsmasq-dns-5784cf869f-5kjqv\" (UID: \"37af9a61-ef4d-476b-978d-ca780888d042\") " pod="openstack/dnsmasq-dns-5784cf869f-5kjqv" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.860510 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1","Type":"ContainerStarted","Data":"94d8dacc662eaa295db01efc390e226d3170428c66e139e3f6d7bce7507f19ba"} Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.860960 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1" containerName="cinder-api-log" containerID="cri-o://95c60e2321159c965ec846f2cada7cf74116865a2c66290d82da81be8d174949" gracePeriod=30 Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.861055 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1" containerName="cinder-api" containerID="cri-o://94d8dacc662eaa295db01efc390e226d3170428c66e139e3f6d7bce7507f19ba" gracePeriod=30 Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.861292 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.872866 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-5kjqv" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.873937 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"08ca7802-fc81-4258-bfef-c598c9d65b2f","Type":"ContainerStarted","Data":"0dafd83d1e12bb3fb83b77412a258914b924b61fd2d3081e52340af48f99d474"} Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.887239 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.887213844 podStartE2EDuration="3.887213844s" podCreationTimestamp="2025-10-09 08:04:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:04:18.885543455 +0000 UTC m=+1089.578347493" watchObservedRunningTime="2025-10-09 08:04:18.887213844 +0000 UTC m=+1089.580017852" Oct 09 08:04:18 crc kubenswrapper[4715]: I1009 08:04:18.900204 4715 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5d9885b95b-r2cb2" podUID="ada9982a-fc5f-4c93-bfa3-3401c0824c2e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Oct 09 08:04:19 crc kubenswrapper[4715]: I1009 08:04:19.106946 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-58cfd869c4-4djjf"] Oct 09 08:04:19 crc kubenswrapper[4715]: W1009 08:04:19.118957 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50cf187d_781d_49b7_840b_8dfc3366135f.slice/crio-fd6abd18fe1575c1845692244ffc26b737ba5e049fb3d6543f6f965460a1aabb WatchSource:0}: Error finding container fd6abd18fe1575c1845692244ffc26b737ba5e049fb3d6543f6f965460a1aabb: Status 404 returned error can't find the container with id 
fd6abd18fe1575c1845692244ffc26b737ba5e049fb3d6543f6f965460a1aabb Oct 09 08:04:19 crc kubenswrapper[4715]: I1009 08:04:19.121102 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6ff4c65c6c-97dbp"] Oct 09 08:04:19 crc kubenswrapper[4715]: I1009 08:04:19.316931 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-59756554bd-9q7xp"] Oct 09 08:04:19 crc kubenswrapper[4715]: I1009 08:04:19.498993 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-5kjqv"] Oct 09 08:04:19 crc kubenswrapper[4715]: I1009 08:04:19.857878 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 09 08:04:19 crc kubenswrapper[4715]: I1009 08:04:19.916574 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6ff4c65c6c-97dbp" event={"ID":"6d84057b-c735-4df6-a20d-ef88cccb44fe","Type":"ContainerStarted","Data":"b009eb247977c02c215344ee36d5dafe6314bba9bc885482b1af87f6cbf5ca43"} Oct 09 08:04:19 crc kubenswrapper[4715]: I1009 08:04:19.921919 4715 generic.go:334] "Generic (PLEG): container finished" podID="bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1" containerID="94d8dacc662eaa295db01efc390e226d3170428c66e139e3f6d7bce7507f19ba" exitCode=0 Oct 09 08:04:19 crc kubenswrapper[4715]: I1009 08:04:19.921948 4715 generic.go:334] "Generic (PLEG): container finished" podID="bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1" containerID="95c60e2321159c965ec846f2cada7cf74116865a2c66290d82da81be8d174949" exitCode=143 Oct 09 08:04:19 crc kubenswrapper[4715]: I1009 08:04:19.921991 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1","Type":"ContainerDied","Data":"94d8dacc662eaa295db01efc390e226d3170428c66e139e3f6d7bce7507f19ba"} Oct 09 08:04:19 crc kubenswrapper[4715]: I1009 08:04:19.922018 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-api-0" event={"ID":"bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1","Type":"ContainerDied","Data":"95c60e2321159c965ec846f2cada7cf74116865a2c66290d82da81be8d174949"} Oct 09 08:04:19 crc kubenswrapper[4715]: I1009 08:04:19.922028 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1","Type":"ContainerDied","Data":"3adafb442cff98b91fcb7ceca209213035502c2c300f876e71bfa496793fe201"} Oct 09 08:04:19 crc kubenswrapper[4715]: I1009 08:04:19.922043 4715 scope.go:117] "RemoveContainer" containerID="94d8dacc662eaa295db01efc390e226d3170428c66e139e3f6d7bce7507f19ba" Oct 09 08:04:19 crc kubenswrapper[4715]: I1009 08:04:19.922150 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 09 08:04:19 crc kubenswrapper[4715]: I1009 08:04:19.937099 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"08ca7802-fc81-4258-bfef-c598c9d65b2f","Type":"ContainerStarted","Data":"a29fbe68284573f78517485f7f15db808ab704e8651398c128ad76625bfd0d10"} Oct 09 08:04:19 crc kubenswrapper[4715]: I1009 08:04:19.941737 4715 generic.go:334] "Generic (PLEG): container finished" podID="37af9a61-ef4d-476b-978d-ca780888d042" containerID="fc614e5e991ca86933f54913c851b597dbc1cbfde4ce77363a916f0b909fe727" exitCode=0 Oct 09 08:04:19 crc kubenswrapper[4715]: I1009 08:04:19.941828 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-5kjqv" event={"ID":"37af9a61-ef4d-476b-978d-ca780888d042","Type":"ContainerDied","Data":"fc614e5e991ca86933f54913c851b597dbc1cbfde4ce77363a916f0b909fe727"} Oct 09 08:04:19 crc kubenswrapper[4715]: I1009 08:04:19.941870 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-5kjqv" 
event={"ID":"37af9a61-ef4d-476b-978d-ca780888d042","Type":"ContainerStarted","Data":"bc056ea56d8632c6e21762e19a1d549401cb76159772e85196da8a5db2a320bb"} Oct 09 08:04:19 crc kubenswrapper[4715]: I1009 08:04:19.944141 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59756554bd-9q7xp" event={"ID":"5823462b-a07e-4525-9be8-370dce870498","Type":"ContainerStarted","Data":"16fdac3fb6c5d655e385897f80db5d30b67e106a81ba59d1fd09b8e88ae2486d"} Oct 09 08:04:19 crc kubenswrapper[4715]: I1009 08:04:19.944171 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59756554bd-9q7xp" event={"ID":"5823462b-a07e-4525-9be8-370dce870498","Type":"ContainerStarted","Data":"b330576b3c5059ffffc82bc01fcecf1fa46805e23de4eacfcaa2cec2d4177551"} Oct 09 08:04:19 crc kubenswrapper[4715]: I1009 08:04:19.945608 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-58cfd869c4-4djjf" event={"ID":"50cf187d-781d-49b7-840b-8dfc3366135f","Type":"ContainerStarted","Data":"fd6abd18fe1575c1845692244ffc26b737ba5e049fb3d6543f6f965460a1aabb"} Oct 09 08:04:19 crc kubenswrapper[4715]: I1009 08:04:19.945769 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d68b9cb4c-7pd5s" podUID="33dae8b1-9611-43f9-8dc3-5bccd2694cca" containerName="dnsmasq-dns" containerID="cri-o://684413e3233be95c46d8416055ffa05873471fa6d52480b9fc4e0d43c852c283" gracePeriod=10 Oct 09 08:04:19 crc kubenswrapper[4715]: I1009 08:04:19.959879 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4222\" (UniqueName: \"kubernetes.io/projected/bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1-kube-api-access-t4222\") pod \"bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1\" (UID: \"bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1\") " Oct 09 08:04:19 crc kubenswrapper[4715]: I1009 08:04:19.960014 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1-config-data-custom\") pod \"bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1\" (UID: \"bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1\") " Oct 09 08:04:19 crc kubenswrapper[4715]: I1009 08:04:19.960038 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1-combined-ca-bundle\") pod \"bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1\" (UID: \"bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1\") " Oct 09 08:04:19 crc kubenswrapper[4715]: I1009 08:04:19.960059 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1-config-data\") pod \"bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1\" (UID: \"bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1\") " Oct 09 08:04:19 crc kubenswrapper[4715]: I1009 08:04:19.960085 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1-scripts\") pod \"bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1\" (UID: \"bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1\") " Oct 09 08:04:19 crc kubenswrapper[4715]: I1009 08:04:19.960151 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1-logs\") pod \"bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1\" (UID: \"bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1\") " Oct 09 08:04:19 crc kubenswrapper[4715]: I1009 08:04:19.960325 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1-etc-machine-id\") pod \"bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1\" (UID: \"bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1\") " Oct 09 08:04:19 crc kubenswrapper[4715]: I1009 
08:04:19.960812 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1" (UID: "bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 08:04:19 crc kubenswrapper[4715]: I1009 08:04:19.961019 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.7766361790000005 podStartE2EDuration="5.96100064s" podCreationTimestamp="2025-10-09 08:04:14 +0000 UTC" firstStartedPulling="2025-10-09 08:04:15.967953686 +0000 UTC m=+1086.660757694" lastFinishedPulling="2025-10-09 08:04:17.152318147 +0000 UTC m=+1087.845122155" observedRunningTime="2025-10-09 08:04:19.951887535 +0000 UTC m=+1090.644691543" watchObservedRunningTime="2025-10-09 08:04:19.96100064 +0000 UTC m=+1090.653804648" Oct 09 08:04:19 crc kubenswrapper[4715]: I1009 08:04:19.964792 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1-logs" (OuterVolumeSpecName: "logs") pod "bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1" (UID: "bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:04:19 crc kubenswrapper[4715]: I1009 08:04:19.969618 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1-scripts" (OuterVolumeSpecName: "scripts") pod "bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1" (UID: "bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:19 crc kubenswrapper[4715]: I1009 08:04:19.970690 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1-kube-api-access-t4222" (OuterVolumeSpecName: "kube-api-access-t4222") pod "bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1" (UID: "bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1"). InnerVolumeSpecName "kube-api-access-t4222". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:04:19 crc kubenswrapper[4715]: I1009 08:04:19.974610 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1" (UID: "bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:19 crc kubenswrapper[4715]: I1009 08:04:19.974628 4715 scope.go:117] "RemoveContainer" containerID="95c60e2321159c965ec846f2cada7cf74116865a2c66290d82da81be8d174949" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.031656 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1" (UID: "bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.062281 4715 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.062328 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4222\" (UniqueName: \"kubernetes.io/projected/bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1-kube-api-access-t4222\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.062343 4715 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.062356 4715 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.062369 4715 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.062381 4715 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1-logs\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.066134 4715 scope.go:117] "RemoveContainer" containerID="94d8dacc662eaa295db01efc390e226d3170428c66e139e3f6d7bce7507f19ba" Oct 09 08:04:20 crc kubenswrapper[4715]: E1009 08:04:20.067184 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"94d8dacc662eaa295db01efc390e226d3170428c66e139e3f6d7bce7507f19ba\": container with ID starting with 94d8dacc662eaa295db01efc390e226d3170428c66e139e3f6d7bce7507f19ba not found: ID does not exist" containerID="94d8dacc662eaa295db01efc390e226d3170428c66e139e3f6d7bce7507f19ba" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.067266 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94d8dacc662eaa295db01efc390e226d3170428c66e139e3f6d7bce7507f19ba"} err="failed to get container status \"94d8dacc662eaa295db01efc390e226d3170428c66e139e3f6d7bce7507f19ba\": rpc error: code = NotFound desc = could not find container \"94d8dacc662eaa295db01efc390e226d3170428c66e139e3f6d7bce7507f19ba\": container with ID starting with 94d8dacc662eaa295db01efc390e226d3170428c66e139e3f6d7bce7507f19ba not found: ID does not exist" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.067297 4715 scope.go:117] "RemoveContainer" containerID="95c60e2321159c965ec846f2cada7cf74116865a2c66290d82da81be8d174949" Oct 09 08:04:20 crc kubenswrapper[4715]: E1009 08:04:20.069021 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95c60e2321159c965ec846f2cada7cf74116865a2c66290d82da81be8d174949\": container with ID starting with 95c60e2321159c965ec846f2cada7cf74116865a2c66290d82da81be8d174949 not found: ID does not exist" containerID="95c60e2321159c965ec846f2cada7cf74116865a2c66290d82da81be8d174949" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.069122 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95c60e2321159c965ec846f2cada7cf74116865a2c66290d82da81be8d174949"} err="failed to get container status \"95c60e2321159c965ec846f2cada7cf74116865a2c66290d82da81be8d174949\": rpc error: code = NotFound desc = could not find container \"95c60e2321159c965ec846f2cada7cf74116865a2c66290d82da81be8d174949\": 
container with ID starting with 95c60e2321159c965ec846f2cada7cf74116865a2c66290d82da81be8d174949 not found: ID does not exist" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.069159 4715 scope.go:117] "RemoveContainer" containerID="94d8dacc662eaa295db01efc390e226d3170428c66e139e3f6d7bce7507f19ba" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.069604 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94d8dacc662eaa295db01efc390e226d3170428c66e139e3f6d7bce7507f19ba"} err="failed to get container status \"94d8dacc662eaa295db01efc390e226d3170428c66e139e3f6d7bce7507f19ba\": rpc error: code = NotFound desc = could not find container \"94d8dacc662eaa295db01efc390e226d3170428c66e139e3f6d7bce7507f19ba\": container with ID starting with 94d8dacc662eaa295db01efc390e226d3170428c66e139e3f6d7bce7507f19ba not found: ID does not exist" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.069654 4715 scope.go:117] "RemoveContainer" containerID="95c60e2321159c965ec846f2cada7cf74116865a2c66290d82da81be8d174949" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.073538 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95c60e2321159c965ec846f2cada7cf74116865a2c66290d82da81be8d174949"} err="failed to get container status \"95c60e2321159c965ec846f2cada7cf74116865a2c66290d82da81be8d174949\": rpc error: code = NotFound desc = could not find container \"95c60e2321159c965ec846f2cada7cf74116865a2c66290d82da81be8d174949\": container with ID starting with 95c60e2321159c965ec846f2cada7cf74116865a2c66290d82da81be8d174949 not found: ID does not exist" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.074624 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1-config-data" (OuterVolumeSpecName: "config-data") pod "bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1" (UID: 
"bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.163790 4715 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.260736 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.262961 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.298746 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 09 08:04:20 crc kubenswrapper[4715]: E1009 08:04:20.303494 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1" containerName="cinder-api-log" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.303523 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1" containerName="cinder-api-log" Oct 09 08:04:20 crc kubenswrapper[4715]: E1009 08:04:20.303554 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1" containerName="cinder-api" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.303560 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1" containerName="cinder-api" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.303791 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1" containerName="cinder-api-log" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.303823 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1" 
containerName="cinder-api" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.304916 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.314930 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.322620 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.323729 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.324122 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.333554 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.477557 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50651334-64c9-4214-9c7b-c10c4152d053-scripts\") pod \"cinder-api-0\" (UID: \"50651334-64c9-4214-9c7b-c10c4152d053\") " pod="openstack/cinder-api-0" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.477632 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/50651334-64c9-4214-9c7b-c10c4152d053-etc-machine-id\") pod \"cinder-api-0\" (UID: \"50651334-64c9-4214-9c7b-c10c4152d053\") " pod="openstack/cinder-api-0" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.477663 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/50651334-64c9-4214-9c7b-c10c4152d053-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"50651334-64c9-4214-9c7b-c10c4152d053\") " pod="openstack/cinder-api-0" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.477714 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/50651334-64c9-4214-9c7b-c10c4152d053-public-tls-certs\") pod \"cinder-api-0\" (UID: \"50651334-64c9-4214-9c7b-c10c4152d053\") " pod="openstack/cinder-api-0" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.477755 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50651334-64c9-4214-9c7b-c10c4152d053-logs\") pod \"cinder-api-0\" (UID: \"50651334-64c9-4214-9c7b-c10c4152d053\") " pod="openstack/cinder-api-0" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.477786 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50651334-64c9-4214-9c7b-c10c4152d053-config-data\") pod \"cinder-api-0\" (UID: \"50651334-64c9-4214-9c7b-c10c4152d053\") " pod="openstack/cinder-api-0" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.477818 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50651334-64c9-4214-9c7b-c10c4152d053-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"50651334-64c9-4214-9c7b-c10c4152d053\") " pod="openstack/cinder-api-0" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.477846 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54f5w\" (UniqueName: \"kubernetes.io/projected/50651334-64c9-4214-9c7b-c10c4152d053-kube-api-access-54f5w\") pod \"cinder-api-0\" (UID: 
\"50651334-64c9-4214-9c7b-c10c4152d053\") " pod="openstack/cinder-api-0" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.481481 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50651334-64c9-4214-9c7b-c10c4152d053-config-data-custom\") pod \"cinder-api-0\" (UID: \"50651334-64c9-4214-9c7b-c10c4152d053\") " pod="openstack/cinder-api-0" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.576527 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d68b9cb4c-7pd5s" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.583229 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50651334-64c9-4214-9c7b-c10c4152d053-logs\") pod \"cinder-api-0\" (UID: \"50651334-64c9-4214-9c7b-c10c4152d053\") " pod="openstack/cinder-api-0" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.583281 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50651334-64c9-4214-9c7b-c10c4152d053-config-data\") pod \"cinder-api-0\" (UID: \"50651334-64c9-4214-9c7b-c10c4152d053\") " pod="openstack/cinder-api-0" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.583312 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50651334-64c9-4214-9c7b-c10c4152d053-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"50651334-64c9-4214-9c7b-c10c4152d053\") " pod="openstack/cinder-api-0" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.583332 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54f5w\" (UniqueName: \"kubernetes.io/projected/50651334-64c9-4214-9c7b-c10c4152d053-kube-api-access-54f5w\") pod \"cinder-api-0\" (UID: 
\"50651334-64c9-4214-9c7b-c10c4152d053\") " pod="openstack/cinder-api-0" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.583357 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50651334-64c9-4214-9c7b-c10c4152d053-config-data-custom\") pod \"cinder-api-0\" (UID: \"50651334-64c9-4214-9c7b-c10c4152d053\") " pod="openstack/cinder-api-0" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.583429 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50651334-64c9-4214-9c7b-c10c4152d053-scripts\") pod \"cinder-api-0\" (UID: \"50651334-64c9-4214-9c7b-c10c4152d053\") " pod="openstack/cinder-api-0" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.583458 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/50651334-64c9-4214-9c7b-c10c4152d053-etc-machine-id\") pod \"cinder-api-0\" (UID: \"50651334-64c9-4214-9c7b-c10c4152d053\") " pod="openstack/cinder-api-0" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.583482 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/50651334-64c9-4214-9c7b-c10c4152d053-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"50651334-64c9-4214-9c7b-c10c4152d053\") " pod="openstack/cinder-api-0" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.583520 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/50651334-64c9-4214-9c7b-c10c4152d053-public-tls-certs\") pod \"cinder-api-0\" (UID: \"50651334-64c9-4214-9c7b-c10c4152d053\") " pod="openstack/cinder-api-0" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.585804 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/50651334-64c9-4214-9c7b-c10c4152d053-etc-machine-id\") pod \"cinder-api-0\" (UID: \"50651334-64c9-4214-9c7b-c10c4152d053\") " pod="openstack/cinder-api-0" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.586054 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50651334-64c9-4214-9c7b-c10c4152d053-logs\") pod \"cinder-api-0\" (UID: \"50651334-64c9-4214-9c7b-c10c4152d053\") " pod="openstack/cinder-api-0" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.590367 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/50651334-64c9-4214-9c7b-c10c4152d053-public-tls-certs\") pod \"cinder-api-0\" (UID: \"50651334-64c9-4214-9c7b-c10c4152d053\") " pod="openstack/cinder-api-0" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.591797 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50651334-64c9-4214-9c7b-c10c4152d053-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"50651334-64c9-4214-9c7b-c10c4152d053\") " pod="openstack/cinder-api-0" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.591856 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/50651334-64c9-4214-9c7b-c10c4152d053-config-data-custom\") pod \"cinder-api-0\" (UID: \"50651334-64c9-4214-9c7b-c10c4152d053\") " pod="openstack/cinder-api-0" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.592488 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50651334-64c9-4214-9c7b-c10c4152d053-scripts\") pod \"cinder-api-0\" (UID: \"50651334-64c9-4214-9c7b-c10c4152d053\") " pod="openstack/cinder-api-0" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.595215 4715 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50651334-64c9-4214-9c7b-c10c4152d053-config-data\") pod \"cinder-api-0\" (UID: \"50651334-64c9-4214-9c7b-c10c4152d053\") " pod="openstack/cinder-api-0" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.598796 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/50651334-64c9-4214-9c7b-c10c4152d053-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"50651334-64c9-4214-9c7b-c10c4152d053\") " pod="openstack/cinder-api-0" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.608743 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54f5w\" (UniqueName: \"kubernetes.io/projected/50651334-64c9-4214-9c7b-c10c4152d053-kube-api-access-54f5w\") pod \"cinder-api-0\" (UID: \"50651334-64c9-4214-9c7b-c10c4152d053\") " pod="openstack/cinder-api-0" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.685035 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33dae8b1-9611-43f9-8dc3-5bccd2694cca-dns-swift-storage-0\") pod \"33dae8b1-9611-43f9-8dc3-5bccd2694cca\" (UID: \"33dae8b1-9611-43f9-8dc3-5bccd2694cca\") " Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.685146 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33dae8b1-9611-43f9-8dc3-5bccd2694cca-config\") pod \"33dae8b1-9611-43f9-8dc3-5bccd2694cca\" (UID: \"33dae8b1-9611-43f9-8dc3-5bccd2694cca\") " Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.685235 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33dae8b1-9611-43f9-8dc3-5bccd2694cca-ovsdbserver-nb\") pod \"33dae8b1-9611-43f9-8dc3-5bccd2694cca\" 
(UID: \"33dae8b1-9611-43f9-8dc3-5bccd2694cca\") " Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.685263 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33dae8b1-9611-43f9-8dc3-5bccd2694cca-ovsdbserver-sb\") pod \"33dae8b1-9611-43f9-8dc3-5bccd2694cca\" (UID: \"33dae8b1-9611-43f9-8dc3-5bccd2694cca\") " Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.685300 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33dae8b1-9611-43f9-8dc3-5bccd2694cca-dns-svc\") pod \"33dae8b1-9611-43f9-8dc3-5bccd2694cca\" (UID: \"33dae8b1-9611-43f9-8dc3-5bccd2694cca\") " Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.685327 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4wwd\" (UniqueName: \"kubernetes.io/projected/33dae8b1-9611-43f9-8dc3-5bccd2694cca-kube-api-access-z4wwd\") pod \"33dae8b1-9611-43f9-8dc3-5bccd2694cca\" (UID: \"33dae8b1-9611-43f9-8dc3-5bccd2694cca\") " Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.705739 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33dae8b1-9611-43f9-8dc3-5bccd2694cca-kube-api-access-z4wwd" (OuterVolumeSpecName: "kube-api-access-z4wwd") pod "33dae8b1-9611-43f9-8dc3-5bccd2694cca" (UID: "33dae8b1-9611-43f9-8dc3-5bccd2694cca"). InnerVolumeSpecName "kube-api-access-z4wwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.777056 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33dae8b1-9611-43f9-8dc3-5bccd2694cca-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "33dae8b1-9611-43f9-8dc3-5bccd2694cca" (UID: "33dae8b1-9611-43f9-8dc3-5bccd2694cca"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.789144 4715 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33dae8b1-9611-43f9-8dc3-5bccd2694cca-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.789170 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4wwd\" (UniqueName: \"kubernetes.io/projected/33dae8b1-9611-43f9-8dc3-5bccd2694cca-kube-api-access-z4wwd\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.799973 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33dae8b1-9611-43f9-8dc3-5bccd2694cca-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "33dae8b1-9611-43f9-8dc3-5bccd2694cca" (UID: "33dae8b1-9611-43f9-8dc3-5bccd2694cca"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.800045 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33dae8b1-9611-43f9-8dc3-5bccd2694cca-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "33dae8b1-9611-43f9-8dc3-5bccd2694cca" (UID: "33dae8b1-9611-43f9-8dc3-5bccd2694cca"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.802198 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33dae8b1-9611-43f9-8dc3-5bccd2694cca-config" (OuterVolumeSpecName: "config") pod "33dae8b1-9611-43f9-8dc3-5bccd2694cca" (UID: "33dae8b1-9611-43f9-8dc3-5bccd2694cca"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.813195 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33dae8b1-9611-43f9-8dc3-5bccd2694cca-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "33dae8b1-9611-43f9-8dc3-5bccd2694cca" (UID: "33dae8b1-9611-43f9-8dc3-5bccd2694cca"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.866301 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.890901 4715 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33dae8b1-9611-43f9-8dc3-5bccd2694cca-config\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.890933 4715 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33dae8b1-9611-43f9-8dc3-5bccd2694cca-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.890944 4715 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33dae8b1-9611-43f9-8dc3-5bccd2694cca-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.890953 4715 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33dae8b1-9611-43f9-8dc3-5bccd2694cca-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.965972 4715 generic.go:334] "Generic (PLEG): container finished" podID="33dae8b1-9611-43f9-8dc3-5bccd2694cca" containerID="684413e3233be95c46d8416055ffa05873471fa6d52480b9fc4e0d43c852c283" exitCode=0 Oct 09 08:04:20 crc 
kubenswrapper[4715]: I1009 08:04:20.966048 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d68b9cb4c-7pd5s" event={"ID":"33dae8b1-9611-43f9-8dc3-5bccd2694cca","Type":"ContainerDied","Data":"684413e3233be95c46d8416055ffa05873471fa6d52480b9fc4e0d43c852c283"} Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.966084 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d68b9cb4c-7pd5s" event={"ID":"33dae8b1-9611-43f9-8dc3-5bccd2694cca","Type":"ContainerDied","Data":"8a3e0d7f43cf00898b5adf8efbd098e2769b0962688b84b9e6854d10c2a925db"} Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.966105 4715 scope.go:117] "RemoveContainer" containerID="684413e3233be95c46d8416055ffa05873471fa6d52480b9fc4e0d43c852c283" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.966267 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d68b9cb4c-7pd5s" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.977174 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-5kjqv" event={"ID":"37af9a61-ef4d-476b-978d-ca780888d042","Type":"ContainerStarted","Data":"edc3751b9ca29e525c3cfbf9387b6fcbcca2c661e93118c7132634c8144fb48c"} Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.977305 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5784cf869f-5kjqv" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.986363 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59756554bd-9q7xp" event={"ID":"5823462b-a07e-4525-9be8-370dce870498","Type":"ContainerStarted","Data":"ac6032261715516b1ba1cfe9851948275ebc4124882ad0151121b557b399837a"} Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.986435 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-59756554bd-9q7xp" Oct 09 08:04:20 crc kubenswrapper[4715]: 
I1009 08:04:20.986450 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-59756554bd-9q7xp" Oct 09 08:04:20 crc kubenswrapper[4715]: I1009 08:04:20.998188 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5784cf869f-5kjqv" podStartSLOduration=2.998168589 podStartE2EDuration="2.998168589s" podCreationTimestamp="2025-10-09 08:04:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:04:20.997415937 +0000 UTC m=+1091.690219945" watchObservedRunningTime="2025-10-09 08:04:20.998168589 +0000 UTC m=+1091.690972597" Oct 09 08:04:21 crc kubenswrapper[4715]: I1009 08:04:21.023841 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-59756554bd-9q7xp" podStartSLOduration=3.023815917 podStartE2EDuration="3.023815917s" podCreationTimestamp="2025-10-09 08:04:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:04:21.020052937 +0000 UTC m=+1091.712856945" watchObservedRunningTime="2025-10-09 08:04:21.023815917 +0000 UTC m=+1091.716619925" Oct 09 08:04:21 crc kubenswrapper[4715]: I1009 08:04:21.062222 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d68b9cb4c-7pd5s"] Oct 09 08:04:21 crc kubenswrapper[4715]: I1009 08:04:21.077737 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d68b9cb4c-7pd5s"] Oct 09 08:04:21 crc kubenswrapper[4715]: I1009 08:04:21.183166 4715 scope.go:117] "RemoveContainer" containerID="446de50489e25be996e97280afffa0f71d04e7c14c0e8f53f3e27441a2a21134" Oct 09 08:04:21 crc kubenswrapper[4715]: I1009 08:04:21.487381 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-67f46ddd46-qcgrk"] Oct 09 08:04:21 crc kubenswrapper[4715]: E1009 
08:04:21.494751 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33dae8b1-9611-43f9-8dc3-5bccd2694cca" containerName="init" Oct 09 08:04:21 crc kubenswrapper[4715]: I1009 08:04:21.494784 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="33dae8b1-9611-43f9-8dc3-5bccd2694cca" containerName="init" Oct 09 08:04:21 crc kubenswrapper[4715]: E1009 08:04:21.494812 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33dae8b1-9611-43f9-8dc3-5bccd2694cca" containerName="dnsmasq-dns" Oct 09 08:04:21 crc kubenswrapper[4715]: I1009 08:04:21.494818 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="33dae8b1-9611-43f9-8dc3-5bccd2694cca" containerName="dnsmasq-dns" Oct 09 08:04:21 crc kubenswrapper[4715]: I1009 08:04:21.494999 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="33dae8b1-9611-43f9-8dc3-5bccd2694cca" containerName="dnsmasq-dns" Oct 09 08:04:21 crc kubenswrapper[4715]: I1009 08:04:21.498187 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-67f46ddd46-qcgrk" Oct 09 08:04:21 crc kubenswrapper[4715]: I1009 08:04:21.500547 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 09 08:04:21 crc kubenswrapper[4715]: I1009 08:04:21.500828 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 09 08:04:21 crc kubenswrapper[4715]: I1009 08:04:21.502032 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-67f46ddd46-qcgrk"] Oct 09 08:04:21 crc kubenswrapper[4715]: I1009 08:04:21.606754 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb469864-9053-4551-bca7-f3b67a20bf52-public-tls-certs\") pod \"barbican-api-67f46ddd46-qcgrk\" (UID: \"cb469864-9053-4551-bca7-f3b67a20bf52\") " pod="openstack/barbican-api-67f46ddd46-qcgrk" Oct 09 08:04:21 crc kubenswrapper[4715]: I1009 08:04:21.606816 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb469864-9053-4551-bca7-f3b67a20bf52-logs\") pod \"barbican-api-67f46ddd46-qcgrk\" (UID: \"cb469864-9053-4551-bca7-f3b67a20bf52\") " pod="openstack/barbican-api-67f46ddd46-qcgrk" Oct 09 08:04:21 crc kubenswrapper[4715]: I1009 08:04:21.606860 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb469864-9053-4551-bca7-f3b67a20bf52-config-data-custom\") pod \"barbican-api-67f46ddd46-qcgrk\" (UID: \"cb469864-9053-4551-bca7-f3b67a20bf52\") " pod="openstack/barbican-api-67f46ddd46-qcgrk" Oct 09 08:04:21 crc kubenswrapper[4715]: I1009 08:04:21.606947 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cb469864-9053-4551-bca7-f3b67a20bf52-config-data\") pod \"barbican-api-67f46ddd46-qcgrk\" (UID: \"cb469864-9053-4551-bca7-f3b67a20bf52\") " pod="openstack/barbican-api-67f46ddd46-qcgrk" Oct 09 08:04:21 crc kubenswrapper[4715]: I1009 08:04:21.606963 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb469864-9053-4551-bca7-f3b67a20bf52-internal-tls-certs\") pod \"barbican-api-67f46ddd46-qcgrk\" (UID: \"cb469864-9053-4551-bca7-f3b67a20bf52\") " pod="openstack/barbican-api-67f46ddd46-qcgrk" Oct 09 08:04:21 crc kubenswrapper[4715]: I1009 08:04:21.607016 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-589r7\" (UniqueName: \"kubernetes.io/projected/cb469864-9053-4551-bca7-f3b67a20bf52-kube-api-access-589r7\") pod \"barbican-api-67f46ddd46-qcgrk\" (UID: \"cb469864-9053-4551-bca7-f3b67a20bf52\") " pod="openstack/barbican-api-67f46ddd46-qcgrk" Oct 09 08:04:21 crc kubenswrapper[4715]: I1009 08:04:21.607042 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb469864-9053-4551-bca7-f3b67a20bf52-combined-ca-bundle\") pod \"barbican-api-67f46ddd46-qcgrk\" (UID: \"cb469864-9053-4551-bca7-f3b67a20bf52\") " pod="openstack/barbican-api-67f46ddd46-qcgrk" Oct 09 08:04:21 crc kubenswrapper[4715]: I1009 08:04:21.610097 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 09 08:04:21 crc kubenswrapper[4715]: I1009 08:04:21.708614 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-589r7\" (UniqueName: \"kubernetes.io/projected/cb469864-9053-4551-bca7-f3b67a20bf52-kube-api-access-589r7\") pod \"barbican-api-67f46ddd46-qcgrk\" (UID: \"cb469864-9053-4551-bca7-f3b67a20bf52\") " 
pod="openstack/barbican-api-67f46ddd46-qcgrk" Oct 09 08:04:21 crc kubenswrapper[4715]: I1009 08:04:21.708695 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb469864-9053-4551-bca7-f3b67a20bf52-combined-ca-bundle\") pod \"barbican-api-67f46ddd46-qcgrk\" (UID: \"cb469864-9053-4551-bca7-f3b67a20bf52\") " pod="openstack/barbican-api-67f46ddd46-qcgrk" Oct 09 08:04:21 crc kubenswrapper[4715]: I1009 08:04:21.708782 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb469864-9053-4551-bca7-f3b67a20bf52-public-tls-certs\") pod \"barbican-api-67f46ddd46-qcgrk\" (UID: \"cb469864-9053-4551-bca7-f3b67a20bf52\") " pod="openstack/barbican-api-67f46ddd46-qcgrk" Oct 09 08:04:21 crc kubenswrapper[4715]: I1009 08:04:21.708830 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb469864-9053-4551-bca7-f3b67a20bf52-logs\") pod \"barbican-api-67f46ddd46-qcgrk\" (UID: \"cb469864-9053-4551-bca7-f3b67a20bf52\") " pod="openstack/barbican-api-67f46ddd46-qcgrk" Oct 09 08:04:21 crc kubenswrapper[4715]: I1009 08:04:21.708891 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb469864-9053-4551-bca7-f3b67a20bf52-config-data-custom\") pod \"barbican-api-67f46ddd46-qcgrk\" (UID: \"cb469864-9053-4551-bca7-f3b67a20bf52\") " pod="openstack/barbican-api-67f46ddd46-qcgrk" Oct 09 08:04:21 crc kubenswrapper[4715]: I1009 08:04:21.708963 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb469864-9053-4551-bca7-f3b67a20bf52-config-data\") pod \"barbican-api-67f46ddd46-qcgrk\" (UID: \"cb469864-9053-4551-bca7-f3b67a20bf52\") " pod="openstack/barbican-api-67f46ddd46-qcgrk" Oct 09 08:04:21 
crc kubenswrapper[4715]: I1009 08:04:21.708986 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb469864-9053-4551-bca7-f3b67a20bf52-internal-tls-certs\") pod \"barbican-api-67f46ddd46-qcgrk\" (UID: \"cb469864-9053-4551-bca7-f3b67a20bf52\") " pod="openstack/barbican-api-67f46ddd46-qcgrk" Oct 09 08:04:21 crc kubenswrapper[4715]: I1009 08:04:21.714059 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb469864-9053-4551-bca7-f3b67a20bf52-logs\") pod \"barbican-api-67f46ddd46-qcgrk\" (UID: \"cb469864-9053-4551-bca7-f3b67a20bf52\") " pod="openstack/barbican-api-67f46ddd46-qcgrk" Oct 09 08:04:21 crc kubenswrapper[4715]: I1009 08:04:21.718172 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb469864-9053-4551-bca7-f3b67a20bf52-internal-tls-certs\") pod \"barbican-api-67f46ddd46-qcgrk\" (UID: \"cb469864-9053-4551-bca7-f3b67a20bf52\") " pod="openstack/barbican-api-67f46ddd46-qcgrk" Oct 09 08:04:21 crc kubenswrapper[4715]: I1009 08:04:21.718326 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb469864-9053-4551-bca7-f3b67a20bf52-public-tls-certs\") pod \"barbican-api-67f46ddd46-qcgrk\" (UID: \"cb469864-9053-4551-bca7-f3b67a20bf52\") " pod="openstack/barbican-api-67f46ddd46-qcgrk" Oct 09 08:04:21 crc kubenswrapper[4715]: I1009 08:04:21.719412 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb469864-9053-4551-bca7-f3b67a20bf52-combined-ca-bundle\") pod \"barbican-api-67f46ddd46-qcgrk\" (UID: \"cb469864-9053-4551-bca7-f3b67a20bf52\") " pod="openstack/barbican-api-67f46ddd46-qcgrk" Oct 09 08:04:21 crc kubenswrapper[4715]: I1009 08:04:21.720365 4715 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb469864-9053-4551-bca7-f3b67a20bf52-config-data\") pod \"barbican-api-67f46ddd46-qcgrk\" (UID: \"cb469864-9053-4551-bca7-f3b67a20bf52\") " pod="openstack/barbican-api-67f46ddd46-qcgrk" Oct 09 08:04:21 crc kubenswrapper[4715]: I1009 08:04:21.725664 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb469864-9053-4551-bca7-f3b67a20bf52-config-data-custom\") pod \"barbican-api-67f46ddd46-qcgrk\" (UID: \"cb469864-9053-4551-bca7-f3b67a20bf52\") " pod="openstack/barbican-api-67f46ddd46-qcgrk" Oct 09 08:04:21 crc kubenswrapper[4715]: I1009 08:04:21.728682 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-589r7\" (UniqueName: \"kubernetes.io/projected/cb469864-9053-4551-bca7-f3b67a20bf52-kube-api-access-589r7\") pod \"barbican-api-67f46ddd46-qcgrk\" (UID: \"cb469864-9053-4551-bca7-f3b67a20bf52\") " pod="openstack/barbican-api-67f46ddd46-qcgrk" Oct 09 08:04:21 crc kubenswrapper[4715]: I1009 08:04:21.832941 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-67f46ddd46-qcgrk" Oct 09 08:04:22 crc kubenswrapper[4715]: I1009 08:04:22.148160 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33dae8b1-9611-43f9-8dc3-5bccd2694cca" path="/var/lib/kubelet/pods/33dae8b1-9611-43f9-8dc3-5bccd2694cca/volumes" Oct 09 08:04:22 crc kubenswrapper[4715]: I1009 08:04:22.148927 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1" path="/var/lib/kubelet/pods/bf3a2f86-c9dd-4b22-96f0-d9ffb49c80d1/volumes" Oct 09 08:04:22 crc kubenswrapper[4715]: I1009 08:04:22.495694 4715 scope.go:117] "RemoveContainer" containerID="684413e3233be95c46d8416055ffa05873471fa6d52480b9fc4e0d43c852c283" Oct 09 08:04:22 crc kubenswrapper[4715]: E1009 08:04:22.496214 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"684413e3233be95c46d8416055ffa05873471fa6d52480b9fc4e0d43c852c283\": container with ID starting with 684413e3233be95c46d8416055ffa05873471fa6d52480b9fc4e0d43c852c283 not found: ID does not exist" containerID="684413e3233be95c46d8416055ffa05873471fa6d52480b9fc4e0d43c852c283" Oct 09 08:04:22 crc kubenswrapper[4715]: I1009 08:04:22.496247 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"684413e3233be95c46d8416055ffa05873471fa6d52480b9fc4e0d43c852c283"} err="failed to get container status \"684413e3233be95c46d8416055ffa05873471fa6d52480b9fc4e0d43c852c283\": rpc error: code = NotFound desc = could not find container \"684413e3233be95c46d8416055ffa05873471fa6d52480b9fc4e0d43c852c283\": container with ID starting with 684413e3233be95c46d8416055ffa05873471fa6d52480b9fc4e0d43c852c283 not found: ID does not exist" Oct 09 08:04:22 crc kubenswrapper[4715]: I1009 08:04:22.496268 4715 scope.go:117] "RemoveContainer" containerID="446de50489e25be996e97280afffa0f71d04e7c14c0e8f53f3e27441a2a21134" Oct 09 
08:04:22 crc kubenswrapper[4715]: E1009 08:04:22.496653 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"446de50489e25be996e97280afffa0f71d04e7c14c0e8f53f3e27441a2a21134\": container with ID starting with 446de50489e25be996e97280afffa0f71d04e7c14c0e8f53f3e27441a2a21134 not found: ID does not exist" containerID="446de50489e25be996e97280afffa0f71d04e7c14c0e8f53f3e27441a2a21134" Oct 09 08:04:22 crc kubenswrapper[4715]: I1009 08:04:22.496677 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"446de50489e25be996e97280afffa0f71d04e7c14c0e8f53f3e27441a2a21134"} err="failed to get container status \"446de50489e25be996e97280afffa0f71d04e7c14c0e8f53f3e27441a2a21134\": rpc error: code = NotFound desc = could not find container \"446de50489e25be996e97280afffa0f71d04e7c14c0e8f53f3e27441a2a21134\": container with ID starting with 446de50489e25be996e97280afffa0f71d04e7c14c0e8f53f3e27441a2a21134 not found: ID does not exist" Oct 09 08:04:23 crc kubenswrapper[4715]: I1009 08:04:23.007311 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"50651334-64c9-4214-9c7b-c10c4152d053","Type":"ContainerStarted","Data":"b2e8af33e67c106db42706d825b517be9ee34453ad6cb0c3f073041c71edd1d8"} Oct 09 08:04:23 crc kubenswrapper[4715]: I1009 08:04:23.009130 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-58cfd869c4-4djjf" event={"ID":"50cf187d-781d-49b7-840b-8dfc3366135f","Type":"ContainerStarted","Data":"b34585ba017489a5a7d54aa97d9e4514b5f53c93f842e4955c173103ddf8f08c"} Oct 09 08:04:23 crc kubenswrapper[4715]: I1009 08:04:23.010890 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6ff4c65c6c-97dbp" event={"ID":"6d84057b-c735-4df6-a20d-ef88cccb44fe","Type":"ContainerStarted","Data":"5c7bba205f6a870b718c83f66816863b2490be9665c3c58d37b5a9aad53ee778"} 
Oct 09 08:04:23 crc kubenswrapper[4715]: I1009 08:04:23.094141 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-67f46ddd46-qcgrk"] Oct 09 08:04:23 crc kubenswrapper[4715]: I1009 08:04:23.239537 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-9d9d4b647-cfdjf"] Oct 09 08:04:23 crc kubenswrapper[4715]: I1009 08:04:23.241036 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-9d9d4b647-cfdjf" Oct 09 08:04:23 crc kubenswrapper[4715]: I1009 08:04:23.245804 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 09 08:04:23 crc kubenswrapper[4715]: I1009 08:04:23.245927 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 09 08:04:23 crc kubenswrapper[4715]: I1009 08:04:23.246199 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 09 08:04:23 crc kubenswrapper[4715]: I1009 08:04:23.254925 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-9d9d4b647-cfdjf"] Oct 09 08:04:23 crc kubenswrapper[4715]: I1009 08:04:23.339063 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f4b9cb5-f128-44e3-9142-2d39d79cb0b8-internal-tls-certs\") pod \"swift-proxy-9d9d4b647-cfdjf\" (UID: \"6f4b9cb5-f128-44e3-9142-2d39d79cb0b8\") " pod="openstack/swift-proxy-9d9d4b647-cfdjf" Oct 09 08:04:23 crc kubenswrapper[4715]: I1009 08:04:23.339129 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f4b9cb5-f128-44e3-9142-2d39d79cb0b8-log-httpd\") pod \"swift-proxy-9d9d4b647-cfdjf\" (UID: \"6f4b9cb5-f128-44e3-9142-2d39d79cb0b8\") " pod="openstack/swift-proxy-9d9d4b647-cfdjf" Oct 09 08:04:23 crc 
kubenswrapper[4715]: I1009 08:04:23.339225 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f4b9cb5-f128-44e3-9142-2d39d79cb0b8-run-httpd\") pod \"swift-proxy-9d9d4b647-cfdjf\" (UID: \"6f4b9cb5-f128-44e3-9142-2d39d79cb0b8\") " pod="openstack/swift-proxy-9d9d4b647-cfdjf" Oct 09 08:04:23 crc kubenswrapper[4715]: I1009 08:04:23.339253 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcqlb\" (UniqueName: \"kubernetes.io/projected/6f4b9cb5-f128-44e3-9142-2d39d79cb0b8-kube-api-access-fcqlb\") pod \"swift-proxy-9d9d4b647-cfdjf\" (UID: \"6f4b9cb5-f128-44e3-9142-2d39d79cb0b8\") " pod="openstack/swift-proxy-9d9d4b647-cfdjf" Oct 09 08:04:23 crc kubenswrapper[4715]: I1009 08:04:23.339292 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f4b9cb5-f128-44e3-9142-2d39d79cb0b8-combined-ca-bundle\") pod \"swift-proxy-9d9d4b647-cfdjf\" (UID: \"6f4b9cb5-f128-44e3-9142-2d39d79cb0b8\") " pod="openstack/swift-proxy-9d9d4b647-cfdjf" Oct 09 08:04:23 crc kubenswrapper[4715]: I1009 08:04:23.339316 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f4b9cb5-f128-44e3-9142-2d39d79cb0b8-public-tls-certs\") pod \"swift-proxy-9d9d4b647-cfdjf\" (UID: \"6f4b9cb5-f128-44e3-9142-2d39d79cb0b8\") " pod="openstack/swift-proxy-9d9d4b647-cfdjf" Oct 09 08:04:23 crc kubenswrapper[4715]: I1009 08:04:23.339357 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f4b9cb5-f128-44e3-9142-2d39d79cb0b8-config-data\") pod \"swift-proxy-9d9d4b647-cfdjf\" (UID: \"6f4b9cb5-f128-44e3-9142-2d39d79cb0b8\") " 
pod="openstack/swift-proxy-9d9d4b647-cfdjf" Oct 09 08:04:23 crc kubenswrapper[4715]: I1009 08:04:23.339406 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6f4b9cb5-f128-44e3-9142-2d39d79cb0b8-etc-swift\") pod \"swift-proxy-9d9d4b647-cfdjf\" (UID: \"6f4b9cb5-f128-44e3-9142-2d39d79cb0b8\") " pod="openstack/swift-proxy-9d9d4b647-cfdjf" Oct 09 08:04:23 crc kubenswrapper[4715]: I1009 08:04:23.440951 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f4b9cb5-f128-44e3-9142-2d39d79cb0b8-config-data\") pod \"swift-proxy-9d9d4b647-cfdjf\" (UID: \"6f4b9cb5-f128-44e3-9142-2d39d79cb0b8\") " pod="openstack/swift-proxy-9d9d4b647-cfdjf" Oct 09 08:04:23 crc kubenswrapper[4715]: I1009 08:04:23.441023 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6f4b9cb5-f128-44e3-9142-2d39d79cb0b8-etc-swift\") pod \"swift-proxy-9d9d4b647-cfdjf\" (UID: \"6f4b9cb5-f128-44e3-9142-2d39d79cb0b8\") " pod="openstack/swift-proxy-9d9d4b647-cfdjf" Oct 09 08:04:23 crc kubenswrapper[4715]: I1009 08:04:23.441611 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f4b9cb5-f128-44e3-9142-2d39d79cb0b8-internal-tls-certs\") pod \"swift-proxy-9d9d4b647-cfdjf\" (UID: \"6f4b9cb5-f128-44e3-9142-2d39d79cb0b8\") " pod="openstack/swift-proxy-9d9d4b647-cfdjf" Oct 09 08:04:23 crc kubenswrapper[4715]: I1009 08:04:23.441652 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f4b9cb5-f128-44e3-9142-2d39d79cb0b8-log-httpd\") pod \"swift-proxy-9d9d4b647-cfdjf\" (UID: \"6f4b9cb5-f128-44e3-9142-2d39d79cb0b8\") " pod="openstack/swift-proxy-9d9d4b647-cfdjf" Oct 09 08:04:23 crc 
kubenswrapper[4715]: I1009 08:04:23.441741 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f4b9cb5-f128-44e3-9142-2d39d79cb0b8-run-httpd\") pod \"swift-proxy-9d9d4b647-cfdjf\" (UID: \"6f4b9cb5-f128-44e3-9142-2d39d79cb0b8\") " pod="openstack/swift-proxy-9d9d4b647-cfdjf" Oct 09 08:04:23 crc kubenswrapper[4715]: I1009 08:04:23.441784 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcqlb\" (UniqueName: \"kubernetes.io/projected/6f4b9cb5-f128-44e3-9142-2d39d79cb0b8-kube-api-access-fcqlb\") pod \"swift-proxy-9d9d4b647-cfdjf\" (UID: \"6f4b9cb5-f128-44e3-9142-2d39d79cb0b8\") " pod="openstack/swift-proxy-9d9d4b647-cfdjf" Oct 09 08:04:23 crc kubenswrapper[4715]: I1009 08:04:23.441851 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f4b9cb5-f128-44e3-9142-2d39d79cb0b8-combined-ca-bundle\") pod \"swift-proxy-9d9d4b647-cfdjf\" (UID: \"6f4b9cb5-f128-44e3-9142-2d39d79cb0b8\") " pod="openstack/swift-proxy-9d9d4b647-cfdjf" Oct 09 08:04:23 crc kubenswrapper[4715]: I1009 08:04:23.441871 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f4b9cb5-f128-44e3-9142-2d39d79cb0b8-public-tls-certs\") pod \"swift-proxy-9d9d4b647-cfdjf\" (UID: \"6f4b9cb5-f128-44e3-9142-2d39d79cb0b8\") " pod="openstack/swift-proxy-9d9d4b647-cfdjf" Oct 09 08:04:23 crc kubenswrapper[4715]: I1009 08:04:23.442243 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f4b9cb5-f128-44e3-9142-2d39d79cb0b8-log-httpd\") pod \"swift-proxy-9d9d4b647-cfdjf\" (UID: \"6f4b9cb5-f128-44e3-9142-2d39d79cb0b8\") " pod="openstack/swift-proxy-9d9d4b647-cfdjf" Oct 09 08:04:23 crc kubenswrapper[4715]: I1009 08:04:23.443207 4715 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f4b9cb5-f128-44e3-9142-2d39d79cb0b8-run-httpd\") pod \"swift-proxy-9d9d4b647-cfdjf\" (UID: \"6f4b9cb5-f128-44e3-9142-2d39d79cb0b8\") " pod="openstack/swift-proxy-9d9d4b647-cfdjf" Oct 09 08:04:23 crc kubenswrapper[4715]: I1009 08:04:23.448044 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f4b9cb5-f128-44e3-9142-2d39d79cb0b8-public-tls-certs\") pod \"swift-proxy-9d9d4b647-cfdjf\" (UID: \"6f4b9cb5-f128-44e3-9142-2d39d79cb0b8\") " pod="openstack/swift-proxy-9d9d4b647-cfdjf" Oct 09 08:04:23 crc kubenswrapper[4715]: I1009 08:04:23.448322 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6f4b9cb5-f128-44e3-9142-2d39d79cb0b8-etc-swift\") pod \"swift-proxy-9d9d4b647-cfdjf\" (UID: \"6f4b9cb5-f128-44e3-9142-2d39d79cb0b8\") " pod="openstack/swift-proxy-9d9d4b647-cfdjf" Oct 09 08:04:23 crc kubenswrapper[4715]: I1009 08:04:23.449288 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f4b9cb5-f128-44e3-9142-2d39d79cb0b8-combined-ca-bundle\") pod \"swift-proxy-9d9d4b647-cfdjf\" (UID: \"6f4b9cb5-f128-44e3-9142-2d39d79cb0b8\") " pod="openstack/swift-proxy-9d9d4b647-cfdjf" Oct 09 08:04:23 crc kubenswrapper[4715]: I1009 08:04:23.452235 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f4b9cb5-f128-44e3-9142-2d39d79cb0b8-config-data\") pod \"swift-proxy-9d9d4b647-cfdjf\" (UID: \"6f4b9cb5-f128-44e3-9142-2d39d79cb0b8\") " pod="openstack/swift-proxy-9d9d4b647-cfdjf" Oct 09 08:04:23 crc kubenswrapper[4715]: I1009 08:04:23.456651 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6f4b9cb5-f128-44e3-9142-2d39d79cb0b8-internal-tls-certs\") pod \"swift-proxy-9d9d4b647-cfdjf\" (UID: \"6f4b9cb5-f128-44e3-9142-2d39d79cb0b8\") " pod="openstack/swift-proxy-9d9d4b647-cfdjf" Oct 09 08:04:23 crc kubenswrapper[4715]: I1009 08:04:23.463469 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcqlb\" (UniqueName: \"kubernetes.io/projected/6f4b9cb5-f128-44e3-9142-2d39d79cb0b8-kube-api-access-fcqlb\") pod \"swift-proxy-9d9d4b647-cfdjf\" (UID: \"6f4b9cb5-f128-44e3-9142-2d39d79cb0b8\") " pod="openstack/swift-proxy-9d9d4b647-cfdjf" Oct 09 08:04:23 crc kubenswrapper[4715]: I1009 08:04:23.587006 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-9d9d4b647-cfdjf" Oct 09 08:04:24 crc kubenswrapper[4715]: I1009 08:04:24.030596 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"50651334-64c9-4214-9c7b-c10c4152d053","Type":"ContainerStarted","Data":"45f3eaaa67fd9c4968b7564f4abd3f71a6cfed945caaa4b090127be476d79c06"} Oct 09 08:04:24 crc kubenswrapper[4715]: I1009 08:04:24.042324 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-58cfd869c4-4djjf" event={"ID":"50cf187d-781d-49b7-840b-8dfc3366135f","Type":"ContainerStarted","Data":"44cef0183cba929c55c2548bd53a31da605feffb03ca8e2802500ab590b561a2"} Oct 09 08:04:24 crc kubenswrapper[4715]: I1009 08:04:24.051123 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6ff4c65c6c-97dbp" event={"ID":"6d84057b-c735-4df6-a20d-ef88cccb44fe","Type":"ContainerStarted","Data":"8f546005c5f2b390715eff6b91517a16fdcd76c84bd30b3642d63f6e82f3dbfe"} Oct 09 08:04:24 crc kubenswrapper[4715]: I1009 08:04:24.075696 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-58cfd869c4-4djjf" podStartSLOduration=2.6055315869999998 
podStartE2EDuration="6.075677181s" podCreationTimestamp="2025-10-09 08:04:18 +0000 UTC" firstStartedPulling="2025-10-09 08:04:19.124806563 +0000 UTC m=+1089.817610571" lastFinishedPulling="2025-10-09 08:04:22.594952157 +0000 UTC m=+1093.287756165" observedRunningTime="2025-10-09 08:04:24.069310525 +0000 UTC m=+1094.762114533" watchObservedRunningTime="2025-10-09 08:04:24.075677181 +0000 UTC m=+1094.768481189" Oct 09 08:04:24 crc kubenswrapper[4715]: I1009 08:04:24.086063 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-67f46ddd46-qcgrk" event={"ID":"cb469864-9053-4551-bca7-f3b67a20bf52","Type":"ContainerStarted","Data":"a33f66fca4d166e7882c8b9e547c506eae4dcc7badad3f3aad4063788fa326c1"} Oct 09 08:04:24 crc kubenswrapper[4715]: I1009 08:04:24.086123 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-67f46ddd46-qcgrk" event={"ID":"cb469864-9053-4551-bca7-f3b67a20bf52","Type":"ContainerStarted","Data":"d7a02ecf05bc6cb3185f031814c08053712c0b318ec102e5ae6e47dffd159cad"} Oct 09 08:04:24 crc kubenswrapper[4715]: I1009 08:04:24.086139 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-67f46ddd46-qcgrk" event={"ID":"cb469864-9053-4551-bca7-f3b67a20bf52","Type":"ContainerStarted","Data":"9c6dbb670e6abc36879135bd1a376246cd7025f5c7eba9da91b28e271b17157f"} Oct 09 08:04:24 crc kubenswrapper[4715]: I1009 08:04:24.087104 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-67f46ddd46-qcgrk" Oct 09 08:04:24 crc kubenswrapper[4715]: I1009 08:04:24.087146 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-67f46ddd46-qcgrk" Oct 09 08:04:24 crc kubenswrapper[4715]: I1009 08:04:24.131005 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6ff4c65c6c-97dbp" podStartSLOduration=2.670161892 podStartE2EDuration="6.130978684s" 
podCreationTimestamp="2025-10-09 08:04:18 +0000 UTC" firstStartedPulling="2025-10-09 08:04:19.133160897 +0000 UTC m=+1089.825964905" lastFinishedPulling="2025-10-09 08:04:22.593977689 +0000 UTC m=+1093.286781697" observedRunningTime="2025-10-09 08:04:24.115835212 +0000 UTC m=+1094.808639220" watchObservedRunningTime="2025-10-09 08:04:24.130978684 +0000 UTC m=+1094.823782692" Oct 09 08:04:24 crc kubenswrapper[4715]: I1009 08:04:24.190951 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-67f46ddd46-qcgrk" podStartSLOduration=3.190928862 podStartE2EDuration="3.190928862s" podCreationTimestamp="2025-10-09 08:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:04:24.157211439 +0000 UTC m=+1094.850015447" watchObservedRunningTime="2025-10-09 08:04:24.190928862 +0000 UTC m=+1094.883732870" Oct 09 08:04:24 crc kubenswrapper[4715]: I1009 08:04:24.270163 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-9d9d4b647-cfdjf"] Oct 09 08:04:24 crc kubenswrapper[4715]: W1009 08:04:24.276756 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f4b9cb5_f128_44e3_9142_2d39d79cb0b8.slice/crio-39a57abfdbe61e9c4b3aaab231ac1b97e9ca981ecd24a8a533e81bd430f1e799 WatchSource:0}: Error finding container 39a57abfdbe61e9c4b3aaab231ac1b97e9ca981ecd24a8a533e81bd430f1e799: Status 404 returned error can't find the container with id 39a57abfdbe61e9c4b3aaab231ac1b97e9ca981ecd24a8a533e81bd430f1e799 Oct 09 08:04:24 crc kubenswrapper[4715]: I1009 08:04:24.313931 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:04:24 crc kubenswrapper[4715]: I1009 08:04:24.314185 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="f9c0be8b-0711-488f-9bb1-edee8e92e527" containerName="ceilometer-central-agent" containerID="cri-o://06e3a016ccf1feffe4e8bbc3185ff5bb6f65ac453d4e58b04b827106491f1919" gracePeriod=30 Oct 09 08:04:24 crc kubenswrapper[4715]: I1009 08:04:24.314347 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f9c0be8b-0711-488f-9bb1-edee8e92e527" containerName="proxy-httpd" containerID="cri-o://e0875b29b7c74d81e763ff018a1895e8fd2652900b8e9a903413e7c40d8540b5" gracePeriod=30 Oct 09 08:04:24 crc kubenswrapper[4715]: I1009 08:04:24.314391 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f9c0be8b-0711-488f-9bb1-edee8e92e527" containerName="sg-core" containerID="cri-o://a412f5011889d1602ee9c6e8425ae5cfc6ec79df9d2ceeb5e45194df1b5ca559" gracePeriod=30 Oct 09 08:04:24 crc kubenswrapper[4715]: I1009 08:04:24.314467 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f9c0be8b-0711-488f-9bb1-edee8e92e527" containerName="ceilometer-notification-agent" containerID="cri-o://179f31ef5d83b939a24a6248b3f608ec2de7bb69229edbfd596d0fe2ec54c2e1" gracePeriod=30 Oct 09 08:04:24 crc kubenswrapper[4715]: I1009 08:04:24.329488 4715 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="f9c0be8b-0711-488f-9bb1-edee8e92e527" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Oct 09 08:04:25 crc kubenswrapper[4715]: I1009 08:04:25.104283 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"50651334-64c9-4214-9c7b-c10c4152d053","Type":"ContainerStarted","Data":"bda286cbf7f04363c945d00900fb951c5d6ce2fc0a8053b4bde210c9541a7a25"} Oct 09 08:04:25 crc kubenswrapper[4715]: I1009 08:04:25.104850 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 09 
08:04:25 crc kubenswrapper[4715]: I1009 08:04:25.108329 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-9d9d4b647-cfdjf" event={"ID":"6f4b9cb5-f128-44e3-9142-2d39d79cb0b8","Type":"ContainerStarted","Data":"d92792158e1168514479098189ed3981c1af5e023948d73904463dde91e2db53"} Oct 09 08:04:25 crc kubenswrapper[4715]: I1009 08:04:25.108366 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-9d9d4b647-cfdjf" event={"ID":"6f4b9cb5-f128-44e3-9142-2d39d79cb0b8","Type":"ContainerStarted","Data":"9bc295a10500adb3bcc005829fbba72cc4688ad8e744eb295ae83a4292c4320f"} Oct 09 08:04:25 crc kubenswrapper[4715]: I1009 08:04:25.108378 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-9d9d4b647-cfdjf" event={"ID":"6f4b9cb5-f128-44e3-9142-2d39d79cb0b8","Type":"ContainerStarted","Data":"39a57abfdbe61e9c4b3aaab231ac1b97e9ca981ecd24a8a533e81bd430f1e799"} Oct 09 08:04:25 crc kubenswrapper[4715]: I1009 08:04:25.108469 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-9d9d4b647-cfdjf" Oct 09 08:04:25 crc kubenswrapper[4715]: I1009 08:04:25.108503 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-9d9d4b647-cfdjf" Oct 09 08:04:25 crc kubenswrapper[4715]: I1009 08:04:25.111741 4715 generic.go:334] "Generic (PLEG): container finished" podID="f9c0be8b-0711-488f-9bb1-edee8e92e527" containerID="e0875b29b7c74d81e763ff018a1895e8fd2652900b8e9a903413e7c40d8540b5" exitCode=0 Oct 09 08:04:25 crc kubenswrapper[4715]: I1009 08:04:25.111770 4715 generic.go:334] "Generic (PLEG): container finished" podID="f9c0be8b-0711-488f-9bb1-edee8e92e527" containerID="a412f5011889d1602ee9c6e8425ae5cfc6ec79df9d2ceeb5e45194df1b5ca559" exitCode=2 Oct 09 08:04:25 crc kubenswrapper[4715]: I1009 08:04:25.111777 4715 generic.go:334] "Generic (PLEG): container finished" podID="f9c0be8b-0711-488f-9bb1-edee8e92e527" 
containerID="06e3a016ccf1feffe4e8bbc3185ff5bb6f65ac453d4e58b04b827106491f1919" exitCode=0 Oct 09 08:04:25 crc kubenswrapper[4715]: I1009 08:04:25.112366 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9c0be8b-0711-488f-9bb1-edee8e92e527","Type":"ContainerDied","Data":"e0875b29b7c74d81e763ff018a1895e8fd2652900b8e9a903413e7c40d8540b5"} Oct 09 08:04:25 crc kubenswrapper[4715]: I1009 08:04:25.112400 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9c0be8b-0711-488f-9bb1-edee8e92e527","Type":"ContainerDied","Data":"a412f5011889d1602ee9c6e8425ae5cfc6ec79df9d2ceeb5e45194df1b5ca559"} Oct 09 08:04:25 crc kubenswrapper[4715]: I1009 08:04:25.112411 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9c0be8b-0711-488f-9bb1-edee8e92e527","Type":"ContainerDied","Data":"06e3a016ccf1feffe4e8bbc3185ff5bb6f65ac453d4e58b04b827106491f1919"} Oct 09 08:04:25 crc kubenswrapper[4715]: I1009 08:04:25.140601 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.140581268 podStartE2EDuration="5.140581268s" podCreationTimestamp="2025-10-09 08:04:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:04:25.136033265 +0000 UTC m=+1095.828837273" watchObservedRunningTime="2025-10-09 08:04:25.140581268 +0000 UTC m=+1095.833385296" Oct 09 08:04:25 crc kubenswrapper[4715]: I1009 08:04:25.552713 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 09 08:04:25 crc kubenswrapper[4715]: I1009 08:04:25.569156 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-9d9d4b647-cfdjf" podStartSLOduration=2.569135546 podStartE2EDuration="2.569135546s" podCreationTimestamp="2025-10-09 08:04:23 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:04:25.154994749 +0000 UTC m=+1095.847798757" watchObservedRunningTime="2025-10-09 08:04:25.569135546 +0000 UTC m=+1096.261939554" Oct 09 08:04:25 crc kubenswrapper[4715]: I1009 08:04:25.583642 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 08:04:26 crc kubenswrapper[4715]: I1009 08:04:26.124893 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="08ca7802-fc81-4258-bfef-c598c9d65b2f" containerName="cinder-scheduler" containerID="cri-o://0dafd83d1e12bb3fb83b77412a258914b924b61fd2d3081e52340af48f99d474" gracePeriod=30 Oct 09 08:04:26 crc kubenswrapper[4715]: I1009 08:04:26.124953 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="08ca7802-fc81-4258-bfef-c598c9d65b2f" containerName="probe" containerID="cri-o://a29fbe68284573f78517485f7f15db808ab704e8651398c128ad76625bfd0d10" gracePeriod=30 Oct 09 08:04:28 crc kubenswrapper[4715]: I1009 08:04:28.157245 4715 generic.go:334] "Generic (PLEG): container finished" podID="f9c0be8b-0711-488f-9bb1-edee8e92e527" containerID="179f31ef5d83b939a24a6248b3f608ec2de7bb69229edbfd596d0fe2ec54c2e1" exitCode=0 Oct 09 08:04:28 crc kubenswrapper[4715]: I1009 08:04:28.157666 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9c0be8b-0711-488f-9bb1-edee8e92e527","Type":"ContainerDied","Data":"179f31ef5d83b939a24a6248b3f608ec2de7bb69229edbfd596d0fe2ec54c2e1"} Oct 09 08:04:28 crc kubenswrapper[4715]: I1009 08:04:28.160480 4715 generic.go:334] "Generic (PLEG): container finished" podID="08ca7802-fc81-4258-bfef-c598c9d65b2f" containerID="a29fbe68284573f78517485f7f15db808ab704e8651398c128ad76625bfd0d10" exitCode=0 Oct 09 08:04:28 crc kubenswrapper[4715]: I1009 08:04:28.160541 
4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"08ca7802-fc81-4258-bfef-c598c9d65b2f","Type":"ContainerDied","Data":"a29fbe68284573f78517485f7f15db808ab704e8651398c128ad76625bfd0d10"} Oct 09 08:04:28 crc kubenswrapper[4715]: I1009 08:04:28.877661 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5784cf869f-5kjqv" Oct 09 08:04:28 crc kubenswrapper[4715]: I1009 08:04:28.897247 4715 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5d9885b95b-r2cb2" podUID="ada9982a-fc5f-4c93-bfa3-3401c0824c2e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Oct 09 08:04:28 crc kubenswrapper[4715]: I1009 08:04:28.897366 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5d9885b95b-r2cb2" Oct 09 08:04:28 crc kubenswrapper[4715]: I1009 08:04:28.959649 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-7fpxg"] Oct 09 08:04:28 crc kubenswrapper[4715]: I1009 08:04:28.959980 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8b5c85b87-7fpxg" podUID="fb56b8fd-4a84-44aa-a3c9-80aefa10784e" containerName="dnsmasq-dns" containerID="cri-o://de1bfe0badc20cbbb61faee7548dfd783c5d13dd7e2bbc2c662b7623ba5abc0f" gracePeriod=10 Oct 09 08:04:29 crc kubenswrapper[4715]: I1009 08:04:29.180091 4715 generic.go:334] "Generic (PLEG): container finished" podID="fb56b8fd-4a84-44aa-a3c9-80aefa10784e" containerID="de1bfe0badc20cbbb61faee7548dfd783c5d13dd7e2bbc2c662b7623ba5abc0f" exitCode=0 Oct 09 08:04:29 crc kubenswrapper[4715]: I1009 08:04:29.180136 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-7fpxg" 
event={"ID":"fb56b8fd-4a84-44aa-a3c9-80aefa10784e","Type":"ContainerDied","Data":"de1bfe0badc20cbbb61faee7548dfd783c5d13dd7e2bbc2c662b7623ba5abc0f"} Oct 09 08:04:29 crc kubenswrapper[4715]: I1009 08:04:29.358243 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 08:04:29 crc kubenswrapper[4715]: I1009 08:04:29.358780 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b155af9d-8b2d-4f91-8f9e-70f77dbb84f1" containerName="glance-log" containerID="cri-o://2e03df95b6bb57fb4fb7f4d90cc8e45fe2c8a0dcc0049252dcba3306899bb1cf" gracePeriod=30 Oct 09 08:04:29 crc kubenswrapper[4715]: I1009 08:04:29.359253 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b155af9d-8b2d-4f91-8f9e-70f77dbb84f1" containerName="glance-httpd" containerID="cri-o://f66ee5e8144456200dbf81ee9e79d51c129e85e278fa1cd8441aebdf53debc5b" gracePeriod=30 Oct 09 08:04:30 crc kubenswrapper[4715]: I1009 08:04:30.204400 4715 generic.go:334] "Generic (PLEG): container finished" podID="b155af9d-8b2d-4f91-8f9e-70f77dbb84f1" containerID="2e03df95b6bb57fb4fb7f4d90cc8e45fe2c8a0dcc0049252dcba3306899bb1cf" exitCode=143 Oct 09 08:04:30 crc kubenswrapper[4715]: I1009 08:04:30.204473 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1","Type":"ContainerDied","Data":"2e03df95b6bb57fb4fb7f4d90cc8e45fe2c8a0dcc0049252dcba3306899bb1cf"} Oct 09 08:04:30 crc kubenswrapper[4715]: I1009 08:04:30.272284 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 08:04:30 crc kubenswrapper[4715]: I1009 08:04:30.272574 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="fa8c6cfe-dae5-455a-a3f4-83608f3064b5" 
containerName="glance-log" containerID="cri-o://29c69c6bfea7f324ed2cb89b3d6fdc974a76c48f88d4ce8852af81d57091e21f" gracePeriod=30 Oct 09 08:04:30 crc kubenswrapper[4715]: I1009 08:04:30.272720 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="fa8c6cfe-dae5-455a-a3f4-83608f3064b5" containerName="glance-httpd" containerID="cri-o://1abb0d014478a89f86f7edba03186c78afcc89f7c2def49c541347d933b384c9" gracePeriod=30 Oct 09 08:04:30 crc kubenswrapper[4715]: I1009 08:04:30.348632 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-59756554bd-9q7xp" Oct 09 08:04:30 crc kubenswrapper[4715]: I1009 08:04:30.403141 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-59756554bd-9q7xp" Oct 09 08:04:31 crc kubenswrapper[4715]: I1009 08:04:31.222770 4715 generic.go:334] "Generic (PLEG): container finished" podID="08ca7802-fc81-4258-bfef-c598c9d65b2f" containerID="0dafd83d1e12bb3fb83b77412a258914b924b61fd2d3081e52340af48f99d474" exitCode=0 Oct 09 08:04:31 crc kubenswrapper[4715]: I1009 08:04:31.222858 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"08ca7802-fc81-4258-bfef-c598c9d65b2f","Type":"ContainerDied","Data":"0dafd83d1e12bb3fb83b77412a258914b924b61fd2d3081e52340af48f99d474"} Oct 09 08:04:31 crc kubenswrapper[4715]: I1009 08:04:31.225054 4715 generic.go:334] "Generic (PLEG): container finished" podID="fa8c6cfe-dae5-455a-a3f4-83608f3064b5" containerID="29c69c6bfea7f324ed2cb89b3d6fdc974a76c48f88d4ce8852af81d57091e21f" exitCode=143 Oct 09 08:04:31 crc kubenswrapper[4715]: I1009 08:04:31.226178 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fa8c6cfe-dae5-455a-a3f4-83608f3064b5","Type":"ContainerDied","Data":"29c69c6bfea7f324ed2cb89b3d6fdc974a76c48f88d4ce8852af81d57091e21f"} Oct 09 
08:04:32 crc kubenswrapper[4715]: I1009 08:04:32.520489 4715 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="b155af9d-8b2d-4f91-8f9e-70f77dbb84f1" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.150:9292/healthcheck\": read tcp 10.217.0.2:51048->10.217.0.150:9292: read: connection reset by peer" Oct 09 08:04:32 crc kubenswrapper[4715]: I1009 08:04:32.521699 4715 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="b155af9d-8b2d-4f91-8f9e-70f77dbb84f1" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.150:9292/healthcheck\": read tcp 10.217.0.2:51034->10.217.0.150:9292: read: connection reset by peer" Oct 09 08:04:32 crc kubenswrapper[4715]: I1009 08:04:32.550275 4715 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-8b5c85b87-7fpxg" podUID="fb56b8fd-4a84-44aa-a3c9-80aefa10784e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.139:5353: connect: connection refused" Oct 09 08:04:32 crc kubenswrapper[4715]: I1009 08:04:32.730837 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-vtjlz"] Oct 09 08:04:32 crc kubenswrapper[4715]: I1009 08:04:32.732050 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-vtjlz" Oct 09 08:04:32 crc kubenswrapper[4715]: I1009 08:04:32.755783 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-vtjlz"] Oct 09 08:04:32 crc kubenswrapper[4715]: I1009 08:04:32.834521 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-kp8jh"] Oct 09 08:04:32 crc kubenswrapper[4715]: I1009 08:04:32.835571 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-kp8jh" Oct 09 08:04:32 crc kubenswrapper[4715]: I1009 08:04:32.842888 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzhdp\" (UniqueName: \"kubernetes.io/projected/12f445bb-022c-4dd9-8f91-e9612f526a12-kube-api-access-gzhdp\") pod \"nova-api-db-create-vtjlz\" (UID: \"12f445bb-022c-4dd9-8f91-e9612f526a12\") " pod="openstack/nova-api-db-create-vtjlz" Oct 09 08:04:32 crc kubenswrapper[4715]: I1009 08:04:32.851438 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-kp8jh"] Oct 09 08:04:32 crc kubenswrapper[4715]: I1009 08:04:32.944167 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzhdp\" (UniqueName: \"kubernetes.io/projected/12f445bb-022c-4dd9-8f91-e9612f526a12-kube-api-access-gzhdp\") pod \"nova-api-db-create-vtjlz\" (UID: \"12f445bb-022c-4dd9-8f91-e9612f526a12\") " pod="openstack/nova-api-db-create-vtjlz" Oct 09 08:04:32 crc kubenswrapper[4715]: I1009 08:04:32.944240 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssdxz\" (UniqueName: \"kubernetes.io/projected/544d96bc-6a19-46c5-8162-64a99e333681-kube-api-access-ssdxz\") pod \"nova-cell0-db-create-kp8jh\" (UID: \"544d96bc-6a19-46c5-8162-64a99e333681\") " pod="openstack/nova-cell0-db-create-kp8jh" Oct 09 08:04:32 crc kubenswrapper[4715]: I1009 08:04:32.981705 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzhdp\" (UniqueName: \"kubernetes.io/projected/12f445bb-022c-4dd9-8f91-e9612f526a12-kube-api-access-gzhdp\") pod \"nova-api-db-create-vtjlz\" (UID: \"12f445bb-022c-4dd9-8f91-e9612f526a12\") " pod="openstack/nova-api-db-create-vtjlz" Oct 09 08:04:33 crc kubenswrapper[4715]: I1009 08:04:33.033754 4715 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell1-db-create-28p9v"] Oct 09 08:04:33 crc kubenswrapper[4715]: I1009 08:04:33.035203 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-28p9v" Oct 09 08:04:33 crc kubenswrapper[4715]: I1009 08:04:33.046083 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssdxz\" (UniqueName: \"kubernetes.io/projected/544d96bc-6a19-46c5-8162-64a99e333681-kube-api-access-ssdxz\") pod \"nova-cell0-db-create-kp8jh\" (UID: \"544d96bc-6a19-46c5-8162-64a99e333681\") " pod="openstack/nova-cell0-db-create-kp8jh" Oct 09 08:04:33 crc kubenswrapper[4715]: I1009 08:04:33.055188 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-vtjlz" Oct 09 08:04:33 crc kubenswrapper[4715]: I1009 08:04:33.057847 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-28p9v"] Oct 09 08:04:33 crc kubenswrapper[4715]: I1009 08:04:33.084936 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssdxz\" (UniqueName: \"kubernetes.io/projected/544d96bc-6a19-46c5-8162-64a99e333681-kube-api-access-ssdxz\") pod \"nova-cell0-db-create-kp8jh\" (UID: \"544d96bc-6a19-46c5-8162-64a99e333681\") " pod="openstack/nova-cell0-db-create-kp8jh" Oct 09 08:04:33 crc kubenswrapper[4715]: I1009 08:04:33.147733 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf2jw\" (UniqueName: \"kubernetes.io/projected/80579efc-c70d-41a5-8a56-922ca09a8bd4-kube-api-access-rf2jw\") pod \"nova-cell1-db-create-28p9v\" (UID: \"80579efc-c70d-41a5-8a56-922ca09a8bd4\") " pod="openstack/nova-cell1-db-create-28p9v" Oct 09 08:04:33 crc kubenswrapper[4715]: I1009 08:04:33.196826 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-kp8jh" Oct 09 08:04:33 crc kubenswrapper[4715]: I1009 08:04:33.243488 4715 generic.go:334] "Generic (PLEG): container finished" podID="b155af9d-8b2d-4f91-8f9e-70f77dbb84f1" containerID="f66ee5e8144456200dbf81ee9e79d51c129e85e278fa1cd8441aebdf53debc5b" exitCode=0 Oct 09 08:04:33 crc kubenswrapper[4715]: I1009 08:04:33.243549 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1","Type":"ContainerDied","Data":"f66ee5e8144456200dbf81ee9e79d51c129e85e278fa1cd8441aebdf53debc5b"} Oct 09 08:04:33 crc kubenswrapper[4715]: I1009 08:04:33.249866 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf2jw\" (UniqueName: \"kubernetes.io/projected/80579efc-c70d-41a5-8a56-922ca09a8bd4-kube-api-access-rf2jw\") pod \"nova-cell1-db-create-28p9v\" (UID: \"80579efc-c70d-41a5-8a56-922ca09a8bd4\") " pod="openstack/nova-cell1-db-create-28p9v" Oct 09 08:04:33 crc kubenswrapper[4715]: I1009 08:04:33.274988 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf2jw\" (UniqueName: \"kubernetes.io/projected/80579efc-c70d-41a5-8a56-922ca09a8bd4-kube-api-access-rf2jw\") pod \"nova-cell1-db-create-28p9v\" (UID: \"80579efc-c70d-41a5-8a56-922ca09a8bd4\") " pod="openstack/nova-cell1-db-create-28p9v" Oct 09 08:04:33 crc kubenswrapper[4715]: I1009 08:04:33.360360 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-28p9v" Oct 09 08:04:33 crc kubenswrapper[4715]: I1009 08:04:33.493057 4715 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="fa8c6cfe-dae5-455a-a3f4-83608f3064b5" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.149:9292/healthcheck\": read tcp 10.217.0.2:41152->10.217.0.149:9292: read: connection reset by peer" Oct 09 08:04:33 crc kubenswrapper[4715]: I1009 08:04:33.493104 4715 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="fa8c6cfe-dae5-455a-a3f4-83608f3064b5" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.149:9292/healthcheck\": read tcp 10.217.0.2:41156->10.217.0.149:9292: read: connection reset by peer" Oct 09 08:04:33 crc kubenswrapper[4715]: I1009 08:04:33.602485 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-9d9d4b647-cfdjf" Oct 09 08:04:33 crc kubenswrapper[4715]: I1009 08:04:33.606806 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-9d9d4b647-cfdjf" Oct 09 08:04:34 crc kubenswrapper[4715]: I1009 08:04:34.153211 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6985f6958d-wwgg5" Oct 09 08:04:34 crc kubenswrapper[4715]: I1009 08:04:34.153305 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6985f6958d-wwgg5" Oct 09 08:04:34 crc kubenswrapper[4715]: I1009 08:04:34.279983 4715 generic.go:334] "Generic (PLEG): container finished" podID="fa8c6cfe-dae5-455a-a3f4-83608f3064b5" containerID="1abb0d014478a89f86f7edba03186c78afcc89f7c2def49c541347d933b384c9" exitCode=0 Oct 09 08:04:34 crc kubenswrapper[4715]: I1009 08:04:34.280055 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"fa8c6cfe-dae5-455a-a3f4-83608f3064b5","Type":"ContainerDied","Data":"1abb0d014478a89f86f7edba03186c78afcc89f7c2def49c541347d933b384c9"} Oct 09 08:04:34 crc kubenswrapper[4715]: I1009 08:04:34.290302 4715 generic.go:334] "Generic (PLEG): container finished" podID="ada9982a-fc5f-4c93-bfa3-3401c0824c2e" containerID="30df9f4b9017305ddf0ba538bc4401ae44d2c84af182eb159fea1745dd9773d7" exitCode=137 Oct 09 08:04:34 crc kubenswrapper[4715]: I1009 08:04:34.290644 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d9885b95b-r2cb2" event={"ID":"ada9982a-fc5f-4c93-bfa3-3401c0824c2e","Type":"ContainerDied","Data":"30df9f4b9017305ddf0ba538bc4401ae44d2c84af182eb159fea1745dd9773d7"} Oct 09 08:04:34 crc kubenswrapper[4715]: I1009 08:04:34.837597 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-67f46ddd46-qcgrk" Oct 09 08:04:35 crc kubenswrapper[4715]: E1009 08:04:35.048865 4715 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified" Oct 09 08:04:35 crc kubenswrapper[4715]: E1009 08:04:35.049059 4715 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:openstackclient,Image:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,Command:[/bin/sleep],Args:[infinity],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n55fh5fbhd4h5cch576h56h648h76h5ch675h587h89h656h5b6h5d5h86h5f4h58fh697h5bchdbh684h5bfh68bh5d6h65h97h69h67ch5fch549h567q,ValueFrom:nil,},EnvVar{Name:OS_CLOUD,Value:default,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_HOST,Value:metric-storage-prometheus.openstack.svc,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_PORT,Value:9090,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:openstack-config,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/cloudrc,SubPath:cloudrc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fnvcq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL 
MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42401,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42401,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstackclient_openstack(ea37012b-c593-4cd0-8501-121c791b2741): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 09 08:04:35 crc kubenswrapper[4715]: E1009 08:04:35.051541 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstackclient" podUID="ea37012b-c593-4cd0-8501-121c791b2741" Oct 09 08:04:35 crc kubenswrapper[4715]: I1009 08:04:35.094108 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 09 08:04:35 crc kubenswrapper[4715]: E1009 08:04:35.385631 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified\\\"\"" pod="openstack/openstackclient" podUID="ea37012b-c593-4cd0-8501-121c791b2741" Oct 09 08:04:35 crc kubenswrapper[4715]: I1009 08:04:35.423967 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 08:04:35 crc kubenswrapper[4715]: I1009 08:04:35.500257 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9c0be8b-0711-488f-9bb1-edee8e92e527-log-httpd\") pod \"f9c0be8b-0711-488f-9bb1-edee8e92e527\" (UID: \"f9c0be8b-0711-488f-9bb1-edee8e92e527\") " Oct 09 08:04:35 crc kubenswrapper[4715]: I1009 08:04:35.500337 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9c0be8b-0711-488f-9bb1-edee8e92e527-combined-ca-bundle\") pod \"f9c0be8b-0711-488f-9bb1-edee8e92e527\" (UID: \"f9c0be8b-0711-488f-9bb1-edee8e92e527\") " Oct 09 08:04:35 crc kubenswrapper[4715]: I1009 08:04:35.500372 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9c0be8b-0711-488f-9bb1-edee8e92e527-config-data\") pod \"f9c0be8b-0711-488f-9bb1-edee8e92e527\" (UID: \"f9c0be8b-0711-488f-9bb1-edee8e92e527\") " Oct 09 08:04:35 crc kubenswrapper[4715]: I1009 08:04:35.500507 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jn5bn\" (UniqueName: \"kubernetes.io/projected/f9c0be8b-0711-488f-9bb1-edee8e92e527-kube-api-access-jn5bn\") pod \"f9c0be8b-0711-488f-9bb1-edee8e92e527\" (UID: \"f9c0be8b-0711-488f-9bb1-edee8e92e527\") " Oct 09 08:04:35 crc kubenswrapper[4715]: I1009 08:04:35.500607 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9c0be8b-0711-488f-9bb1-edee8e92e527-run-httpd\") pod \"f9c0be8b-0711-488f-9bb1-edee8e92e527\" (UID: \"f9c0be8b-0711-488f-9bb1-edee8e92e527\") " Oct 09 08:04:35 crc kubenswrapper[4715]: I1009 08:04:35.500672 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/f9c0be8b-0711-488f-9bb1-edee8e92e527-sg-core-conf-yaml\") pod \"f9c0be8b-0711-488f-9bb1-edee8e92e527\" (UID: \"f9c0be8b-0711-488f-9bb1-edee8e92e527\") " Oct 09 08:04:35 crc kubenswrapper[4715]: I1009 08:04:35.500730 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9c0be8b-0711-488f-9bb1-edee8e92e527-scripts\") pod \"f9c0be8b-0711-488f-9bb1-edee8e92e527\" (UID: \"f9c0be8b-0711-488f-9bb1-edee8e92e527\") " Oct 09 08:04:35 crc kubenswrapper[4715]: I1009 08:04:35.507113 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9c0be8b-0711-488f-9bb1-edee8e92e527-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f9c0be8b-0711-488f-9bb1-edee8e92e527" (UID: "f9c0be8b-0711-488f-9bb1-edee8e92e527"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:04:35 crc kubenswrapper[4715]: I1009 08:04:35.507705 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9c0be8b-0711-488f-9bb1-edee8e92e527-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f9c0be8b-0711-488f-9bb1-edee8e92e527" (UID: "f9c0be8b-0711-488f-9bb1-edee8e92e527"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:04:35 crc kubenswrapper[4715]: I1009 08:04:35.516634 4715 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9c0be8b-0711-488f-9bb1-edee8e92e527-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:35 crc kubenswrapper[4715]: I1009 08:04:35.516691 4715 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f9c0be8b-0711-488f-9bb1-edee8e92e527-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:35 crc kubenswrapper[4715]: I1009 08:04:35.550633 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9c0be8b-0711-488f-9bb1-edee8e92e527-kube-api-access-jn5bn" (OuterVolumeSpecName: "kube-api-access-jn5bn") pod "f9c0be8b-0711-488f-9bb1-edee8e92e527" (UID: "f9c0be8b-0711-488f-9bb1-edee8e92e527"). InnerVolumeSpecName "kube-api-access-jn5bn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:04:35 crc kubenswrapper[4715]: I1009 08:04:35.562762 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9c0be8b-0711-488f-9bb1-edee8e92e527-scripts" (OuterVolumeSpecName: "scripts") pod "f9c0be8b-0711-488f-9bb1-edee8e92e527" (UID: "f9c0be8b-0711-488f-9bb1-edee8e92e527"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:35 crc kubenswrapper[4715]: I1009 08:04:35.611795 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9c0be8b-0711-488f-9bb1-edee8e92e527-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f9c0be8b-0711-488f-9bb1-edee8e92e527" (UID: "f9c0be8b-0711-488f-9bb1-edee8e92e527"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:35 crc kubenswrapper[4715]: I1009 08:04:35.618443 4715 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9c0be8b-0711-488f-9bb1-edee8e92e527-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:35 crc kubenswrapper[4715]: I1009 08:04:35.618480 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jn5bn\" (UniqueName: \"kubernetes.io/projected/f9c0be8b-0711-488f-9bb1-edee8e92e527-kube-api-access-jn5bn\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:35 crc kubenswrapper[4715]: I1009 08:04:35.618492 4715 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f9c0be8b-0711-488f-9bb1-edee8e92e527-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:35 crc kubenswrapper[4715]: I1009 08:04:35.808565 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9c0be8b-0711-488f-9bb1-edee8e92e527-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9c0be8b-0711-488f-9bb1-edee8e92e527" (UID: "f9c0be8b-0711-488f-9bb1-edee8e92e527"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:35 crc kubenswrapper[4715]: I1009 08:04:35.812681 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5d9885b95b-r2cb2" Oct 09 08:04:35 crc kubenswrapper[4715]: I1009 08:04:35.828630 4715 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9c0be8b-0711-488f-9bb1-edee8e92e527-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:35 crc kubenswrapper[4715]: I1009 08:04:35.831662 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9c0be8b-0711-488f-9bb1-edee8e92e527-config-data" (OuterVolumeSpecName: "config-data") pod "f9c0be8b-0711-488f-9bb1-edee8e92e527" (UID: "f9c0be8b-0711-488f-9bb1-edee8e92e527"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:35 crc kubenswrapper[4715]: I1009 08:04:35.878058 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-67f46ddd46-qcgrk" Oct 09 08:04:35 crc kubenswrapper[4715]: I1009 08:04:35.935489 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ada9982a-fc5f-4c93-bfa3-3401c0824c2e-horizon-secret-key\") pod \"ada9982a-fc5f-4c93-bfa3-3401c0824c2e\" (UID: \"ada9982a-fc5f-4c93-bfa3-3401c0824c2e\") " Oct 09 08:04:35 crc kubenswrapper[4715]: I1009 08:04:35.935577 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ada9982a-fc5f-4c93-bfa3-3401c0824c2e-scripts\") pod \"ada9982a-fc5f-4c93-bfa3-3401c0824c2e\" (UID: \"ada9982a-fc5f-4c93-bfa3-3401c0824c2e\") " Oct 09 08:04:35 crc kubenswrapper[4715]: I1009 08:04:35.935628 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ada9982a-fc5f-4c93-bfa3-3401c0824c2e-config-data\") pod \"ada9982a-fc5f-4c93-bfa3-3401c0824c2e\" (UID: \"ada9982a-fc5f-4c93-bfa3-3401c0824c2e\") " Oct 
09 08:04:35 crc kubenswrapper[4715]: I1009 08:04:35.935714 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ada9982a-fc5f-4c93-bfa3-3401c0824c2e-logs\") pod \"ada9982a-fc5f-4c93-bfa3-3401c0824c2e\" (UID: \"ada9982a-fc5f-4c93-bfa3-3401c0824c2e\") " Oct 09 08:04:35 crc kubenswrapper[4715]: I1009 08:04:35.935758 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ada9982a-fc5f-4c93-bfa3-3401c0824c2e-horizon-tls-certs\") pod \"ada9982a-fc5f-4c93-bfa3-3401c0824c2e\" (UID: \"ada9982a-fc5f-4c93-bfa3-3401c0824c2e\") " Oct 09 08:04:35 crc kubenswrapper[4715]: I1009 08:04:35.935803 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ada9982a-fc5f-4c93-bfa3-3401c0824c2e-combined-ca-bundle\") pod \"ada9982a-fc5f-4c93-bfa3-3401c0824c2e\" (UID: \"ada9982a-fc5f-4c93-bfa3-3401c0824c2e\") " Oct 09 08:04:35 crc kubenswrapper[4715]: I1009 08:04:35.935901 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8g5cn\" (UniqueName: \"kubernetes.io/projected/ada9982a-fc5f-4c93-bfa3-3401c0824c2e-kube-api-access-8g5cn\") pod \"ada9982a-fc5f-4c93-bfa3-3401c0824c2e\" (UID: \"ada9982a-fc5f-4c93-bfa3-3401c0824c2e\") " Oct 09 08:04:35 crc kubenswrapper[4715]: I1009 08:04:35.936303 4715 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9c0be8b-0711-488f-9bb1-edee8e92e527-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:35 crc kubenswrapper[4715]: I1009 08:04:35.938052 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ada9982a-fc5f-4c93-bfa3-3401c0824c2e-logs" (OuterVolumeSpecName: "logs") pod "ada9982a-fc5f-4c93-bfa3-3401c0824c2e" (UID: 
"ada9982a-fc5f-4c93-bfa3-3401c0824c2e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:04:35 crc kubenswrapper[4715]: I1009 08:04:35.940998 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ada9982a-fc5f-4c93-bfa3-3401c0824c2e-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "ada9982a-fc5f-4c93-bfa3-3401c0824c2e" (UID: "ada9982a-fc5f-4c93-bfa3-3401c0824c2e"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:35 crc kubenswrapper[4715]: I1009 08:04:35.951435 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ada9982a-fc5f-4c93-bfa3-3401c0824c2e-kube-api-access-8g5cn" (OuterVolumeSpecName: "kube-api-access-8g5cn") pod "ada9982a-fc5f-4c93-bfa3-3401c0824c2e" (UID: "ada9982a-fc5f-4c93-bfa3-3401c0824c2e"). InnerVolumeSpecName "kube-api-access-8g5cn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:04:35 crc kubenswrapper[4715]: I1009 08:04:35.954489 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-59756554bd-9q7xp"] Oct 09 08:04:35 crc kubenswrapper[4715]: I1009 08:04:35.954728 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-59756554bd-9q7xp" podUID="5823462b-a07e-4525-9be8-370dce870498" containerName="barbican-api-log" containerID="cri-o://16fdac3fb6c5d655e385897f80db5d30b67e106a81ba59d1fd09b8e88ae2486d" gracePeriod=30 Oct 09 08:04:35 crc kubenswrapper[4715]: I1009 08:04:35.955172 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-59756554bd-9q7xp" podUID="5823462b-a07e-4525-9be8-370dce870498" containerName="barbican-api" containerID="cri-o://ac6032261715516b1ba1cfe9851948275ebc4124882ad0151121b557b399837a" gracePeriod=30 Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.024278 4715 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ada9982a-fc5f-4c93-bfa3-3401c0824c2e-config-data" (OuterVolumeSpecName: "config-data") pod "ada9982a-fc5f-4c93-bfa3-3401c0824c2e" (UID: "ada9982a-fc5f-4c93-bfa3-3401c0824c2e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.044930 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8g5cn\" (UniqueName: \"kubernetes.io/projected/ada9982a-fc5f-4c93-bfa3-3401c0824c2e-kube-api-access-8g5cn\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.045589 4715 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ada9982a-fc5f-4c93-bfa3-3401c0824c2e-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.045630 4715 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ada9982a-fc5f-4c93-bfa3-3401c0824c2e-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.045642 4715 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ada9982a-fc5f-4c93-bfa3-3401c0824c2e-logs\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.062021 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ada9982a-fc5f-4c93-bfa3-3401c0824c2e-scripts" (OuterVolumeSpecName: "scripts") pod "ada9982a-fc5f-4c93-bfa3-3401c0824c2e" (UID: "ada9982a-fc5f-4c93-bfa3-3401c0824c2e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.119803 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ada9982a-fc5f-4c93-bfa3-3401c0824c2e-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "ada9982a-fc5f-4c93-bfa3-3401c0824c2e" (UID: "ada9982a-fc5f-4c93-bfa3-3401c0824c2e"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.123806 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ada9982a-fc5f-4c93-bfa3-3401c0824c2e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ada9982a-fc5f-4c93-bfa3-3401c0824c2e" (UID: "ada9982a-fc5f-4c93-bfa3-3401c0824c2e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.150779 4715 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ada9982a-fc5f-4c93-bfa3-3401c0824c2e-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.150819 4715 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ada9982a-fc5f-4c93-bfa3-3401c0824c2e-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.150838 4715 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ada9982a-fc5f-4c93-bfa3-3401c0824c2e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.330227 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-7fpxg" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.347784 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.377891 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.416266 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.416704 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"08ca7802-fc81-4258-bfef-c598c9d65b2f","Type":"ContainerDied","Data":"48b1600f361bb4d0b91a695104162349d982aea78c6e3b8e5d68fd50a8ef86fa"} Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.416748 4715 scope.go:117] "RemoveContainer" containerID="a29fbe68284573f78517485f7f15db808ab704e8651398c128ad76625bfd0d10" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.447735 4715 generic.go:334] "Generic (PLEG): container finished" podID="5823462b-a07e-4525-9be8-370dce870498" containerID="16fdac3fb6c5d655e385897f80db5d30b67e106a81ba59d1fd09b8e88ae2486d" exitCode=143 Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.447811 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59756554bd-9q7xp" event={"ID":"5823462b-a07e-4525-9be8-370dce870498","Type":"ContainerDied","Data":"16fdac3fb6c5d655e385897f80db5d30b67e106a81ba59d1fd09b8e88ae2486d"} Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.458081 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d9885b95b-r2cb2" event={"ID":"ada9982a-fc5f-4c93-bfa3-3401c0824c2e","Type":"ContainerDied","Data":"5eb867aac698accf445bcda1555a89eb888f82c2528dd5180cd9c22862dd5490"} Oct 09 08:04:36 crc 
kubenswrapper[4715]: I1009 08:04:36.458191 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5d9885b95b-r2cb2" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.461705 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb56b8fd-4a84-44aa-a3c9-80aefa10784e-config\") pod \"fb56b8fd-4a84-44aa-a3c9-80aefa10784e\" (UID: \"fb56b8fd-4a84-44aa-a3c9-80aefa10784e\") " Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.461776 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa8c6cfe-dae5-455a-a3f4-83608f3064b5-logs\") pod \"fa8c6cfe-dae5-455a-a3f4-83608f3064b5\" (UID: \"fa8c6cfe-dae5-455a-a3f4-83608f3064b5\") " Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.461805 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb56b8fd-4a84-44aa-a3c9-80aefa10784e-ovsdbserver-sb\") pod \"fb56b8fd-4a84-44aa-a3c9-80aefa10784e\" (UID: \"fb56b8fd-4a84-44aa-a3c9-80aefa10784e\") " Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.461849 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb56b8fd-4a84-44aa-a3c9-80aefa10784e-dns-svc\") pod \"fb56b8fd-4a84-44aa-a3c9-80aefa10784e\" (UID: \"fb56b8fd-4a84-44aa-a3c9-80aefa10784e\") " Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.461884 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08ca7802-fc81-4258-bfef-c598c9d65b2f-config-data\") pod \"08ca7802-fc81-4258-bfef-c598c9d65b2f\" (UID: \"08ca7802-fc81-4258-bfef-c598c9d65b2f\") " Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.461943 4715 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-dxlfp\" (UniqueName: \"kubernetes.io/projected/08ca7802-fc81-4258-bfef-c598c9d65b2f-kube-api-access-dxlfp\") pod \"08ca7802-fc81-4258-bfef-c598c9d65b2f\" (UID: \"08ca7802-fc81-4258-bfef-c598c9d65b2f\") " Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.461995 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08ca7802-fc81-4258-bfef-c598c9d65b2f-combined-ca-bundle\") pod \"08ca7802-fc81-4258-bfef-c598c9d65b2f\" (UID: \"08ca7802-fc81-4258-bfef-c598c9d65b2f\") " Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.462034 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb56b8fd-4a84-44aa-a3c9-80aefa10784e-ovsdbserver-nb\") pod \"fb56b8fd-4a84-44aa-a3c9-80aefa10784e\" (UID: \"fb56b8fd-4a84-44aa-a3c9-80aefa10784e\") " Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.462096 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b92vw\" (UniqueName: \"kubernetes.io/projected/fb56b8fd-4a84-44aa-a3c9-80aefa10784e-kube-api-access-b92vw\") pod \"fb56b8fd-4a84-44aa-a3c9-80aefa10784e\" (UID: \"fb56b8fd-4a84-44aa-a3c9-80aefa10784e\") " Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.462121 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08ca7802-fc81-4258-bfef-c598c9d65b2f-scripts\") pod \"08ca7802-fc81-4258-bfef-c598c9d65b2f\" (UID: \"08ca7802-fc81-4258-bfef-c598c9d65b2f\") " Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.462146 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa8c6cfe-dae5-455a-a3f4-83608f3064b5-combined-ca-bundle\") pod 
\"fa8c6cfe-dae5-455a-a3f4-83608f3064b5\" (UID: \"fa8c6cfe-dae5-455a-a3f4-83608f3064b5\") " Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.462176 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa8c6cfe-dae5-455a-a3f4-83608f3064b5-scripts\") pod \"fa8c6cfe-dae5-455a-a3f4-83608f3064b5\" (UID: \"fa8c6cfe-dae5-455a-a3f4-83608f3064b5\") " Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.462204 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qv6dv\" (UniqueName: \"kubernetes.io/projected/fa8c6cfe-dae5-455a-a3f4-83608f3064b5-kube-api-access-qv6dv\") pod \"fa8c6cfe-dae5-455a-a3f4-83608f3064b5\" (UID: \"fa8c6cfe-dae5-455a-a3f4-83608f3064b5\") " Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.462235 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa8c6cfe-dae5-455a-a3f4-83608f3064b5-config-data\") pod \"fa8c6cfe-dae5-455a-a3f4-83608f3064b5\" (UID: \"fa8c6cfe-dae5-455a-a3f4-83608f3064b5\") " Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.462253 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fa8c6cfe-dae5-455a-a3f4-83608f3064b5-httpd-run\") pod \"fa8c6cfe-dae5-455a-a3f4-83608f3064b5\" (UID: \"fa8c6cfe-dae5-455a-a3f4-83608f3064b5\") " Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.462275 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa8c6cfe-dae5-455a-a3f4-83608f3064b5-internal-tls-certs\") pod \"fa8c6cfe-dae5-455a-a3f4-83608f3064b5\" (UID: \"fa8c6cfe-dae5-455a-a3f4-83608f3064b5\") " Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.462303 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/08ca7802-fc81-4258-bfef-c598c9d65b2f-etc-machine-id\") pod \"08ca7802-fc81-4258-bfef-c598c9d65b2f\" (UID: \"08ca7802-fc81-4258-bfef-c598c9d65b2f\") " Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.462340 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"fa8c6cfe-dae5-455a-a3f4-83608f3064b5\" (UID: \"fa8c6cfe-dae5-455a-a3f4-83608f3064b5\") " Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.462359 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fb56b8fd-4a84-44aa-a3c9-80aefa10784e-dns-swift-storage-0\") pod \"fb56b8fd-4a84-44aa-a3c9-80aefa10784e\" (UID: \"fb56b8fd-4a84-44aa-a3c9-80aefa10784e\") " Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.462478 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08ca7802-fc81-4258-bfef-c598c9d65b2f-config-data-custom\") pod \"08ca7802-fc81-4258-bfef-c598c9d65b2f\" (UID: \"08ca7802-fc81-4258-bfef-c598c9d65b2f\") " Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.465961 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-7fpxg" event={"ID":"fb56b8fd-4a84-44aa-a3c9-80aefa10784e","Type":"ContainerDied","Data":"93ff2c1d1458261803c0df4e70c4bd9ef1e0686fccbc3f27478885f0b8401919"} Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.466978 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa8c6cfe-dae5-455a-a3f4-83608f3064b5-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "fa8c6cfe-dae5-455a-a3f4-83608f3064b5" (UID: "fa8c6cfe-dae5-455a-a3f4-83608f3064b5"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.467019 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-7fpxg" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.467466 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08ca7802-fc81-4258-bfef-c598c9d65b2f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "08ca7802-fc81-4258-bfef-c598c9d65b2f" (UID: "08ca7802-fc81-4258-bfef-c598c9d65b2f"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.468085 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08ca7802-fc81-4258-bfef-c598c9d65b2f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "08ca7802-fc81-4258-bfef-c598c9d65b2f" (UID: "08ca7802-fc81-4258-bfef-c598c9d65b2f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.482011 4715 scope.go:117] "RemoveContainer" containerID="0dafd83d1e12bb3fb83b77412a258914b924b61fd2d3081e52340af48f99d474" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.482085 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa8c6cfe-dae5-455a-a3f4-83608f3064b5-logs" (OuterVolumeSpecName: "logs") pod "fa8c6cfe-dae5-455a-a3f4-83608f3064b5" (UID: "fa8c6cfe-dae5-455a-a3f4-83608f3064b5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.492511 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "fa8c6cfe-dae5-455a-a3f4-83608f3064b5" (UID: "fa8c6cfe-dae5-455a-a3f4-83608f3064b5"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.494509 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08ca7802-fc81-4258-bfef-c598c9d65b2f-scripts" (OuterVolumeSpecName: "scripts") pod "08ca7802-fc81-4258-bfef-c598c9d65b2f" (UID: "08ca7802-fc81-4258-bfef-c598c9d65b2f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.494734 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa8c6cfe-dae5-455a-a3f4-83608f3064b5-scripts" (OuterVolumeSpecName: "scripts") pod "fa8c6cfe-dae5-455a-a3f4-83608f3064b5" (UID: "fa8c6cfe-dae5-455a-a3f4-83608f3064b5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.497561 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa8c6cfe-dae5-455a-a3f4-83608f3064b5-kube-api-access-qv6dv" (OuterVolumeSpecName: "kube-api-access-qv6dv") pod "fa8c6cfe-dae5-455a-a3f4-83608f3064b5" (UID: "fa8c6cfe-dae5-455a-a3f4-83608f3064b5"). InnerVolumeSpecName "kube-api-access-qv6dv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.497725 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb56b8fd-4a84-44aa-a3c9-80aefa10784e-kube-api-access-b92vw" (OuterVolumeSpecName: "kube-api-access-b92vw") pod "fb56b8fd-4a84-44aa-a3c9-80aefa10784e" (UID: "fb56b8fd-4a84-44aa-a3c9-80aefa10784e"). InnerVolumeSpecName "kube-api-access-b92vw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.524341 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f9c0be8b-0711-488f-9bb1-edee8e92e527","Type":"ContainerDied","Data":"94fda7de3823f725a433b7648fc555ae46934b8733034688c13fdfe015b84798"} Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.524497 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.559683 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08ca7802-fc81-4258-bfef-c598c9d65b2f-kube-api-access-dxlfp" (OuterVolumeSpecName: "kube-api-access-dxlfp") pod "08ca7802-fc81-4258-bfef-c598c9d65b2f" (UID: "08ca7802-fc81-4258-bfef-c598c9d65b2f"). InnerVolumeSpecName "kube-api-access-dxlfp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.560361 4715 scope.go:117] "RemoveContainer" containerID="a38b8e8bdafd561f301d7de3849df03039ec334d2e9911426537e38b58e93a3e" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.561665 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fa8c6cfe-dae5-455a-a3f4-83608f3064b5","Type":"ContainerDied","Data":"baf39a9f3334dc00fb6af8d58712a5b108840a8b359d9a6ba9ceedba94d50dba"} Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.561770 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.565163 4715 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08ca7802-fc81-4258-bfef-c598c9d65b2f-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.565192 4715 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa8c6cfe-dae5-455a-a3f4-83608f3064b5-logs\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.565204 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxlfp\" (UniqueName: \"kubernetes.io/projected/08ca7802-fc81-4258-bfef-c598c9d65b2f-kube-api-access-dxlfp\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.565213 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b92vw\" (UniqueName: \"kubernetes.io/projected/fb56b8fd-4a84-44aa-a3c9-80aefa10784e-kube-api-access-b92vw\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.565221 4715 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/08ca7802-fc81-4258-bfef-c598c9d65b2f-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.565229 4715 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa8c6cfe-dae5-455a-a3f4-83608f3064b5-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.565238 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qv6dv\" (UniqueName: \"kubernetes.io/projected/fa8c6cfe-dae5-455a-a3f4-83608f3064b5-kube-api-access-qv6dv\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.565246 4715 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fa8c6cfe-dae5-455a-a3f4-83608f3064b5-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.565254 4715 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/08ca7802-fc81-4258-bfef-c598c9d65b2f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.565277 4715 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.590554 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5d9885b95b-r2cb2"] Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.605637 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa8c6cfe-dae5-455a-a3f4-83608f3064b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fa8c6cfe-dae5-455a-a3f4-83608f3064b5" (UID: "fa8c6cfe-dae5-455a-a3f4-83608f3064b5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.661497 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5d9885b95b-r2cb2"] Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.666656 4715 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa8c6cfe-dae5-455a-a3f4-83608f3064b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.672495 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.673319 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb56b8fd-4a84-44aa-a3c9-80aefa10784e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fb56b8fd-4a84-44aa-a3c9-80aefa10784e" (UID: "fb56b8fd-4a84-44aa-a3c9-80aefa10784e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.676025 4715 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.686525 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.688678 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa8c6cfe-dae5-455a-a3f4-83608f3064b5-config-data" (OuterVolumeSpecName: "config-data") pod "fa8c6cfe-dae5-455a-a3f4-83608f3064b5" (UID: "fa8c6cfe-dae5-455a-a3f4-83608f3064b5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.689495 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa8c6cfe-dae5-455a-a3f4-83608f3064b5-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fa8c6cfe-dae5-455a-a3f4-83608f3064b5" (UID: "fa8c6cfe-dae5-455a-a3f4-83608f3064b5"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.695464 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:04:36 crc kubenswrapper[4715]: E1009 08:04:36.695972 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa8c6cfe-dae5-455a-a3f4-83608f3064b5" containerName="glance-log" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.696035 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa8c6cfe-dae5-455a-a3f4-83608f3064b5" containerName="glance-log" Oct 09 08:04:36 crc kubenswrapper[4715]: E1009 08:04:36.696103 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08ca7802-fc81-4258-bfef-c598c9d65b2f" containerName="cinder-scheduler" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.696158 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="08ca7802-fc81-4258-bfef-c598c9d65b2f" containerName="cinder-scheduler" Oct 09 08:04:36 crc kubenswrapper[4715]: E1009 08:04:36.696209 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9c0be8b-0711-488f-9bb1-edee8e92e527" containerName="ceilometer-central-agent" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.696261 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9c0be8b-0711-488f-9bb1-edee8e92e527" containerName="ceilometer-central-agent" Oct 09 08:04:36 crc kubenswrapper[4715]: E1009 08:04:36.696311 4715 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fa8c6cfe-dae5-455a-a3f4-83608f3064b5" containerName="glance-httpd" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.696363 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa8c6cfe-dae5-455a-a3f4-83608f3064b5" containerName="glance-httpd" Oct 09 08:04:36 crc kubenswrapper[4715]: E1009 08:04:36.696430 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9c0be8b-0711-488f-9bb1-edee8e92e527" containerName="ceilometer-notification-agent" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.696481 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9c0be8b-0711-488f-9bb1-edee8e92e527" containerName="ceilometer-notification-agent" Oct 09 08:04:36 crc kubenswrapper[4715]: E1009 08:04:36.696529 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08ca7802-fc81-4258-bfef-c598c9d65b2f" containerName="probe" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.696583 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="08ca7802-fc81-4258-bfef-c598c9d65b2f" containerName="probe" Oct 09 08:04:36 crc kubenswrapper[4715]: E1009 08:04:36.696633 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb56b8fd-4a84-44aa-a3c9-80aefa10784e" containerName="dnsmasq-dns" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.696679 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb56b8fd-4a84-44aa-a3c9-80aefa10784e" containerName="dnsmasq-dns" Oct 09 08:04:36 crc kubenswrapper[4715]: E1009 08:04:36.696733 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9c0be8b-0711-488f-9bb1-edee8e92e527" containerName="sg-core" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.696786 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9c0be8b-0711-488f-9bb1-edee8e92e527" containerName="sg-core" Oct 09 08:04:36 crc kubenswrapper[4715]: E1009 08:04:36.696836 4715 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f9c0be8b-0711-488f-9bb1-edee8e92e527" containerName="proxy-httpd" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.696887 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9c0be8b-0711-488f-9bb1-edee8e92e527" containerName="proxy-httpd" Oct 09 08:04:36 crc kubenswrapper[4715]: E1009 08:04:36.696945 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ada9982a-fc5f-4c93-bfa3-3401c0824c2e" containerName="horizon" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.696994 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="ada9982a-fc5f-4c93-bfa3-3401c0824c2e" containerName="horizon" Oct 09 08:04:36 crc kubenswrapper[4715]: E1009 08:04:36.697044 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb56b8fd-4a84-44aa-a3c9-80aefa10784e" containerName="init" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.697090 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb56b8fd-4a84-44aa-a3c9-80aefa10784e" containerName="init" Oct 09 08:04:36 crc kubenswrapper[4715]: E1009 08:04:36.697136 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ada9982a-fc5f-4c93-bfa3-3401c0824c2e" containerName="horizon-log" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.697182 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="ada9982a-fc5f-4c93-bfa3-3401c0824c2e" containerName="horizon-log" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.697390 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa8c6cfe-dae5-455a-a3f4-83608f3064b5" containerName="glance-httpd" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.701689 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb56b8fd-4a84-44aa-a3c9-80aefa10784e" containerName="dnsmasq-dns" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.701789 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="08ca7802-fc81-4258-bfef-c598c9d65b2f" containerName="probe" 
Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.701868 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9c0be8b-0711-488f-9bb1-edee8e92e527" containerName="proxy-httpd" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.701946 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="ada9982a-fc5f-4c93-bfa3-3401c0824c2e" containerName="horizon-log" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.702001 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9c0be8b-0711-488f-9bb1-edee8e92e527" containerName="sg-core" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.702058 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="ada9982a-fc5f-4c93-bfa3-3401c0824c2e" containerName="horizon" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.702110 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa8c6cfe-dae5-455a-a3f4-83608f3064b5" containerName="glance-log" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.702180 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9c0be8b-0711-488f-9bb1-edee8e92e527" containerName="ceilometer-central-agent" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.702252 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="08ca7802-fc81-4258-bfef-c598c9d65b2f" containerName="cinder-scheduler" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.702307 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9c0be8b-0711-488f-9bb1-edee8e92e527" containerName="ceilometer-notification-agent" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.704088 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.710072 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.725503 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.741794 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.746271 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb56b8fd-4a84-44aa-a3c9-80aefa10784e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fb56b8fd-4a84-44aa-a3c9-80aefa10784e" (UID: "fb56b8fd-4a84-44aa-a3c9-80aefa10784e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.755368 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb56b8fd-4a84-44aa-a3c9-80aefa10784e-config" (OuterVolumeSpecName: "config") pod "fb56b8fd-4a84-44aa-a3c9-80aefa10784e" (UID: "fb56b8fd-4a84-44aa-a3c9-80aefa10784e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.774003 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cf7cb12-56ff-4d2c-a1a3-eda246e226d7-run-httpd\") pod \"ceilometer-0\" (UID: \"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7\") " pod="openstack/ceilometer-0" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.774054 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwbt5\" (UniqueName: \"kubernetes.io/projected/9cf7cb12-56ff-4d2c-a1a3-eda246e226d7-kube-api-access-cwbt5\") pod \"ceilometer-0\" (UID: \"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7\") " pod="openstack/ceilometer-0" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.774089 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cf7cb12-56ff-4d2c-a1a3-eda246e226d7-scripts\") pod \"ceilometer-0\" (UID: \"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7\") " pod="openstack/ceilometer-0" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.774193 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cf7cb12-56ff-4d2c-a1a3-eda246e226d7-log-httpd\") pod \"ceilometer-0\" (UID: \"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7\") " pod="openstack/ceilometer-0" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.774213 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cf7cb12-56ff-4d2c-a1a3-eda246e226d7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7\") " pod="openstack/ceilometer-0" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.774316 4715 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cf7cb12-56ff-4d2c-a1a3-eda246e226d7-config-data\") pod \"ceilometer-0\" (UID: \"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7\") " pod="openstack/ceilometer-0" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.774371 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9cf7cb12-56ff-4d2c-a1a3-eda246e226d7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7\") " pod="openstack/ceilometer-0" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.774459 4715 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb56b8fd-4a84-44aa-a3c9-80aefa10784e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.774478 4715 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb56b8fd-4a84-44aa-a3c9-80aefa10784e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.774490 4715 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa8c6cfe-dae5-455a-a3f4-83608f3064b5-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.774523 4715 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa8c6cfe-dae5-455a-a3f4-83608f3064b5-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.774533 4715 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:36 crc 
kubenswrapper[4715]: I1009 08:04:36.774543 4715 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb56b8fd-4a84-44aa-a3c9-80aefa10784e-config\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.813923 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08ca7802-fc81-4258-bfef-c598c9d65b2f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08ca7802-fc81-4258-bfef-c598c9d65b2f" (UID: "08ca7802-fc81-4258-bfef-c598c9d65b2f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.830215 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-vtjlz"] Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.843065 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb56b8fd-4a84-44aa-a3c9-80aefa10784e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fb56b8fd-4a84-44aa-a3c9-80aefa10784e" (UID: "fb56b8fd-4a84-44aa-a3c9-80aefa10784e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.854904 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-28p9v"] Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.856951 4715 scope.go:117] "RemoveContainer" containerID="30df9f4b9017305ddf0ba538bc4401ae44d2c84af182eb159fea1745dd9773d7" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.862430 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb56b8fd-4a84-44aa-a3c9-80aefa10784e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fb56b8fd-4a84-44aa-a3c9-80aefa10784e" (UID: "fb56b8fd-4a84-44aa-a3c9-80aefa10784e"). 
InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.863976 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-kp8jh"] Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.881029 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cf7cb12-56ff-4d2c-a1a3-eda246e226d7-log-httpd\") pod \"ceilometer-0\" (UID: \"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7\") " pod="openstack/ceilometer-0" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.881080 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cf7cb12-56ff-4d2c-a1a3-eda246e226d7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7\") " pod="openstack/ceilometer-0" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.881207 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cf7cb12-56ff-4d2c-a1a3-eda246e226d7-config-data\") pod \"ceilometer-0\" (UID: \"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7\") " pod="openstack/ceilometer-0" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.881269 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9cf7cb12-56ff-4d2c-a1a3-eda246e226d7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7\") " pod="openstack/ceilometer-0" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.881321 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cf7cb12-56ff-4d2c-a1a3-eda246e226d7-run-httpd\") pod \"ceilometer-0\" (UID: \"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7\") " 
pod="openstack/ceilometer-0" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.881339 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwbt5\" (UniqueName: \"kubernetes.io/projected/9cf7cb12-56ff-4d2c-a1a3-eda246e226d7-kube-api-access-cwbt5\") pod \"ceilometer-0\" (UID: \"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7\") " pod="openstack/ceilometer-0" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.881871 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cf7cb12-56ff-4d2c-a1a3-eda246e226d7-scripts\") pod \"ceilometer-0\" (UID: \"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7\") " pod="openstack/ceilometer-0" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.882006 4715 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08ca7802-fc81-4258-bfef-c598c9d65b2f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.882019 4715 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb56b8fd-4a84-44aa-a3c9-80aefa10784e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.882030 4715 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fb56b8fd-4a84-44aa-a3c9-80aefa10784e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.884296 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cf7cb12-56ff-4d2c-a1a3-eda246e226d7-run-httpd\") pod \"ceilometer-0\" (UID: \"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7\") " pod="openstack/ceilometer-0" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.884452 4715 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cf7cb12-56ff-4d2c-a1a3-eda246e226d7-log-httpd\") pod \"ceilometer-0\" (UID: \"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7\") " pod="openstack/ceilometer-0" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.888432 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cf7cb12-56ff-4d2c-a1a3-eda246e226d7-scripts\") pod \"ceilometer-0\" (UID: \"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7\") " pod="openstack/ceilometer-0" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.889208 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cf7cb12-56ff-4d2c-a1a3-eda246e226d7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7\") " pod="openstack/ceilometer-0" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.901611 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cf7cb12-56ff-4d2c-a1a3-eda246e226d7-config-data\") pod \"ceilometer-0\" (UID: \"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7\") " pod="openstack/ceilometer-0" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.901950 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9cf7cb12-56ff-4d2c-a1a3-eda246e226d7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7\") " pod="openstack/ceilometer-0" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.914279 4715 scope.go:117] "RemoveContainer" containerID="de1bfe0badc20cbbb61faee7548dfd783c5d13dd7e2bbc2c662b7623ba5abc0f" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.915507 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/08ca7802-fc81-4258-bfef-c598c9d65b2f-config-data" (OuterVolumeSpecName: "config-data") pod "08ca7802-fc81-4258-bfef-c598c9d65b2f" (UID: "08ca7802-fc81-4258-bfef-c598c9d65b2f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.924724 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwbt5\" (UniqueName: \"kubernetes.io/projected/9cf7cb12-56ff-4d2c-a1a3-eda246e226d7-kube-api-access-cwbt5\") pod \"ceilometer-0\" (UID: \"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7\") " pod="openstack/ceilometer-0" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.942591 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.963747 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.975257 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.983035 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b155af9d-8b2d-4f91-8f9e-70f77dbb84f1-logs\") pod \"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1\" (UID: \"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1\") " Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.983181 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b155af9d-8b2d-4f91-8f9e-70f77dbb84f1-scripts\") pod \"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1\" (UID: \"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1\") " Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.983219 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b155af9d-8b2d-4f91-8f9e-70f77dbb84f1-httpd-run\") pod \"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1\" (UID: \"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1\") " Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.983256 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b155af9d-8b2d-4f91-8f9e-70f77dbb84f1-config-data\") pod \"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1\" (UID: \"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1\") " Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.983373 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b155af9d-8b2d-4f91-8f9e-70f77dbb84f1-public-tls-certs\") pod \"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1\" (UID: \"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1\") " Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.983408 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7q2hl\" (UniqueName: \"kubernetes.io/projected/b155af9d-8b2d-4f91-8f9e-70f77dbb84f1-kube-api-access-7q2hl\") pod \"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1\" (UID: \"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1\") " Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.983452 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b155af9d-8b2d-4f91-8f9e-70f77dbb84f1-combined-ca-bundle\") pod \"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1\" (UID: \"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1\") " Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.983472 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1\" (UID: \"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1\") " Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 
08:04:36.983843 4715 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08ca7802-fc81-4258-bfef-c598c9d65b2f-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.987975 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "b155af9d-8b2d-4f91-8f9e-70f77dbb84f1" (UID: "b155af9d-8b2d-4f91-8f9e-70f77dbb84f1"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.988390 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b155af9d-8b2d-4f91-8f9e-70f77dbb84f1-logs" (OuterVolumeSpecName: "logs") pod "b155af9d-8b2d-4f91-8f9e-70f77dbb84f1" (UID: "b155af9d-8b2d-4f91-8f9e-70f77dbb84f1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.990596 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 08:04:36 crc kubenswrapper[4715]: E1009 08:04:36.991036 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b155af9d-8b2d-4f91-8f9e-70f77dbb84f1" containerName="glance-log" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.991053 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="b155af9d-8b2d-4f91-8f9e-70f77dbb84f1" containerName="glance-log" Oct 09 08:04:36 crc kubenswrapper[4715]: E1009 08:04:36.991078 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b155af9d-8b2d-4f91-8f9e-70f77dbb84f1" containerName="glance-httpd" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.991074 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b155af9d-8b2d-4f91-8f9e-70f77dbb84f1-scripts" 
(OuterVolumeSpecName: "scripts") pod "b155af9d-8b2d-4f91-8f9e-70f77dbb84f1" (UID: "b155af9d-8b2d-4f91-8f9e-70f77dbb84f1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.991086 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="b155af9d-8b2d-4f91-8f9e-70f77dbb84f1" containerName="glance-httpd" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.992412 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="b155af9d-8b2d-4f91-8f9e-70f77dbb84f1" containerName="glance-log" Oct 09 08:04:36 crc kubenswrapper[4715]: I1009 08:04:36.992483 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="b155af9d-8b2d-4f91-8f9e-70f77dbb84f1" containerName="glance-httpd" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.025405 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.032984 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.033265 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.058008 4715 scope.go:117] "RemoveContainer" containerID="89f7aa440f14f7dc7754a191a9b2beb81ba3c138a9fed3b12dd0b41025f51973" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.064978 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b155af9d-8b2d-4f91-8f9e-70f77dbb84f1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b155af9d-8b2d-4f91-8f9e-70f77dbb84f1" (UID: "b155af9d-8b2d-4f91-8f9e-70f77dbb84f1"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.097318 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.134935 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b155af9d-8b2d-4f91-8f9e-70f77dbb84f1-kube-api-access-7q2hl" (OuterVolumeSpecName: "kube-api-access-7q2hl") pod "b155af9d-8b2d-4f91-8f9e-70f77dbb84f1" (UID: "b155af9d-8b2d-4f91-8f9e-70f77dbb84f1"). InnerVolumeSpecName "kube-api-access-7q2hl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.154187 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.178109 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"8be2de3b-f820-4674-992a-9bf1a1735d6b\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.180127 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8be2de3b-f820-4674-992a-9bf1a1735d6b-logs\") pod \"glance-default-internal-api-0\" (UID: \"8be2de3b-f820-4674-992a-9bf1a1735d6b\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.180617 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8be2de3b-f820-4674-992a-9bf1a1735d6b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8be2de3b-f820-4674-992a-9bf1a1735d6b\") " 
pod="openstack/glance-default-internal-api-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.180965 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7q2hl\" (UniqueName: \"kubernetes.io/projected/b155af9d-8b2d-4f91-8f9e-70f77dbb84f1-kube-api-access-7q2hl\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.181035 4715 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.181047 4715 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b155af9d-8b2d-4f91-8f9e-70f77dbb84f1-logs\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.181056 4715 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b155af9d-8b2d-4f91-8f9e-70f77dbb84f1-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.183532 4715 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b155af9d-8b2d-4f91-8f9e-70f77dbb84f1-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.210091 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b155af9d-8b2d-4f91-8f9e-70f77dbb84f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b155af9d-8b2d-4f91-8f9e-70f77dbb84f1" (UID: "b155af9d-8b2d-4f91-8f9e-70f77dbb84f1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.246576 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b155af9d-8b2d-4f91-8f9e-70f77dbb84f1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b155af9d-8b2d-4f91-8f9e-70f77dbb84f1" (UID: "b155af9d-8b2d-4f91-8f9e-70f77dbb84f1"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.252385 4715 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.287585 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b155af9d-8b2d-4f91-8f9e-70f77dbb84f1-config-data" (OuterVolumeSpecName: "config-data") pod "b155af9d-8b2d-4f91-8f9e-70f77dbb84f1" (UID: "b155af9d-8b2d-4f91-8f9e-70f77dbb84f1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.289104 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8be2de3b-f820-4674-992a-9bf1a1735d6b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8be2de3b-f820-4674-992a-9bf1a1735d6b\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.289137 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8be2de3b-f820-4674-992a-9bf1a1735d6b-logs\") pod \"glance-default-internal-api-0\" (UID: \"8be2de3b-f820-4674-992a-9bf1a1735d6b\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.289164 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8be2de3b-f820-4674-992a-9bf1a1735d6b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8be2de3b-f820-4674-992a-9bf1a1735d6b\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.289196 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8be2de3b-f820-4674-992a-9bf1a1735d6b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8be2de3b-f820-4674-992a-9bf1a1735d6b\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.289248 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"8be2de3b-f820-4674-992a-9bf1a1735d6b\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:04:37 
crc kubenswrapper[4715]: I1009 08:04:37.289272 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6x5v\" (UniqueName: \"kubernetes.io/projected/8be2de3b-f820-4674-992a-9bf1a1735d6b-kube-api-access-j6x5v\") pod \"glance-default-internal-api-0\" (UID: \"8be2de3b-f820-4674-992a-9bf1a1735d6b\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.289312 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8be2de3b-f820-4674-992a-9bf1a1735d6b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8be2de3b-f820-4674-992a-9bf1a1735d6b\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.289344 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8be2de3b-f820-4674-992a-9bf1a1735d6b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8be2de3b-f820-4674-992a-9bf1a1735d6b\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.289411 4715 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b155af9d-8b2d-4f91-8f9e-70f77dbb84f1-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.292476 4715 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b155af9d-8b2d-4f91-8f9e-70f77dbb84f1-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.292505 4715 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b155af9d-8b2d-4f91-8f9e-70f77dbb84f1-combined-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.292518 4715 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.292689 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8be2de3b-f820-4674-992a-9bf1a1735d6b-logs\") pod \"glance-default-internal-api-0\" (UID: \"8be2de3b-f820-4674-992a-9bf1a1735d6b\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.293386 4715 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"8be2de3b-f820-4674-992a-9bf1a1735d6b\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Oct 09 08:04:37 crc kubenswrapper[4715]: E1009 08:04:37.308876 4715 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa8c6cfe_dae5_455a_a3f4_83608f3064b5.slice/crio-baf39a9f3334dc00fb6af8d58712a5b108840a8b359d9a6ba9ceedba94d50dba\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa8c6cfe_dae5_455a_a3f4_83608f3064b5.slice\": RecentStats: unable to find data in memory cache]" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.313078 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8be2de3b-f820-4674-992a-9bf1a1735d6b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8be2de3b-f820-4674-992a-9bf1a1735d6b\") " pod="openstack/glance-default-internal-api-0" Oct 09 
08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.323858 4715 scope.go:117] "RemoveContainer" containerID="e0875b29b7c74d81e763ff018a1895e8fd2652900b8e9a903413e7c40d8540b5" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.376253 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"8be2de3b-f820-4674-992a-9bf1a1735d6b\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.393768 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8be2de3b-f820-4674-992a-9bf1a1735d6b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8be2de3b-f820-4674-992a-9bf1a1735d6b\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.393959 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8be2de3b-f820-4674-992a-9bf1a1735d6b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8be2de3b-f820-4674-992a-9bf1a1735d6b\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.394060 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6x5v\" (UniqueName: \"kubernetes.io/projected/8be2de3b-f820-4674-992a-9bf1a1735d6b-kube-api-access-j6x5v\") pod \"glance-default-internal-api-0\" (UID: \"8be2de3b-f820-4674-992a-9bf1a1735d6b\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.394161 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8be2de3b-f820-4674-992a-9bf1a1735d6b-combined-ca-bundle\") pod 
\"glance-default-internal-api-0\" (UID: \"8be2de3b-f820-4674-992a-9bf1a1735d6b\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.394246 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8be2de3b-f820-4674-992a-9bf1a1735d6b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8be2de3b-f820-4674-992a-9bf1a1735d6b\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.394702 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8be2de3b-f820-4674-992a-9bf1a1735d6b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8be2de3b-f820-4674-992a-9bf1a1735d6b\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.401609 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8be2de3b-f820-4674-992a-9bf1a1735d6b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8be2de3b-f820-4674-992a-9bf1a1735d6b\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.415475 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8be2de3b-f820-4674-992a-9bf1a1735d6b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8be2de3b-f820-4674-992a-9bf1a1735d6b\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.417587 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8be2de3b-f820-4674-992a-9bf1a1735d6b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8be2de3b-f820-4674-992a-9bf1a1735d6b\") " 
pod="openstack/glance-default-internal-api-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.435393 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6x5v\" (UniqueName: \"kubernetes.io/projected/8be2de3b-f820-4674-992a-9bf1a1735d6b-kube-api-access-j6x5v\") pod \"glance-default-internal-api-0\" (UID: \"8be2de3b-f820-4674-992a-9bf1a1735d6b\") " pod="openstack/glance-default-internal-api-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.460653 4715 scope.go:117] "RemoveContainer" containerID="a412f5011889d1602ee9c6e8425ae5cfc6ec79df9d2ceeb5e45194df1b5ca559" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.462632 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.491800 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.549731 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.568592 4715 scope.go:117] "RemoveContainer" containerID="179f31ef5d83b939a24a6248b3f608ec2de7bb69229edbfd596d0fe2ec54c2e1" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.570472 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.572052 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.600487 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-7fpxg"] Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.611221 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.650146 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-7fpxg"] Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.658524 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.662607 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b155af9d-8b2d-4f91-8f9e-70f77dbb84f1","Type":"ContainerDied","Data":"cc8a7e84d62e1dce8a72340a833b10ee0a2911df43f37ffeef86f9b28eb53970"} Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.662712 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.683856 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.687611 4715 scope.go:117] "RemoveContainer" containerID="06e3a016ccf1feffe4e8bbc3185ff5bb6f65ac453d4e58b04b827106491f1919" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.702629 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48296f2f-dddc-4549-9f78-640128d54d46-config-data\") pod \"cinder-scheduler-0\" (UID: \"48296f2f-dddc-4549-9f78-640128d54d46\") " pod="openstack/cinder-scheduler-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.702660 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/48296f2f-dddc-4549-9f78-640128d54d46-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"48296f2f-dddc-4549-9f78-640128d54d46\") " pod="openstack/cinder-scheduler-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.702684 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48296f2f-dddc-4549-9f78-640128d54d46-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"48296f2f-dddc-4549-9f78-640128d54d46\") " pod="openstack/cinder-scheduler-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.702711 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qb2k\" (UniqueName: \"kubernetes.io/projected/48296f2f-dddc-4549-9f78-640128d54d46-kube-api-access-4qb2k\") pod \"cinder-scheduler-0\" (UID: \"48296f2f-dddc-4549-9f78-640128d54d46\") " pod="openstack/cinder-scheduler-0" Oct 09 08:04:37 crc 
kubenswrapper[4715]: I1009 08:04:37.702777 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48296f2f-dddc-4549-9f78-640128d54d46-scripts\") pod \"cinder-scheduler-0\" (UID: \"48296f2f-dddc-4549-9f78-640128d54d46\") " pod="openstack/cinder-scheduler-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.702926 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48296f2f-dddc-4549-9f78-640128d54d46-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"48296f2f-dddc-4549-9f78-640128d54d46\") " pod="openstack/cinder-scheduler-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.708051 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-28p9v" event={"ID":"80579efc-c70d-41a5-8a56-922ca09a8bd4","Type":"ContainerStarted","Data":"6c13773eb1603b00e356bc4b351cb7158f13e8ad40122ff4791df7246c6677bc"} Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.744378 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vtjlz" event={"ID":"12f445bb-022c-4dd9-8f91-e9612f526a12","Type":"ContainerStarted","Data":"c44b6aff5c37503f3b0e59779c74e5ce6e170344b885b2f26a38fa9ef3ff5ee6"} Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.749937 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.770797 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.771300 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-kp8jh" 
event={"ID":"544d96bc-6a19-46c5-8162-64a99e333681","Type":"ContainerStarted","Data":"5b691cf1fa2ca6024711f4cf72b4e9a2f256d373b2b0ff230307589ebea55fac"} Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.779099 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.781283 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.791025 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.791209 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.804561 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qb2k\" (UniqueName: \"kubernetes.io/projected/48296f2f-dddc-4549-9f78-640128d54d46-kube-api-access-4qb2k\") pod \"cinder-scheduler-0\" (UID: \"48296f2f-dddc-4549-9f78-640128d54d46\") " pod="openstack/cinder-scheduler-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.804659 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48296f2f-dddc-4549-9f78-640128d54d46-scripts\") pod \"cinder-scheduler-0\" (UID: \"48296f2f-dddc-4549-9f78-640128d54d46\") " pod="openstack/cinder-scheduler-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.804690 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48296f2f-dddc-4549-9f78-640128d54d46-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"48296f2f-dddc-4549-9f78-640128d54d46\") " pod="openstack/cinder-scheduler-0" Oct 09 08:04:37 crc 
kubenswrapper[4715]: I1009 08:04:37.804826 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48296f2f-dddc-4549-9f78-640128d54d46-config-data\") pod \"cinder-scheduler-0\" (UID: \"48296f2f-dddc-4549-9f78-640128d54d46\") " pod="openstack/cinder-scheduler-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.804907 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/48296f2f-dddc-4549-9f78-640128d54d46-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"48296f2f-dddc-4549-9f78-640128d54d46\") " pod="openstack/cinder-scheduler-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.804930 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48296f2f-dddc-4549-9f78-640128d54d46-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"48296f2f-dddc-4549-9f78-640128d54d46\") " pod="openstack/cinder-scheduler-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.805920 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/48296f2f-dddc-4549-9f78-640128d54d46-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"48296f2f-dddc-4549-9f78-640128d54d46\") " pod="openstack/cinder-scheduler-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.812839 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48296f2f-dddc-4549-9f78-640128d54d46-config-data\") pod \"cinder-scheduler-0\" (UID: \"48296f2f-dddc-4549-9f78-640128d54d46\") " pod="openstack/cinder-scheduler-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.812887 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 09 08:04:37 crc 
kubenswrapper[4715]: I1009 08:04:37.814183 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48296f2f-dddc-4549-9f78-640128d54d46-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"48296f2f-dddc-4549-9f78-640128d54d46\") " pod="openstack/cinder-scheduler-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.815006 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48296f2f-dddc-4549-9f78-640128d54d46-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"48296f2f-dddc-4549-9f78-640128d54d46\") " pod="openstack/cinder-scheduler-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.820698 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48296f2f-dddc-4549-9f78-640128d54d46-scripts\") pod \"cinder-scheduler-0\" (UID: \"48296f2f-dddc-4549-9f78-640128d54d46\") " pod="openstack/cinder-scheduler-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.820968 4715 scope.go:117] "RemoveContainer" containerID="1abb0d014478a89f86f7edba03186c78afcc89f7c2def49c541347d933b384c9" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.833102 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qb2k\" (UniqueName: \"kubernetes.io/projected/48296f2f-dddc-4549-9f78-640128d54d46-kube-api-access-4qb2k\") pod \"cinder-scheduler-0\" (UID: \"48296f2f-dddc-4549-9f78-640128d54d46\") " pod="openstack/cinder-scheduler-0" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.870607 4715 scope.go:117] "RemoveContainer" containerID="29c69c6bfea7f324ed2cb89b3d6fdc974a76c48f88d4ce8852af81d57091e21f" Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.899629 4715 scope.go:117] "RemoveContainer" containerID="f66ee5e8144456200dbf81ee9e79d51c129e85e278fa1cd8441aebdf53debc5b" Oct 09 08:04:37 crc 
kubenswrapper[4715]: I1009 08:04:37.907541 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/351123b4-0e0c-413a-bc50-56f397c1b592-scripts\") pod \"glance-default-external-api-0\" (UID: \"351123b4-0e0c-413a-bc50-56f397c1b592\") " pod="openstack/glance-default-external-api-0"
Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.907618 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/351123b4-0e0c-413a-bc50-56f397c1b592-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"351123b4-0e0c-413a-bc50-56f397c1b592\") " pod="openstack/glance-default-external-api-0"
Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.907655 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfv5f\" (UniqueName: \"kubernetes.io/projected/351123b4-0e0c-413a-bc50-56f397c1b592-kube-api-access-cfv5f\") pod \"glance-default-external-api-0\" (UID: \"351123b4-0e0c-413a-bc50-56f397c1b592\") " pod="openstack/glance-default-external-api-0"
Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.907706 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/351123b4-0e0c-413a-bc50-56f397c1b592-logs\") pod \"glance-default-external-api-0\" (UID: \"351123b4-0e0c-413a-bc50-56f397c1b592\") " pod="openstack/glance-default-external-api-0"
Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.907741 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"351123b4-0e0c-413a-bc50-56f397c1b592\") " pod="openstack/glance-default-external-api-0"
Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.907772 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/351123b4-0e0c-413a-bc50-56f397c1b592-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"351123b4-0e0c-413a-bc50-56f397c1b592\") " pod="openstack/glance-default-external-api-0"
Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.907850 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/351123b4-0e0c-413a-bc50-56f397c1b592-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"351123b4-0e0c-413a-bc50-56f397c1b592\") " pod="openstack/glance-default-external-api-0"
Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.907917 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/351123b4-0e0c-413a-bc50-56f397c1b592-config-data\") pod \"glance-default-external-api-0\" (UID: \"351123b4-0e0c-413a-bc50-56f397c1b592\") " pod="openstack/glance-default-external-api-0"
Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.931147 4715 scope.go:117] "RemoveContainer" containerID="2e03df95b6bb57fb4fb7f4d90cc8e45fe2c8a0dcc0049252dcba3306899bb1cf"
Oct 09 08:04:37 crc kubenswrapper[4715]: I1009 08:04:37.936943 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 09 08:04:38 crc kubenswrapper[4715]: I1009 08:04:38.009474 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/351123b4-0e0c-413a-bc50-56f397c1b592-scripts\") pod \"glance-default-external-api-0\" (UID: \"351123b4-0e0c-413a-bc50-56f397c1b592\") " pod="openstack/glance-default-external-api-0"
Oct 09 08:04:38 crc kubenswrapper[4715]: I1009 08:04:38.009778 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/351123b4-0e0c-413a-bc50-56f397c1b592-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"351123b4-0e0c-413a-bc50-56f397c1b592\") " pod="openstack/glance-default-external-api-0"
Oct 09 08:04:38 crc kubenswrapper[4715]: I1009 08:04:38.009802 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfv5f\" (UniqueName: \"kubernetes.io/projected/351123b4-0e0c-413a-bc50-56f397c1b592-kube-api-access-cfv5f\") pod \"glance-default-external-api-0\" (UID: \"351123b4-0e0c-413a-bc50-56f397c1b592\") " pod="openstack/glance-default-external-api-0"
Oct 09 08:04:38 crc kubenswrapper[4715]: I1009 08:04:38.009871 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/351123b4-0e0c-413a-bc50-56f397c1b592-logs\") pod \"glance-default-external-api-0\" (UID: \"351123b4-0e0c-413a-bc50-56f397c1b592\") " pod="openstack/glance-default-external-api-0"
Oct 09 08:04:38 crc kubenswrapper[4715]: I1009 08:04:38.009899 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"351123b4-0e0c-413a-bc50-56f397c1b592\") " pod="openstack/glance-default-external-api-0"
Oct 09 08:04:38 crc kubenswrapper[4715]: I1009 08:04:38.009922 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/351123b4-0e0c-413a-bc50-56f397c1b592-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"351123b4-0e0c-413a-bc50-56f397c1b592\") " pod="openstack/glance-default-external-api-0"
Oct 09 08:04:38 crc kubenswrapper[4715]: I1009 08:04:38.009961 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/351123b4-0e0c-413a-bc50-56f397c1b592-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"351123b4-0e0c-413a-bc50-56f397c1b592\") " pod="openstack/glance-default-external-api-0"
Oct 09 08:04:38 crc kubenswrapper[4715]: I1009 08:04:38.010004 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/351123b4-0e0c-413a-bc50-56f397c1b592-config-data\") pod \"glance-default-external-api-0\" (UID: \"351123b4-0e0c-413a-bc50-56f397c1b592\") " pod="openstack/glance-default-external-api-0"
Oct 09 08:04:38 crc kubenswrapper[4715]: I1009 08:04:38.011930 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/351123b4-0e0c-413a-bc50-56f397c1b592-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"351123b4-0e0c-413a-bc50-56f397c1b592\") " pod="openstack/glance-default-external-api-0"
Oct 09 08:04:38 crc kubenswrapper[4715]: I1009 08:04:38.011966 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/351123b4-0e0c-413a-bc50-56f397c1b592-logs\") pod \"glance-default-external-api-0\" (UID: \"351123b4-0e0c-413a-bc50-56f397c1b592\") " pod="openstack/glance-default-external-api-0"
Oct 09 08:04:38 crc kubenswrapper[4715]: I1009 08:04:38.011453 4715 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"351123b4-0e0c-413a-bc50-56f397c1b592\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0"
Oct 09 08:04:38 crc kubenswrapper[4715]: I1009 08:04:38.017001 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/351123b4-0e0c-413a-bc50-56f397c1b592-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"351123b4-0e0c-413a-bc50-56f397c1b592\") " pod="openstack/glance-default-external-api-0"
Oct 09 08:04:38 crc kubenswrapper[4715]: I1009 08:04:38.018002 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/351123b4-0e0c-413a-bc50-56f397c1b592-scripts\") pod \"glance-default-external-api-0\" (UID: \"351123b4-0e0c-413a-bc50-56f397c1b592\") " pod="openstack/glance-default-external-api-0"
Oct 09 08:04:38 crc kubenswrapper[4715]: I1009 08:04:38.019246 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/351123b4-0e0c-413a-bc50-56f397c1b592-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"351123b4-0e0c-413a-bc50-56f397c1b592\") " pod="openstack/glance-default-external-api-0"
Oct 09 08:04:38 crc kubenswrapper[4715]: I1009 08:04:38.021456 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/351123b4-0e0c-413a-bc50-56f397c1b592-config-data\") pod \"glance-default-external-api-0\" (UID: \"351123b4-0e0c-413a-bc50-56f397c1b592\") " pod="openstack/glance-default-external-api-0"
Oct 09 08:04:38 crc kubenswrapper[4715]: I1009 08:04:38.040694 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfv5f\" (UniqueName: \"kubernetes.io/projected/351123b4-0e0c-413a-bc50-56f397c1b592-kube-api-access-cfv5f\") pod \"glance-default-external-api-0\" (UID: \"351123b4-0e0c-413a-bc50-56f397c1b592\") " pod="openstack/glance-default-external-api-0"
Oct 09 08:04:38 crc kubenswrapper[4715]: I1009 08:04:38.053374 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"351123b4-0e0c-413a-bc50-56f397c1b592\") " pod="openstack/glance-default-external-api-0"
Oct 09 08:04:38 crc kubenswrapper[4715]: I1009 08:04:38.126883 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 09 08:04:38 crc kubenswrapper[4715]: I1009 08:04:38.157467 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08ca7802-fc81-4258-bfef-c598c9d65b2f" path="/var/lib/kubelet/pods/08ca7802-fc81-4258-bfef-c598c9d65b2f/volumes"
Oct 09 08:04:38 crc kubenswrapper[4715]: I1009 08:04:38.158307 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ada9982a-fc5f-4c93-bfa3-3401c0824c2e" path="/var/lib/kubelet/pods/ada9982a-fc5f-4c93-bfa3-3401c0824c2e/volumes"
Oct 09 08:04:38 crc kubenswrapper[4715]: I1009 08:04:38.158977 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b155af9d-8b2d-4f91-8f9e-70f77dbb84f1" path="/var/lib/kubelet/pods/b155af9d-8b2d-4f91-8f9e-70f77dbb84f1/volumes"
Oct 09 08:04:38 crc kubenswrapper[4715]: I1009 08:04:38.162734 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9c0be8b-0711-488f-9bb1-edee8e92e527" path="/var/lib/kubelet/pods/f9c0be8b-0711-488f-9bb1-edee8e92e527/volumes"
Oct 09 08:04:38 crc kubenswrapper[4715]: I1009 08:04:38.164930 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa8c6cfe-dae5-455a-a3f4-83608f3064b5" path="/var/lib/kubelet/pods/fa8c6cfe-dae5-455a-a3f4-83608f3064b5/volumes"
Oct 09 08:04:38 crc kubenswrapper[4715]: I1009 08:04:38.165680 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb56b8fd-4a84-44aa-a3c9-80aefa10784e" path="/var/lib/kubelet/pods/fb56b8fd-4a84-44aa-a3c9-80aefa10784e/volumes"
Oct 09 08:04:38 crc kubenswrapper[4715]: I1009 08:04:38.323217 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 09 08:04:38 crc kubenswrapper[4715]: W1009 08:04:38.348789 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8be2de3b_f820_4674_992a_9bf1a1735d6b.slice/crio-9ba263394c45281a61cfdb28061ed789d978d61701ddb0bebd800f408d670a55 WatchSource:0}: Error finding container 9ba263394c45281a61cfdb28061ed789d978d61701ddb0bebd800f408d670a55: Status 404 returned error can't find the container with id 9ba263394c45281a61cfdb28061ed789d978d61701ddb0bebd800f408d670a55
Oct 09 08:04:38 crc kubenswrapper[4715]: I1009 08:04:38.534912 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 09 08:04:38 crc kubenswrapper[4715]: W1009 08:04:38.557830 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48296f2f_dddc_4549_9f78_640128d54d46.slice/crio-c94dd86d7454cbd9a265e33de566367877690a30046f0cb8191cabb42b526ebb WatchSource:0}: Error finding container c94dd86d7454cbd9a265e33de566367877690a30046f0cb8191cabb42b526ebb: Status 404 returned error can't find the container with id c94dd86d7454cbd9a265e33de566367877690a30046f0cb8191cabb42b526ebb
Oct 09 08:04:38 crc kubenswrapper[4715]: I1009 08:04:38.791874 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7","Type":"ContainerStarted","Data":"96ffd806f8eff352f1cecb41072f554abe29891e730e29f34d25f8cd29a39fa6"}
Oct 09 08:04:38 crc kubenswrapper[4715]: I1009 08:04:38.792122 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7","Type":"ContainerStarted","Data":"c9fde9fdd816b7f8353ca66dabff4c4e89c7a681f23c253ae6fb18700f488305"}
Oct 09 08:04:38 crc kubenswrapper[4715]: I1009 08:04:38.796845 4715 generic.go:334] "Generic (PLEG): container finished" podID="80579efc-c70d-41a5-8a56-922ca09a8bd4" containerID="d9e29a16fff26e5e088cbcc225e4d1032062bd08e096873954e1ba3dec79acd9" exitCode=0
Oct 09 08:04:38 crc kubenswrapper[4715]: I1009 08:04:38.796900 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-28p9v" event={"ID":"80579efc-c70d-41a5-8a56-922ca09a8bd4","Type":"ContainerDied","Data":"d9e29a16fff26e5e088cbcc225e4d1032062bd08e096873954e1ba3dec79acd9"}
Oct 09 08:04:38 crc kubenswrapper[4715]: I1009 08:04:38.823599 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"48296f2f-dddc-4549-9f78-640128d54d46","Type":"ContainerStarted","Data":"c94dd86d7454cbd9a265e33de566367877690a30046f0cb8191cabb42b526ebb"}
Oct 09 08:04:38 crc kubenswrapper[4715]: I1009 08:04:38.837805 4715 generic.go:334] "Generic (PLEG): container finished" podID="544d96bc-6a19-46c5-8162-64a99e333681" containerID="44bbd165b0581eccdab58e0d133dd9465cdd387dacb7d18d5d29576511c8478d" exitCode=0
Oct 09 08:04:38 crc kubenswrapper[4715]: I1009 08:04:38.837886 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-kp8jh" event={"ID":"544d96bc-6a19-46c5-8162-64a99e333681","Type":"ContainerDied","Data":"44bbd165b0581eccdab58e0d133dd9465cdd387dacb7d18d5d29576511c8478d"}
Oct 09 08:04:38 crc kubenswrapper[4715]: I1009 08:04:38.842243 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8be2de3b-f820-4674-992a-9bf1a1735d6b","Type":"ContainerStarted","Data":"9ba263394c45281a61cfdb28061ed789d978d61701ddb0bebd800f408d670a55"}
Oct 09 08:04:38 crc kubenswrapper[4715]: I1009 08:04:38.849499 4715 generic.go:334] "Generic (PLEG): container finished" podID="12f445bb-022c-4dd9-8f91-e9612f526a12" containerID="23d13f798740ab459a6412176221b61828a195e988a13fbe174d4f7a7c8b3bf6" exitCode=0
Oct 09 08:04:38 crc kubenswrapper[4715]: I1009 08:04:38.849587 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vtjlz" event={"ID":"12f445bb-022c-4dd9-8f91-e9612f526a12","Type":"ContainerDied","Data":"23d13f798740ab459a6412176221b61828a195e988a13fbe174d4f7a7c8b3bf6"}
Oct 09 08:04:38 crc kubenswrapper[4715]: I1009 08:04:38.876956 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 09 08:04:39 crc kubenswrapper[4715]: I1009 08:04:39.585815 4715 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-59756554bd-9q7xp" podUID="5823462b-a07e-4525-9be8-370dce870498" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.165:9311/healthcheck\": read tcp 10.217.0.2:42630->10.217.0.165:9311: read: connection reset by peer"
Oct 09 08:04:39 crc kubenswrapper[4715]: I1009 08:04:39.585824 4715 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-59756554bd-9q7xp" podUID="5823462b-a07e-4525-9be8-370dce870498" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.165:9311/healthcheck\": read tcp 10.217.0.2:42638->10.217.0.165:9311: read: connection reset by peer"
Oct 09 08:04:39 crc kubenswrapper[4715]: I1009 08:04:39.929046 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"351123b4-0e0c-413a-bc50-56f397c1b592","Type":"ContainerStarted","Data":"1ab4cae8b8719fb106f6b826684ccedad1883f425a829747830ad6c4bbe8fee8"}
Oct 09 08:04:39 crc kubenswrapper[4715]: I1009 08:04:39.929092 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"351123b4-0e0c-413a-bc50-56f397c1b592","Type":"ContainerStarted","Data":"6bba977066b635a815d3c3b32fce9612bcf1e0569d729438cbdc8f3116cd8903"}
Oct 09 08:04:39 crc kubenswrapper[4715]: I1009 08:04:39.934458 4715 generic.go:334] "Generic (PLEG): container finished" podID="5823462b-a07e-4525-9be8-370dce870498" containerID="ac6032261715516b1ba1cfe9851948275ebc4124882ad0151121b557b399837a" exitCode=0
Oct 09 08:04:39 crc kubenswrapper[4715]: I1009 08:04:39.934579 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59756554bd-9q7xp" event={"ID":"5823462b-a07e-4525-9be8-370dce870498","Type":"ContainerDied","Data":"ac6032261715516b1ba1cfe9851948275ebc4124882ad0151121b557b399837a"}
Oct 09 08:04:39 crc kubenswrapper[4715]: I1009 08:04:39.945543 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7","Type":"ContainerStarted","Data":"bcdebad9220c2286c154e6f4586da47c8a8837582302b4544df3142816b7ad9e"}
Oct 09 08:04:39 crc kubenswrapper[4715]: I1009 08:04:39.953594 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"48296f2f-dddc-4549-9f78-640128d54d46","Type":"ContainerStarted","Data":"4b4454e88f6f85c833923ec01b71a26cc047cdddd9852e5008311d5ecfcd59e8"}
Oct 09 08:04:39 crc kubenswrapper[4715]: I1009 08:04:39.959553 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8be2de3b-f820-4674-992a-9bf1a1735d6b","Type":"ContainerStarted","Data":"a092c3740c6e8b59d8af78e83c0df1e808cd579bdc500045adeaeed6ba2e67f1"}
Oct 09 08:04:40 crc kubenswrapper[4715]: I1009 08:04:40.220014 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5d4d4f746b-b9w44"
Oct 09 08:04:40 crc kubenswrapper[4715]: I1009 08:04:40.250959 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-59756554bd-9q7xp"
Oct 09 08:04:40 crc kubenswrapper[4715]: I1009 08:04:40.365489 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5823462b-a07e-4525-9be8-370dce870498-logs\") pod \"5823462b-a07e-4525-9be8-370dce870498\" (UID: \"5823462b-a07e-4525-9be8-370dce870498\") "
Oct 09 08:04:40 crc kubenswrapper[4715]: I1009 08:04:40.366436 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5823462b-a07e-4525-9be8-370dce870498-combined-ca-bundle\") pod \"5823462b-a07e-4525-9be8-370dce870498\" (UID: \"5823462b-a07e-4525-9be8-370dce870498\") "
Oct 09 08:04:40 crc kubenswrapper[4715]: I1009 08:04:40.366550 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5823462b-a07e-4525-9be8-370dce870498-config-data-custom\") pod \"5823462b-a07e-4525-9be8-370dce870498\" (UID: \"5823462b-a07e-4525-9be8-370dce870498\") "
Oct 09 08:04:40 crc kubenswrapper[4715]: I1009 08:04:40.366586 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5823462b-a07e-4525-9be8-370dce870498-config-data\") pod \"5823462b-a07e-4525-9be8-370dce870498\" (UID: \"5823462b-a07e-4525-9be8-370dce870498\") "
Oct 09 08:04:40 crc kubenswrapper[4715]: I1009 08:04:40.366689 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgpvz\" (UniqueName: \"kubernetes.io/projected/5823462b-a07e-4525-9be8-370dce870498-kube-api-access-pgpvz\") pod \"5823462b-a07e-4525-9be8-370dce870498\" (UID: \"5823462b-a07e-4525-9be8-370dce870498\") "
Oct 09 08:04:40 crc kubenswrapper[4715]: I1009 08:04:40.368367 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5823462b-a07e-4525-9be8-370dce870498-logs" (OuterVolumeSpecName: "logs") pod "5823462b-a07e-4525-9be8-370dce870498" (UID: "5823462b-a07e-4525-9be8-370dce870498"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 09 08:04:40 crc kubenswrapper[4715]: I1009 08:04:40.395669 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5823462b-a07e-4525-9be8-370dce870498-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5823462b-a07e-4525-9be8-370dce870498" (UID: "5823462b-a07e-4525-9be8-370dce870498"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 08:04:40 crc kubenswrapper[4715]: I1009 08:04:40.395783 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5823462b-a07e-4525-9be8-370dce870498-kube-api-access-pgpvz" (OuterVolumeSpecName: "kube-api-access-pgpvz") pod "5823462b-a07e-4525-9be8-370dce870498" (UID: "5823462b-a07e-4525-9be8-370dce870498"). InnerVolumeSpecName "kube-api-access-pgpvz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 08:04:40 crc kubenswrapper[4715]: I1009 08:04:40.469567 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgpvz\" (UniqueName: \"kubernetes.io/projected/5823462b-a07e-4525-9be8-370dce870498-kube-api-access-pgpvz\") on node \"crc\" DevicePath \"\""
Oct 09 08:04:40 crc kubenswrapper[4715]: I1009 08:04:40.469606 4715 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5823462b-a07e-4525-9be8-370dce870498-logs\") on node \"crc\" DevicePath \"\""
Oct 09 08:04:40 crc kubenswrapper[4715]: I1009 08:04:40.469621 4715 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5823462b-a07e-4525-9be8-370dce870498-config-data-custom\") on node \"crc\" DevicePath \"\""
Oct 09 08:04:40 crc kubenswrapper[4715]: I1009 08:04:40.568689 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5823462b-a07e-4525-9be8-370dce870498-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5823462b-a07e-4525-9be8-370dce870498" (UID: "5823462b-a07e-4525-9be8-370dce870498"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 08:04:40 crc kubenswrapper[4715]: I1009 08:04:40.571371 4715 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5823462b-a07e-4525-9be8-370dce870498-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 09 08:04:40 crc kubenswrapper[4715]: I1009 08:04:40.593211 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-28p9v"
Oct 09 08:04:40 crc kubenswrapper[4715]: I1009 08:04:40.630565 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5823462b-a07e-4525-9be8-370dce870498-config-data" (OuterVolumeSpecName: "config-data") pod "5823462b-a07e-4525-9be8-370dce870498" (UID: "5823462b-a07e-4525-9be8-370dce870498"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 08:04:40 crc kubenswrapper[4715]: I1009 08:04:40.676115 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rf2jw\" (UniqueName: \"kubernetes.io/projected/80579efc-c70d-41a5-8a56-922ca09a8bd4-kube-api-access-rf2jw\") pod \"80579efc-c70d-41a5-8a56-922ca09a8bd4\" (UID: \"80579efc-c70d-41a5-8a56-922ca09a8bd4\") "
Oct 09 08:04:40 crc kubenswrapper[4715]: I1009 08:04:40.676770 4715 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5823462b-a07e-4525-9be8-370dce870498-config-data\") on node \"crc\" DevicePath \"\""
Oct 09 08:04:40 crc kubenswrapper[4715]: I1009 08:04:40.693033 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80579efc-c70d-41a5-8a56-922ca09a8bd4-kube-api-access-rf2jw" (OuterVolumeSpecName: "kube-api-access-rf2jw") pod "80579efc-c70d-41a5-8a56-922ca09a8bd4" (UID: "80579efc-c70d-41a5-8a56-922ca09a8bd4"). InnerVolumeSpecName "kube-api-access-rf2jw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 08:04:40 crc kubenswrapper[4715]: I1009 08:04:40.786137 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rf2jw\" (UniqueName: \"kubernetes.io/projected/80579efc-c70d-41a5-8a56-922ca09a8bd4-kube-api-access-rf2jw\") on node \"crc\" DevicePath \"\""
Oct 09 08:04:40 crc kubenswrapper[4715]: I1009 08:04:40.816999 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-vtjlz"
Oct 09 08:04:40 crc kubenswrapper[4715]: I1009 08:04:40.897984 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzhdp\" (UniqueName: \"kubernetes.io/projected/12f445bb-022c-4dd9-8f91-e9612f526a12-kube-api-access-gzhdp\") pod \"12f445bb-022c-4dd9-8f91-e9612f526a12\" (UID: \"12f445bb-022c-4dd9-8f91-e9612f526a12\") "
Oct 09 08:04:40 crc kubenswrapper[4715]: I1009 08:04:40.904699 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12f445bb-022c-4dd9-8f91-e9612f526a12-kube-api-access-gzhdp" (OuterVolumeSpecName: "kube-api-access-gzhdp") pod "12f445bb-022c-4dd9-8f91-e9612f526a12" (UID: "12f445bb-022c-4dd9-8f91-e9612f526a12"). InnerVolumeSpecName "kube-api-access-gzhdp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 08:04:40 crc kubenswrapper[4715]: I1009 08:04:40.914201 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-kp8jh"
Oct 09 08:04:41 crc kubenswrapper[4715]: I1009 08:04:41.012409 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssdxz\" (UniqueName: \"kubernetes.io/projected/544d96bc-6a19-46c5-8162-64a99e333681-kube-api-access-ssdxz\") pod \"544d96bc-6a19-46c5-8162-64a99e333681\" (UID: \"544d96bc-6a19-46c5-8162-64a99e333681\") "
Oct 09 08:04:41 crc kubenswrapper[4715]: I1009 08:04:41.018084 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzhdp\" (UniqueName: \"kubernetes.io/projected/12f445bb-022c-4dd9-8f91-e9612f526a12-kube-api-access-gzhdp\") on node \"crc\" DevicePath \"\""
Oct 09 08:04:41 crc kubenswrapper[4715]: I1009 08:04:41.023820 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/544d96bc-6a19-46c5-8162-64a99e333681-kube-api-access-ssdxz" (OuterVolumeSpecName: "kube-api-access-ssdxz") pod "544d96bc-6a19-46c5-8162-64a99e333681" (UID: "544d96bc-6a19-46c5-8162-64a99e333681"). InnerVolumeSpecName "kube-api-access-ssdxz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 08:04:41 crc kubenswrapper[4715]: I1009 08:04:41.047819 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vtjlz" event={"ID":"12f445bb-022c-4dd9-8f91-e9612f526a12","Type":"ContainerDied","Data":"c44b6aff5c37503f3b0e59779c74e5ce6e170344b885b2f26a38fa9ef3ff5ee6"}
Oct 09 08:04:41 crc kubenswrapper[4715]: I1009 08:04:41.047875 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c44b6aff5c37503f3b0e59779c74e5ce6e170344b885b2f26a38fa9ef3ff5ee6"
Oct 09 08:04:41 crc kubenswrapper[4715]: I1009 08:04:41.047959 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-vtjlz"
Oct 09 08:04:41 crc kubenswrapper[4715]: I1009 08:04:41.089049 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"48296f2f-dddc-4549-9f78-640128d54d46","Type":"ContainerStarted","Data":"c02f2c01592371b81e0ab2afc6ca7455d21af42e815d861bb011b6f58e2f35a7"}
Oct 09 08:04:41 crc kubenswrapper[4715]: I1009 08:04:41.099478 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-kp8jh" event={"ID":"544d96bc-6a19-46c5-8162-64a99e333681","Type":"ContainerDied","Data":"5b691cf1fa2ca6024711f4cf72b4e9a2f256d373b2b0ff230307589ebea55fac"}
Oct 09 08:04:41 crc kubenswrapper[4715]: I1009 08:04:41.099554 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b691cf1fa2ca6024711f4cf72b4e9a2f256d373b2b0ff230307589ebea55fac"
Oct 09 08:04:41 crc kubenswrapper[4715]: I1009 08:04:41.099685 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-kp8jh"
Oct 09 08:04:41 crc kubenswrapper[4715]: I1009 08:04:41.120998 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssdxz\" (UniqueName: \"kubernetes.io/projected/544d96bc-6a19-46c5-8162-64a99e333681-kube-api-access-ssdxz\") on node \"crc\" DevicePath \"\""
Oct 09 08:04:41 crc kubenswrapper[4715]: I1009 08:04:41.125550 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8be2de3b-f820-4674-992a-9bf1a1735d6b","Type":"ContainerStarted","Data":"2e43cf5ebfe938ec111bcf251220fc1c4c7793118d922019417c14dbe04ea2e9"}
Oct 09 08:04:41 crc kubenswrapper[4715]: I1009 08:04:41.129838 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59756554bd-9q7xp" event={"ID":"5823462b-a07e-4525-9be8-370dce870498","Type":"ContainerDied","Data":"b330576b3c5059ffffc82bc01fcecf1fa46805e23de4eacfcaa2cec2d4177551"}
Oct 09 08:04:41 crc kubenswrapper[4715]: I1009 08:04:41.129892 4715 scope.go:117] "RemoveContainer" containerID="ac6032261715516b1ba1cfe9851948275ebc4124882ad0151121b557b399837a"
Oct 09 08:04:41 crc kubenswrapper[4715]: I1009 08:04:41.130010 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-59756554bd-9q7xp"
Oct 09 08:04:41 crc kubenswrapper[4715]: I1009 08:04:41.159359 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-28p9v" event={"ID":"80579efc-c70d-41a5-8a56-922ca09a8bd4","Type":"ContainerDied","Data":"6c13773eb1603b00e356bc4b351cb7158f13e8ad40122ff4791df7246c6677bc"}
Oct 09 08:04:41 crc kubenswrapper[4715]: I1009 08:04:41.159398 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c13773eb1603b00e356bc4b351cb7158f13e8ad40122ff4791df7246c6677bc"
Oct 09 08:04:41 crc kubenswrapper[4715]: I1009 08:04:41.159562 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-28p9v"
Oct 09 08:04:41 crc kubenswrapper[4715]: I1009 08:04:41.177717 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7","Type":"ContainerStarted","Data":"83fd1178c2d1843f8fc2b73f43a8383e07e7e9033b03a4bc10057ddf04d8fd07"}
Oct 09 08:04:41 crc kubenswrapper[4715]: I1009 08:04:41.189287 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.189268125 podStartE2EDuration="4.189268125s" podCreationTimestamp="2025-10-09 08:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:04:41.12426702 +0000 UTC m=+1111.817071028" watchObservedRunningTime="2025-10-09 08:04:41.189268125 +0000 UTC m=+1111.882072133"
Oct 09 08:04:41 crc kubenswrapper[4715]: I1009 08:04:41.205273 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.205255866 podStartE2EDuration="5.205255866s" podCreationTimestamp="2025-10-09 08:04:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:04:41.17107428 +0000 UTC m=+1111.863878288" watchObservedRunningTime="2025-10-09 08:04:41.205255866 +0000 UTC m=+1111.898059874"
Oct 09 08:04:41 crc kubenswrapper[4715]: I1009 08:04:41.219577 4715 scope.go:117] "RemoveContainer" containerID="16fdac3fb6c5d655e385897f80db5d30b67e106a81ba59d1fd09b8e88ae2486d"
Oct 09 08:04:41 crc kubenswrapper[4715]: I1009 08:04:41.231600 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-59756554bd-9q7xp"]
Oct 09 08:04:41 crc kubenswrapper[4715]: I1009 08:04:41.245914 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-59756554bd-9q7xp"]
Oct 09 08:04:42 crc kubenswrapper[4715]: I1009 08:04:42.149073 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5823462b-a07e-4525-9be8-370dce870498" path="/var/lib/kubelet/pods/5823462b-a07e-4525-9be8-370dce870498/volumes"
Oct 09 08:04:42 crc kubenswrapper[4715]: I1009 08:04:42.187042 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"351123b4-0e0c-413a-bc50-56f397c1b592","Type":"ContainerStarted","Data":"d672a2b58fda8e6cdd48b454d527e2a146fe3797a53a49a371ed3e1f2049596b"}
Oct 09 08:04:42 crc kubenswrapper[4715]: I1009 08:04:42.192728 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7","Type":"ContainerStarted","Data":"490df6a0538b6a5debabe5e09180bff75a45ebf9e66be73704432275a34ecd78"}
Oct 09 08:04:42 crc kubenswrapper[4715]: I1009 08:04:42.193109 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 09 08:04:42 crc kubenswrapper[4715]: I1009 08:04:42.217628 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.217601322 podStartE2EDuration="5.217601322s" podCreationTimestamp="2025-10-09 08:04:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:04:42.214447761 +0000 UTC m=+1112.907251769" watchObservedRunningTime="2025-10-09 08:04:42.217601322 +0000 UTC m=+1112.910405330"
Oct 09 08:04:42 crc kubenswrapper[4715]: I1009 08:04:42.244774 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.320457661 podStartE2EDuration="6.244753035s" podCreationTimestamp="2025-10-09 08:04:36 +0000 UTC" firstStartedPulling="2025-10-09 08:04:37.718213102 +0000 UTC m=+1108.411017110" lastFinishedPulling="2025-10-09 08:04:41.642508476 +0000 UTC m=+1112.335312484" observedRunningTime="2025-10-09 08:04:42.234746296 +0000 UTC m=+1112.927550314" watchObservedRunningTime="2025-10-09 08:04:42.244753035 +0000 UTC m=+1112.937557063"
Oct 09 08:04:42 crc kubenswrapper[4715]: I1009 08:04:42.644866 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-85df7c4d7c-7ktz2"
Oct 09 08:04:42 crc kubenswrapper[4715]: I1009 08:04:42.695873 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5d4d4f746b-b9w44"]
Oct 09 08:04:42 crc kubenswrapper[4715]: I1009 08:04:42.696399 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5d4d4f746b-b9w44" podUID="ca660681-8c14-4632-91ee-9abeeb3ef48e" containerName="neutron-api" containerID="cri-o://4b65a435d5ac444283309c5da8927cdab72bf5f361f77d8bad5ba625090e6807" gracePeriod=30
Oct 09 08:04:42 crc kubenswrapper[4715]: I1009 08:04:42.696698 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5d4d4f746b-b9w44" podUID="ca660681-8c14-4632-91ee-9abeeb3ef48e" containerName="neutron-httpd" containerID="cri-o://23ae14f1919334d23f4719037ffbfbd5461a8a98b74faff42ad4870bfc84a838" gracePeriod=30
Oct 09 08:04:42 crc kubenswrapper[4715]: I1009 08:04:42.938513 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Oct 09 08:04:43 crc kubenswrapper[4715]: I1009 08:04:43.205445 4715 generic.go:334] "Generic (PLEG): container finished" podID="ca660681-8c14-4632-91ee-9abeeb3ef48e" containerID="23ae14f1919334d23f4719037ffbfbd5461a8a98b74faff42ad4870bfc84a838" exitCode=0
Oct 09 08:04:43 crc kubenswrapper[4715]: I1009 08:04:43.205562 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d4d4f746b-b9w44" event={"ID":"ca660681-8c14-4632-91ee-9abeeb3ef48e","Type":"ContainerDied","Data":"23ae14f1919334d23f4719037ffbfbd5461a8a98b74faff42ad4870bfc84a838"}
Oct 09 08:04:45 crc kubenswrapper[4715]: I1009 08:04:45.884624 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5d4d4f746b-b9w44"
Oct 09 08:04:46 crc kubenswrapper[4715]: I1009 08:04:46.007937 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca660681-8c14-4632-91ee-9abeeb3ef48e-combined-ca-bundle\") pod \"ca660681-8c14-4632-91ee-9abeeb3ef48e\" (UID: \"ca660681-8c14-4632-91ee-9abeeb3ef48e\") "
Oct 09 08:04:46 crc kubenswrapper[4715]: I1009 08:04:46.008038 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ca660681-8c14-4632-91ee-9abeeb3ef48e-httpd-config\") pod \"ca660681-8c14-4632-91ee-9abeeb3ef48e\" (UID: \"ca660681-8c14-4632-91ee-9abeeb3ef48e\") "
Oct 09 08:04:46 crc kubenswrapper[4715]: I1009 08:04:46.008237 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qsbs\" (UniqueName: \"kubernetes.io/projected/ca660681-8c14-4632-91ee-9abeeb3ef48e-kube-api-access-5qsbs\") pod \"ca660681-8c14-4632-91ee-9abeeb3ef48e\" (UID: \"ca660681-8c14-4632-91ee-9abeeb3ef48e\") "
Oct 09 08:04:46 crc kubenswrapper[4715]: I1009 08:04:46.009005 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ca660681-8c14-4632-91ee-9abeeb3ef48e-config\") pod \"ca660681-8c14-4632-91ee-9abeeb3ef48e\" (UID: \"ca660681-8c14-4632-91ee-9abeeb3ef48e\") "
Oct 09 08:04:46 crc kubenswrapper[4715]: I1009 08:04:46.009054 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName:
\"kubernetes.io/secret/ca660681-8c14-4632-91ee-9abeeb3ef48e-ovndb-tls-certs\") pod \"ca660681-8c14-4632-91ee-9abeeb3ef48e\" (UID: \"ca660681-8c14-4632-91ee-9abeeb3ef48e\") " Oct 09 08:04:46 crc kubenswrapper[4715]: I1009 08:04:46.021621 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca660681-8c14-4632-91ee-9abeeb3ef48e-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "ca660681-8c14-4632-91ee-9abeeb3ef48e" (UID: "ca660681-8c14-4632-91ee-9abeeb3ef48e"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:46 crc kubenswrapper[4715]: I1009 08:04:46.021673 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca660681-8c14-4632-91ee-9abeeb3ef48e-kube-api-access-5qsbs" (OuterVolumeSpecName: "kube-api-access-5qsbs") pod "ca660681-8c14-4632-91ee-9abeeb3ef48e" (UID: "ca660681-8c14-4632-91ee-9abeeb3ef48e"). InnerVolumeSpecName "kube-api-access-5qsbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:04:46 crc kubenswrapper[4715]: I1009 08:04:46.073961 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca660681-8c14-4632-91ee-9abeeb3ef48e-config" (OuterVolumeSpecName: "config") pod "ca660681-8c14-4632-91ee-9abeeb3ef48e" (UID: "ca660681-8c14-4632-91ee-9abeeb3ef48e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:46 crc kubenswrapper[4715]: I1009 08:04:46.082325 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca660681-8c14-4632-91ee-9abeeb3ef48e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca660681-8c14-4632-91ee-9abeeb3ef48e" (UID: "ca660681-8c14-4632-91ee-9abeeb3ef48e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:46 crc kubenswrapper[4715]: I1009 08:04:46.105755 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca660681-8c14-4632-91ee-9abeeb3ef48e-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "ca660681-8c14-4632-91ee-9abeeb3ef48e" (UID: "ca660681-8c14-4632-91ee-9abeeb3ef48e"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:46 crc kubenswrapper[4715]: I1009 08:04:46.111977 4715 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ca660681-8c14-4632-91ee-9abeeb3ef48e-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:46 crc kubenswrapper[4715]: I1009 08:04:46.112020 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qsbs\" (UniqueName: \"kubernetes.io/projected/ca660681-8c14-4632-91ee-9abeeb3ef48e-kube-api-access-5qsbs\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:46 crc kubenswrapper[4715]: I1009 08:04:46.112036 4715 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ca660681-8c14-4632-91ee-9abeeb3ef48e-config\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:46 crc kubenswrapper[4715]: I1009 08:04:46.112047 4715 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca660681-8c14-4632-91ee-9abeeb3ef48e-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:46 crc kubenswrapper[4715]: I1009 08:04:46.112059 4715 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca660681-8c14-4632-91ee-9abeeb3ef48e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:46 crc kubenswrapper[4715]: I1009 08:04:46.233771 4715 generic.go:334] "Generic (PLEG): container finished" podID="ca660681-8c14-4632-91ee-9abeeb3ef48e" 
containerID="4b65a435d5ac444283309c5da8927cdab72bf5f361f77d8bad5ba625090e6807" exitCode=0 Oct 09 08:04:46 crc kubenswrapper[4715]: I1009 08:04:46.233825 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d4d4f746b-b9w44" event={"ID":"ca660681-8c14-4632-91ee-9abeeb3ef48e","Type":"ContainerDied","Data":"4b65a435d5ac444283309c5da8927cdab72bf5f361f77d8bad5ba625090e6807"} Oct 09 08:04:46 crc kubenswrapper[4715]: I1009 08:04:46.233858 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d4d4f746b-b9w44" event={"ID":"ca660681-8c14-4632-91ee-9abeeb3ef48e","Type":"ContainerDied","Data":"9644c01f63899ca6a6da09769f0fd0993a46a4f4c7e0cefbf3b111e16303bcaa"} Oct 09 08:04:46 crc kubenswrapper[4715]: I1009 08:04:46.233882 4715 scope.go:117] "RemoveContainer" containerID="23ae14f1919334d23f4719037ffbfbd5461a8a98b74faff42ad4870bfc84a838" Oct 09 08:04:46 crc kubenswrapper[4715]: I1009 08:04:46.234029 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5d4d4f746b-b9w44" Oct 09 08:04:46 crc kubenswrapper[4715]: I1009 08:04:46.259455 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5d4d4f746b-b9w44"] Oct 09 08:04:46 crc kubenswrapper[4715]: I1009 08:04:46.259854 4715 scope.go:117] "RemoveContainer" containerID="4b65a435d5ac444283309c5da8927cdab72bf5f361f77d8bad5ba625090e6807" Oct 09 08:04:46 crc kubenswrapper[4715]: I1009 08:04:46.270629 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5d4d4f746b-b9w44"] Oct 09 08:04:46 crc kubenswrapper[4715]: I1009 08:04:46.283986 4715 scope.go:117] "RemoveContainer" containerID="23ae14f1919334d23f4719037ffbfbd5461a8a98b74faff42ad4870bfc84a838" Oct 09 08:04:46 crc kubenswrapper[4715]: E1009 08:04:46.284398 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23ae14f1919334d23f4719037ffbfbd5461a8a98b74faff42ad4870bfc84a838\": container with ID starting with 23ae14f1919334d23f4719037ffbfbd5461a8a98b74faff42ad4870bfc84a838 not found: ID does not exist" containerID="23ae14f1919334d23f4719037ffbfbd5461a8a98b74faff42ad4870bfc84a838" Oct 09 08:04:46 crc kubenswrapper[4715]: I1009 08:04:46.284447 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23ae14f1919334d23f4719037ffbfbd5461a8a98b74faff42ad4870bfc84a838"} err="failed to get container status \"23ae14f1919334d23f4719037ffbfbd5461a8a98b74faff42ad4870bfc84a838\": rpc error: code = NotFound desc = could not find container \"23ae14f1919334d23f4719037ffbfbd5461a8a98b74faff42ad4870bfc84a838\": container with ID starting with 23ae14f1919334d23f4719037ffbfbd5461a8a98b74faff42ad4870bfc84a838 not found: ID does not exist" Oct 09 08:04:46 crc kubenswrapper[4715]: I1009 08:04:46.284469 4715 scope.go:117] "RemoveContainer" containerID="4b65a435d5ac444283309c5da8927cdab72bf5f361f77d8bad5ba625090e6807" Oct 09 08:04:46 
crc kubenswrapper[4715]: E1009 08:04:46.284706 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b65a435d5ac444283309c5da8927cdab72bf5f361f77d8bad5ba625090e6807\": container with ID starting with 4b65a435d5ac444283309c5da8927cdab72bf5f361f77d8bad5ba625090e6807 not found: ID does not exist" containerID="4b65a435d5ac444283309c5da8927cdab72bf5f361f77d8bad5ba625090e6807" Oct 09 08:04:46 crc kubenswrapper[4715]: I1009 08:04:46.284721 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b65a435d5ac444283309c5da8927cdab72bf5f361f77d8bad5ba625090e6807"} err="failed to get container status \"4b65a435d5ac444283309c5da8927cdab72bf5f361f77d8bad5ba625090e6807\": rpc error: code = NotFound desc = could not find container \"4b65a435d5ac444283309c5da8927cdab72bf5f361f77d8bad5ba625090e6807\": container with ID starting with 4b65a435d5ac444283309c5da8927cdab72bf5f361f77d8bad5ba625090e6807 not found: ID does not exist" Oct 09 08:04:47 crc kubenswrapper[4715]: I1009 08:04:47.409459 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:04:47 crc kubenswrapper[4715]: I1009 08:04:47.409998 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9cf7cb12-56ff-4d2c-a1a3-eda246e226d7" containerName="ceilometer-central-agent" containerID="cri-o://96ffd806f8eff352f1cecb41072f554abe29891e730e29f34d25f8cd29a39fa6" gracePeriod=30 Oct 09 08:04:47 crc kubenswrapper[4715]: I1009 08:04:47.410091 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9cf7cb12-56ff-4d2c-a1a3-eda246e226d7" containerName="ceilometer-notification-agent" containerID="cri-o://bcdebad9220c2286c154e6f4586da47c8a8837582302b4544df3142816b7ad9e" gracePeriod=30 Oct 09 08:04:47 crc kubenswrapper[4715]: I1009 08:04:47.410098 4715 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9cf7cb12-56ff-4d2c-a1a3-eda246e226d7" containerName="sg-core" containerID="cri-o://83fd1178c2d1843f8fc2b73f43a8383e07e7e9033b03a4bc10057ddf04d8fd07" gracePeriod=30 Oct 09 08:04:47 crc kubenswrapper[4715]: I1009 08:04:47.410098 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9cf7cb12-56ff-4d2c-a1a3-eda246e226d7" containerName="proxy-httpd" containerID="cri-o://490df6a0538b6a5debabe5e09180bff75a45ebf9e66be73704432275a34ecd78" gracePeriod=30 Oct 09 08:04:47 crc kubenswrapper[4715]: I1009 08:04:47.463491 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 09 08:04:47 crc kubenswrapper[4715]: I1009 08:04:47.463555 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 09 08:04:47 crc kubenswrapper[4715]: I1009 08:04:47.502400 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 09 08:04:47 crc kubenswrapper[4715]: I1009 08:04:47.507310 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 09 08:04:47 crc kubenswrapper[4715]: E1009 08:04:47.591928 4715 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cf7cb12_56ff_4d2c_a1a3_eda246e226d7.slice/crio-83fd1178c2d1843f8fc2b73f43a8383e07e7e9033b03a4bc10057ddf04d8fd07.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cf7cb12_56ff_4d2c_a1a3_eda246e226d7.slice/crio-conmon-83fd1178c2d1843f8fc2b73f43a8383e07e7e9033b03a4bc10057ddf04d8fd07.scope\": RecentStats: unable to find data in 
memory cache]" Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.127920 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.128155 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.152965 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca660681-8c14-4632-91ee-9abeeb3ef48e" path="/var/lib/kubelet/pods/ca660681-8c14-4632-91ee-9abeeb3ef48e/volumes" Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.159314 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.172331 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.173021 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.173813 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.255393 4715 generic.go:334] "Generic (PLEG): container finished" podID="9cf7cb12-56ff-4d2c-a1a3-eda246e226d7" containerID="490df6a0538b6a5debabe5e09180bff75a45ebf9e66be73704432275a34ecd78" exitCode=0 Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.255432 4715 generic.go:334] "Generic (PLEG): container finished" podID="9cf7cb12-56ff-4d2c-a1a3-eda246e226d7" containerID="83fd1178c2d1843f8fc2b73f43a8383e07e7e9033b03a4bc10057ddf04d8fd07" exitCode=2 Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.255440 4715 generic.go:334] "Generic (PLEG): container finished" 
podID="9cf7cb12-56ff-4d2c-a1a3-eda246e226d7" containerID="bcdebad9220c2286c154e6f4586da47c8a8837582302b4544df3142816b7ad9e" exitCode=0 Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.255446 4715 generic.go:334] "Generic (PLEG): container finished" podID="9cf7cb12-56ff-4d2c-a1a3-eda246e226d7" containerID="96ffd806f8eff352f1cecb41072f554abe29891e730e29f34d25f8cd29a39fa6" exitCode=0 Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.255822 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.255999 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7","Type":"ContainerDied","Data":"490df6a0538b6a5debabe5e09180bff75a45ebf9e66be73704432275a34ecd78"} Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.256038 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7","Type":"ContainerDied","Data":"83fd1178c2d1843f8fc2b73f43a8383e07e7e9033b03a4bc10057ddf04d8fd07"} Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.256049 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7","Type":"ContainerDied","Data":"bcdebad9220c2286c154e6f4586da47c8a8837582302b4544df3142816b7ad9e"} Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.256058 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7","Type":"ContainerDied","Data":"96ffd806f8eff352f1cecb41072f554abe29891e730e29f34d25f8cd29a39fa6"} Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.256066 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7","Type":"ContainerDied","Data":"c9fde9fdd816b7f8353ca66dabff4c4e89c7a681f23c253ae6fb18700f488305"} Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.256081 4715 scope.go:117] "RemoveContainer" containerID="490df6a0538b6a5debabe5e09180bff75a45ebf9e66be73704432275a34ecd78" Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.256771 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.256898 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.256987 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.257062 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.266570 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cf7cb12-56ff-4d2c-a1a3-eda246e226d7-scripts\") pod \"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7\" (UID: \"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7\") " Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.266640 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwbt5\" (UniqueName: \"kubernetes.io/projected/9cf7cb12-56ff-4d2c-a1a3-eda246e226d7-kube-api-access-cwbt5\") pod \"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7\" (UID: \"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7\") " Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.266683 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cf7cb12-56ff-4d2c-a1a3-eda246e226d7-config-data\") pod 
\"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7\" (UID: \"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7\") " Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.266728 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cf7cb12-56ff-4d2c-a1a3-eda246e226d7-log-httpd\") pod \"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7\" (UID: \"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7\") " Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.266838 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9cf7cb12-56ff-4d2c-a1a3-eda246e226d7-sg-core-conf-yaml\") pod \"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7\" (UID: \"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7\") " Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.266861 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cf7cb12-56ff-4d2c-a1a3-eda246e226d7-combined-ca-bundle\") pod \"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7\" (UID: \"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7\") " Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.266898 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cf7cb12-56ff-4d2c-a1a3-eda246e226d7-run-httpd\") pod \"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7\" (UID: \"9cf7cb12-56ff-4d2c-a1a3-eda246e226d7\") " Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.267853 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cf7cb12-56ff-4d2c-a1a3-eda246e226d7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9cf7cb12-56ff-4d2c-a1a3-eda246e226d7" (UID: "9cf7cb12-56ff-4d2c-a1a3-eda246e226d7"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.268338 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cf7cb12-56ff-4d2c-a1a3-eda246e226d7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9cf7cb12-56ff-4d2c-a1a3-eda246e226d7" (UID: "9cf7cb12-56ff-4d2c-a1a3-eda246e226d7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.273679 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cf7cb12-56ff-4d2c-a1a3-eda246e226d7-kube-api-access-cwbt5" (OuterVolumeSpecName: "kube-api-access-cwbt5") pod "9cf7cb12-56ff-4d2c-a1a3-eda246e226d7" (UID: "9cf7cb12-56ff-4d2c-a1a3-eda246e226d7"). InnerVolumeSpecName "kube-api-access-cwbt5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.273779 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cf7cb12-56ff-4d2c-a1a3-eda246e226d7-scripts" (OuterVolumeSpecName: "scripts") pod "9cf7cb12-56ff-4d2c-a1a3-eda246e226d7" (UID: "9cf7cb12-56ff-4d2c-a1a3-eda246e226d7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.282619 4715 scope.go:117] "RemoveContainer" containerID="83fd1178c2d1843f8fc2b73f43a8383e07e7e9033b03a4bc10057ddf04d8fd07" Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.296788 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cf7cb12-56ff-4d2c-a1a3-eda246e226d7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9cf7cb12-56ff-4d2c-a1a3-eda246e226d7" (UID: "9cf7cb12-56ff-4d2c-a1a3-eda246e226d7"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.309714 4715 scope.go:117] "RemoveContainer" containerID="bcdebad9220c2286c154e6f4586da47c8a8837582302b4544df3142816b7ad9e" Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.330963 4715 scope.go:117] "RemoveContainer" containerID="96ffd806f8eff352f1cecb41072f554abe29891e730e29f34d25f8cd29a39fa6" Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.348206 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cf7cb12-56ff-4d2c-a1a3-eda246e226d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9cf7cb12-56ff-4d2c-a1a3-eda246e226d7" (UID: "9cf7cb12-56ff-4d2c-a1a3-eda246e226d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.352370 4715 scope.go:117] "RemoveContainer" containerID="490df6a0538b6a5debabe5e09180bff75a45ebf9e66be73704432275a34ecd78" Oct 09 08:04:48 crc kubenswrapper[4715]: E1009 08:04:48.352822 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"490df6a0538b6a5debabe5e09180bff75a45ebf9e66be73704432275a34ecd78\": container with ID starting with 490df6a0538b6a5debabe5e09180bff75a45ebf9e66be73704432275a34ecd78 not found: ID does not exist" containerID="490df6a0538b6a5debabe5e09180bff75a45ebf9e66be73704432275a34ecd78" Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.352857 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"490df6a0538b6a5debabe5e09180bff75a45ebf9e66be73704432275a34ecd78"} err="failed to get container status \"490df6a0538b6a5debabe5e09180bff75a45ebf9e66be73704432275a34ecd78\": rpc error: code = NotFound desc = could not find container \"490df6a0538b6a5debabe5e09180bff75a45ebf9e66be73704432275a34ecd78\": container with ID starting 
with 490df6a0538b6a5debabe5e09180bff75a45ebf9e66be73704432275a34ecd78 not found: ID does not exist" Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.352876 4715 scope.go:117] "RemoveContainer" containerID="83fd1178c2d1843f8fc2b73f43a8383e07e7e9033b03a4bc10057ddf04d8fd07" Oct 09 08:04:48 crc kubenswrapper[4715]: E1009 08:04:48.353201 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83fd1178c2d1843f8fc2b73f43a8383e07e7e9033b03a4bc10057ddf04d8fd07\": container with ID starting with 83fd1178c2d1843f8fc2b73f43a8383e07e7e9033b03a4bc10057ddf04d8fd07 not found: ID does not exist" containerID="83fd1178c2d1843f8fc2b73f43a8383e07e7e9033b03a4bc10057ddf04d8fd07" Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.353223 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83fd1178c2d1843f8fc2b73f43a8383e07e7e9033b03a4bc10057ddf04d8fd07"} err="failed to get container status \"83fd1178c2d1843f8fc2b73f43a8383e07e7e9033b03a4bc10057ddf04d8fd07\": rpc error: code = NotFound desc = could not find container \"83fd1178c2d1843f8fc2b73f43a8383e07e7e9033b03a4bc10057ddf04d8fd07\": container with ID starting with 83fd1178c2d1843f8fc2b73f43a8383e07e7e9033b03a4bc10057ddf04d8fd07 not found: ID does not exist" Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.353236 4715 scope.go:117] "RemoveContainer" containerID="bcdebad9220c2286c154e6f4586da47c8a8837582302b4544df3142816b7ad9e" Oct 09 08:04:48 crc kubenswrapper[4715]: E1009 08:04:48.353610 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcdebad9220c2286c154e6f4586da47c8a8837582302b4544df3142816b7ad9e\": container with ID starting with bcdebad9220c2286c154e6f4586da47c8a8837582302b4544df3142816b7ad9e not found: ID does not exist" containerID="bcdebad9220c2286c154e6f4586da47c8a8837582302b4544df3142816b7ad9e" Oct 09 08:04:48 
crc kubenswrapper[4715]: I1009 08:04:48.353704 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcdebad9220c2286c154e6f4586da47c8a8837582302b4544df3142816b7ad9e"} err="failed to get container status \"bcdebad9220c2286c154e6f4586da47c8a8837582302b4544df3142816b7ad9e\": rpc error: code = NotFound desc = could not find container \"bcdebad9220c2286c154e6f4586da47c8a8837582302b4544df3142816b7ad9e\": container with ID starting with bcdebad9220c2286c154e6f4586da47c8a8837582302b4544df3142816b7ad9e not found: ID does not exist"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.353751 4715 scope.go:117] "RemoveContainer" containerID="96ffd806f8eff352f1cecb41072f554abe29891e730e29f34d25f8cd29a39fa6"
Oct 09 08:04:48 crc kubenswrapper[4715]: E1009 08:04:48.354069 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96ffd806f8eff352f1cecb41072f554abe29891e730e29f34d25f8cd29a39fa6\": container with ID starting with 96ffd806f8eff352f1cecb41072f554abe29891e730e29f34d25f8cd29a39fa6 not found: ID does not exist" containerID="96ffd806f8eff352f1cecb41072f554abe29891e730e29f34d25f8cd29a39fa6"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.354106 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96ffd806f8eff352f1cecb41072f554abe29891e730e29f34d25f8cd29a39fa6"} err="failed to get container status \"96ffd806f8eff352f1cecb41072f554abe29891e730e29f34d25f8cd29a39fa6\": rpc error: code = NotFound desc = could not find container \"96ffd806f8eff352f1cecb41072f554abe29891e730e29f34d25f8cd29a39fa6\": container with ID starting with 96ffd806f8eff352f1cecb41072f554abe29891e730e29f34d25f8cd29a39fa6 not found: ID does not exist"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.354131 4715 scope.go:117] "RemoveContainer" containerID="490df6a0538b6a5debabe5e09180bff75a45ebf9e66be73704432275a34ecd78"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.354402 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"490df6a0538b6a5debabe5e09180bff75a45ebf9e66be73704432275a34ecd78"} err="failed to get container status \"490df6a0538b6a5debabe5e09180bff75a45ebf9e66be73704432275a34ecd78\": rpc error: code = NotFound desc = could not find container \"490df6a0538b6a5debabe5e09180bff75a45ebf9e66be73704432275a34ecd78\": container with ID starting with 490df6a0538b6a5debabe5e09180bff75a45ebf9e66be73704432275a34ecd78 not found: ID does not exist"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.354438 4715 scope.go:117] "RemoveContainer" containerID="83fd1178c2d1843f8fc2b73f43a8383e07e7e9033b03a4bc10057ddf04d8fd07"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.354749 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83fd1178c2d1843f8fc2b73f43a8383e07e7e9033b03a4bc10057ddf04d8fd07"} err="failed to get container status \"83fd1178c2d1843f8fc2b73f43a8383e07e7e9033b03a4bc10057ddf04d8fd07\": rpc error: code = NotFound desc = could not find container \"83fd1178c2d1843f8fc2b73f43a8383e07e7e9033b03a4bc10057ddf04d8fd07\": container with ID starting with 83fd1178c2d1843f8fc2b73f43a8383e07e7e9033b03a4bc10057ddf04d8fd07 not found: ID does not exist"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.354768 4715 scope.go:117] "RemoveContainer" containerID="bcdebad9220c2286c154e6f4586da47c8a8837582302b4544df3142816b7ad9e"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.355001 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcdebad9220c2286c154e6f4586da47c8a8837582302b4544df3142816b7ad9e"} err="failed to get container status \"bcdebad9220c2286c154e6f4586da47c8a8837582302b4544df3142816b7ad9e\": rpc error: code = NotFound desc = could not find container \"bcdebad9220c2286c154e6f4586da47c8a8837582302b4544df3142816b7ad9e\": container with ID starting with bcdebad9220c2286c154e6f4586da47c8a8837582302b4544df3142816b7ad9e not found: ID does not exist"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.355024 4715 scope.go:117] "RemoveContainer" containerID="96ffd806f8eff352f1cecb41072f554abe29891e730e29f34d25f8cd29a39fa6"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.355209 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96ffd806f8eff352f1cecb41072f554abe29891e730e29f34d25f8cd29a39fa6"} err="failed to get container status \"96ffd806f8eff352f1cecb41072f554abe29891e730e29f34d25f8cd29a39fa6\": rpc error: code = NotFound desc = could not find container \"96ffd806f8eff352f1cecb41072f554abe29891e730e29f34d25f8cd29a39fa6\": container with ID starting with 96ffd806f8eff352f1cecb41072f554abe29891e730e29f34d25f8cd29a39fa6 not found: ID does not exist"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.355226 4715 scope.go:117] "RemoveContainer" containerID="490df6a0538b6a5debabe5e09180bff75a45ebf9e66be73704432275a34ecd78"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.355385 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"490df6a0538b6a5debabe5e09180bff75a45ebf9e66be73704432275a34ecd78"} err="failed to get container status \"490df6a0538b6a5debabe5e09180bff75a45ebf9e66be73704432275a34ecd78\": rpc error: code = NotFound desc = could not find container \"490df6a0538b6a5debabe5e09180bff75a45ebf9e66be73704432275a34ecd78\": container with ID starting with 490df6a0538b6a5debabe5e09180bff75a45ebf9e66be73704432275a34ecd78 not found: ID does not exist"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.355404 4715 scope.go:117] "RemoveContainer" containerID="83fd1178c2d1843f8fc2b73f43a8383e07e7e9033b03a4bc10057ddf04d8fd07"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.355591 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83fd1178c2d1843f8fc2b73f43a8383e07e7e9033b03a4bc10057ddf04d8fd07"} err="failed to get container status \"83fd1178c2d1843f8fc2b73f43a8383e07e7e9033b03a4bc10057ddf04d8fd07\": rpc error: code = NotFound desc = could not find container \"83fd1178c2d1843f8fc2b73f43a8383e07e7e9033b03a4bc10057ddf04d8fd07\": container with ID starting with 83fd1178c2d1843f8fc2b73f43a8383e07e7e9033b03a4bc10057ddf04d8fd07 not found: ID does not exist"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.355622 4715 scope.go:117] "RemoveContainer" containerID="bcdebad9220c2286c154e6f4586da47c8a8837582302b4544df3142816b7ad9e"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.355887 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcdebad9220c2286c154e6f4586da47c8a8837582302b4544df3142816b7ad9e"} err="failed to get container status \"bcdebad9220c2286c154e6f4586da47c8a8837582302b4544df3142816b7ad9e\": rpc error: code = NotFound desc = could not find container \"bcdebad9220c2286c154e6f4586da47c8a8837582302b4544df3142816b7ad9e\": container with ID starting with bcdebad9220c2286c154e6f4586da47c8a8837582302b4544df3142816b7ad9e not found: ID does not exist"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.355908 4715 scope.go:117] "RemoveContainer" containerID="96ffd806f8eff352f1cecb41072f554abe29891e730e29f34d25f8cd29a39fa6"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.356161 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96ffd806f8eff352f1cecb41072f554abe29891e730e29f34d25f8cd29a39fa6"} err="failed to get container status \"96ffd806f8eff352f1cecb41072f554abe29891e730e29f34d25f8cd29a39fa6\": rpc error: code = NotFound desc = could not find container \"96ffd806f8eff352f1cecb41072f554abe29891e730e29f34d25f8cd29a39fa6\": container with ID starting with 96ffd806f8eff352f1cecb41072f554abe29891e730e29f34d25f8cd29a39fa6 not found: ID does not exist"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.356187 4715 scope.go:117] "RemoveContainer" containerID="490df6a0538b6a5debabe5e09180bff75a45ebf9e66be73704432275a34ecd78"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.356463 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"490df6a0538b6a5debabe5e09180bff75a45ebf9e66be73704432275a34ecd78"} err="failed to get container status \"490df6a0538b6a5debabe5e09180bff75a45ebf9e66be73704432275a34ecd78\": rpc error: code = NotFound desc = could not find container \"490df6a0538b6a5debabe5e09180bff75a45ebf9e66be73704432275a34ecd78\": container with ID starting with 490df6a0538b6a5debabe5e09180bff75a45ebf9e66be73704432275a34ecd78 not found: ID does not exist"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.356482 4715 scope.go:117] "RemoveContainer" containerID="83fd1178c2d1843f8fc2b73f43a8383e07e7e9033b03a4bc10057ddf04d8fd07"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.356711 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83fd1178c2d1843f8fc2b73f43a8383e07e7e9033b03a4bc10057ddf04d8fd07"} err="failed to get container status \"83fd1178c2d1843f8fc2b73f43a8383e07e7e9033b03a4bc10057ddf04d8fd07\": rpc error: code = NotFound desc = could not find container \"83fd1178c2d1843f8fc2b73f43a8383e07e7e9033b03a4bc10057ddf04d8fd07\": container with ID starting with 83fd1178c2d1843f8fc2b73f43a8383e07e7e9033b03a4bc10057ddf04d8fd07 not found: ID does not exist"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.356728 4715 scope.go:117] "RemoveContainer" containerID="bcdebad9220c2286c154e6f4586da47c8a8837582302b4544df3142816b7ad9e"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.357061 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcdebad9220c2286c154e6f4586da47c8a8837582302b4544df3142816b7ad9e"} err="failed to get container status \"bcdebad9220c2286c154e6f4586da47c8a8837582302b4544df3142816b7ad9e\": rpc error: code = NotFound desc = could not find container \"bcdebad9220c2286c154e6f4586da47c8a8837582302b4544df3142816b7ad9e\": container with ID starting with bcdebad9220c2286c154e6f4586da47c8a8837582302b4544df3142816b7ad9e not found: ID does not exist"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.357096 4715 scope.go:117] "RemoveContainer" containerID="96ffd806f8eff352f1cecb41072f554abe29891e730e29f34d25f8cd29a39fa6"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.357297 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96ffd806f8eff352f1cecb41072f554abe29891e730e29f34d25f8cd29a39fa6"} err="failed to get container status \"96ffd806f8eff352f1cecb41072f554abe29891e730e29f34d25f8cd29a39fa6\": rpc error: code = NotFound desc = could not find container \"96ffd806f8eff352f1cecb41072f554abe29891e730e29f34d25f8cd29a39fa6\": container with ID starting with 96ffd806f8eff352f1cecb41072f554abe29891e730e29f34d25f8cd29a39fa6 not found: ID does not exist"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.369328 4715 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9cf7cb12-56ff-4d2c-a1a3-eda246e226d7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.369361 4715 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cf7cb12-56ff-4d2c-a1a3-eda246e226d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.369373 4715 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cf7cb12-56ff-4d2c-a1a3-eda246e226d7-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.369384 4715 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cf7cb12-56ff-4d2c-a1a3-eda246e226d7-scripts\") on node \"crc\" DevicePath \"\""
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.369396 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwbt5\" (UniqueName: \"kubernetes.io/projected/9cf7cb12-56ff-4d2c-a1a3-eda246e226d7-kube-api-access-cwbt5\") on node \"crc\" DevicePath \"\""
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.369407 4715 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9cf7cb12-56ff-4d2c-a1a3-eda246e226d7-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.380106 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cf7cb12-56ff-4d2c-a1a3-eda246e226d7-config-data" (OuterVolumeSpecName: "config-data") pod "9cf7cb12-56ff-4d2c-a1a3-eda246e226d7" (UID: "9cf7cb12-56ff-4d2c-a1a3-eda246e226d7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.471481 4715 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cf7cb12-56ff-4d2c-a1a3-eda246e226d7-config-data\") on node \"crc\" DevicePath \"\""
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.613822 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.626047 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.642326 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 09 08:04:48 crc kubenswrapper[4715]: E1009 08:04:48.642705 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5823462b-a07e-4525-9be8-370dce870498" containerName="barbican-api"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.642720 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="5823462b-a07e-4525-9be8-370dce870498" containerName="barbican-api"
Oct 09 08:04:48 crc kubenswrapper[4715]: E1009 08:04:48.642743 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca660681-8c14-4632-91ee-9abeeb3ef48e" containerName="neutron-httpd"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.642750 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca660681-8c14-4632-91ee-9abeeb3ef48e" containerName="neutron-httpd"
Oct 09 08:04:48 crc kubenswrapper[4715]: E1009 08:04:48.642761 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="544d96bc-6a19-46c5-8162-64a99e333681" containerName="mariadb-database-create"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.642768 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="544d96bc-6a19-46c5-8162-64a99e333681" containerName="mariadb-database-create"
Oct 09 08:04:48 crc kubenswrapper[4715]: E1009 08:04:48.642777 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cf7cb12-56ff-4d2c-a1a3-eda246e226d7" containerName="ceilometer-notification-agent"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.642782 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cf7cb12-56ff-4d2c-a1a3-eda246e226d7" containerName="ceilometer-notification-agent"
Oct 09 08:04:48 crc kubenswrapper[4715]: E1009 08:04:48.642794 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cf7cb12-56ff-4d2c-a1a3-eda246e226d7" containerName="proxy-httpd"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.642800 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cf7cb12-56ff-4d2c-a1a3-eda246e226d7" containerName="proxy-httpd"
Oct 09 08:04:48 crc kubenswrapper[4715]: E1009 08:04:48.642808 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12f445bb-022c-4dd9-8f91-e9612f526a12" containerName="mariadb-database-create"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.642814 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="12f445bb-022c-4dd9-8f91-e9612f526a12" containerName="mariadb-database-create"
Oct 09 08:04:48 crc kubenswrapper[4715]: E1009 08:04:48.642826 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cf7cb12-56ff-4d2c-a1a3-eda246e226d7" containerName="sg-core"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.642832 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cf7cb12-56ff-4d2c-a1a3-eda246e226d7" containerName="sg-core"
Oct 09 08:04:48 crc kubenswrapper[4715]: E1009 08:04:48.642847 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80579efc-c70d-41a5-8a56-922ca09a8bd4" containerName="mariadb-database-create"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.642853 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="80579efc-c70d-41a5-8a56-922ca09a8bd4" containerName="mariadb-database-create"
Oct 09 08:04:48 crc kubenswrapper[4715]: E1009 08:04:48.642866 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5823462b-a07e-4525-9be8-370dce870498" containerName="barbican-api-log"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.642872 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="5823462b-a07e-4525-9be8-370dce870498" containerName="barbican-api-log"
Oct 09 08:04:48 crc kubenswrapper[4715]: E1009 08:04:48.642885 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca660681-8c14-4632-91ee-9abeeb3ef48e" containerName="neutron-api"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.642893 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca660681-8c14-4632-91ee-9abeeb3ef48e" containerName="neutron-api"
Oct 09 08:04:48 crc kubenswrapper[4715]: E1009 08:04:48.642907 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cf7cb12-56ff-4d2c-a1a3-eda246e226d7" containerName="ceilometer-central-agent"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.642913 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cf7cb12-56ff-4d2c-a1a3-eda246e226d7" containerName="ceilometer-central-agent"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.643074 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca660681-8c14-4632-91ee-9abeeb3ef48e" containerName="neutron-httpd"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.643086 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="544d96bc-6a19-46c5-8162-64a99e333681" containerName="mariadb-database-create"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.643095 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca660681-8c14-4632-91ee-9abeeb3ef48e" containerName="neutron-api"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.643107 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="12f445bb-022c-4dd9-8f91-e9612f526a12" containerName="mariadb-database-create"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.643117 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="5823462b-a07e-4525-9be8-370dce870498" containerName="barbican-api"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.643128 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="5823462b-a07e-4525-9be8-370dce870498" containerName="barbican-api-log"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.643142 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="80579efc-c70d-41a5-8a56-922ca09a8bd4" containerName="mariadb-database-create"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.643154 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cf7cb12-56ff-4d2c-a1a3-eda246e226d7" containerName="sg-core"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.643169 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cf7cb12-56ff-4d2c-a1a3-eda246e226d7" containerName="ceilometer-central-agent"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.643181 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cf7cb12-56ff-4d2c-a1a3-eda246e226d7" containerName="ceilometer-notification-agent"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.643190 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cf7cb12-56ff-4d2c-a1a3-eda246e226d7" containerName="proxy-httpd"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.644906 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.646755 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.652739 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.659901 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.775898 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/740413d6-f9d3-42d6-9459-66f79ee018ed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"740413d6-f9d3-42d6-9459-66f79ee018ed\") " pod="openstack/ceilometer-0"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.775954 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/740413d6-f9d3-42d6-9459-66f79ee018ed-log-httpd\") pod \"ceilometer-0\" (UID: \"740413d6-f9d3-42d6-9459-66f79ee018ed\") " pod="openstack/ceilometer-0"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.775992 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/740413d6-f9d3-42d6-9459-66f79ee018ed-config-data\") pod \"ceilometer-0\" (UID: \"740413d6-f9d3-42d6-9459-66f79ee018ed\") " pod="openstack/ceilometer-0"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.776084 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/740413d6-f9d3-42d6-9459-66f79ee018ed-run-httpd\") pod \"ceilometer-0\" (UID: \"740413d6-f9d3-42d6-9459-66f79ee018ed\") " pod="openstack/ceilometer-0"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.776116 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qpqk\" (UniqueName: \"kubernetes.io/projected/740413d6-f9d3-42d6-9459-66f79ee018ed-kube-api-access-4qpqk\") pod \"ceilometer-0\" (UID: \"740413d6-f9d3-42d6-9459-66f79ee018ed\") " pod="openstack/ceilometer-0"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.776160 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/740413d6-f9d3-42d6-9459-66f79ee018ed-scripts\") pod \"ceilometer-0\" (UID: \"740413d6-f9d3-42d6-9459-66f79ee018ed\") " pod="openstack/ceilometer-0"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.776191 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/740413d6-f9d3-42d6-9459-66f79ee018ed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"740413d6-f9d3-42d6-9459-66f79ee018ed\") " pod="openstack/ceilometer-0"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.878262 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/740413d6-f9d3-42d6-9459-66f79ee018ed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"740413d6-f9d3-42d6-9459-66f79ee018ed\") " pod="openstack/ceilometer-0"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.878316 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/740413d6-f9d3-42d6-9459-66f79ee018ed-log-httpd\") pod \"ceilometer-0\" (UID: \"740413d6-f9d3-42d6-9459-66f79ee018ed\") " pod="openstack/ceilometer-0"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.878352 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/740413d6-f9d3-42d6-9459-66f79ee018ed-config-data\") pod \"ceilometer-0\" (UID: \"740413d6-f9d3-42d6-9459-66f79ee018ed\") " pod="openstack/ceilometer-0"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.878431 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/740413d6-f9d3-42d6-9459-66f79ee018ed-run-httpd\") pod \"ceilometer-0\" (UID: \"740413d6-f9d3-42d6-9459-66f79ee018ed\") " pod="openstack/ceilometer-0"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.878453 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qpqk\" (UniqueName: \"kubernetes.io/projected/740413d6-f9d3-42d6-9459-66f79ee018ed-kube-api-access-4qpqk\") pod \"ceilometer-0\" (UID: \"740413d6-f9d3-42d6-9459-66f79ee018ed\") " pod="openstack/ceilometer-0"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.878495 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/740413d6-f9d3-42d6-9459-66f79ee018ed-scripts\") pod \"ceilometer-0\" (UID: \"740413d6-f9d3-42d6-9459-66f79ee018ed\") " pod="openstack/ceilometer-0"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.878525 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/740413d6-f9d3-42d6-9459-66f79ee018ed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"740413d6-f9d3-42d6-9459-66f79ee018ed\") " pod="openstack/ceilometer-0"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.878859 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/740413d6-f9d3-42d6-9459-66f79ee018ed-run-httpd\") pod \"ceilometer-0\" (UID: \"740413d6-f9d3-42d6-9459-66f79ee018ed\") " pod="openstack/ceilometer-0"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.879408 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/740413d6-f9d3-42d6-9459-66f79ee018ed-log-httpd\") pod \"ceilometer-0\" (UID: \"740413d6-f9d3-42d6-9459-66f79ee018ed\") " pod="openstack/ceilometer-0"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.881942 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/740413d6-f9d3-42d6-9459-66f79ee018ed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"740413d6-f9d3-42d6-9459-66f79ee018ed\") " pod="openstack/ceilometer-0"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.882572 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/740413d6-f9d3-42d6-9459-66f79ee018ed-config-data\") pod \"ceilometer-0\" (UID: \"740413d6-f9d3-42d6-9459-66f79ee018ed\") " pod="openstack/ceilometer-0"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.882835 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/740413d6-f9d3-42d6-9459-66f79ee018ed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"740413d6-f9d3-42d6-9459-66f79ee018ed\") " pod="openstack/ceilometer-0"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.888598 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/740413d6-f9d3-42d6-9459-66f79ee018ed-scripts\") pod \"ceilometer-0\" (UID: \"740413d6-f9d3-42d6-9459-66f79ee018ed\") " pod="openstack/ceilometer-0"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.906051 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qpqk\" (UniqueName: \"kubernetes.io/projected/740413d6-f9d3-42d6-9459-66f79ee018ed-kube-api-access-4qpqk\") pod \"ceilometer-0\" (UID: \"740413d6-f9d3-42d6-9459-66f79ee018ed\") " pod="openstack/ceilometer-0"
Oct 09 08:04:48 crc kubenswrapper[4715]: I1009 08:04:48.961336 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 09 08:04:49 crc kubenswrapper[4715]: I1009 08:04:49.469577 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 09 08:04:50 crc kubenswrapper[4715]: I1009 08:04:50.150682 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cf7cb12-56ff-4d2c-a1a3-eda246e226d7" path="/var/lib/kubelet/pods/9cf7cb12-56ff-4d2c-a1a3-eda246e226d7/volumes"
Oct 09 08:04:50 crc kubenswrapper[4715]: I1009 08:04:50.273917 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"740413d6-f9d3-42d6-9459-66f79ee018ed","Type":"ContainerStarted","Data":"95ec11899dee865fe59ab437cbca4d90b99ea1cbf62aa17f97f17734d4fb5d8c"}
Oct 09 08:04:50 crc kubenswrapper[4715]: I1009 08:04:50.273963 4715 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 09 08:04:50 crc kubenswrapper[4715]: I1009 08:04:50.273998 4715 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 09 08:04:50 crc kubenswrapper[4715]: I1009 08:04:50.274002 4715 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 09 08:04:50 crc kubenswrapper[4715]: I1009 08:04:50.274017 4715 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 09 08:04:50 crc kubenswrapper[4715]: I1009 08:04:50.273967 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"740413d6-f9d3-42d6-9459-66f79ee018ed","Type":"ContainerStarted","Data":"a0ce0ea835c0a7ec86e12e51f2cbc5d2a460fa7579f21d4c9b643c56ec05fa17"}
Oct 09 08:04:50 crc kubenswrapper[4715]: I1009 08:04:50.288981 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Oct 09 08:04:50 crc kubenswrapper[4715]: I1009 08:04:50.293598 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Oct 09 08:04:50 crc kubenswrapper[4715]: I1009 08:04:50.345172 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Oct 09 08:04:50 crc kubenswrapper[4715]: I1009 08:04:50.350123 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Oct 09 08:04:51 crc kubenswrapper[4715]: I1009 08:04:51.286077 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"740413d6-f9d3-42d6-9459-66f79ee018ed","Type":"ContainerStarted","Data":"6e523cf4d71846d177f25138762646a785958b1251a7d10079053a9bf250a35d"}
Oct 09 08:04:51 crc kubenswrapper[4715]: I1009 08:04:51.287280 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"ea37012b-c593-4cd0-8501-121c791b2741","Type":"ContainerStarted","Data":"be415824fade9d5e6cd254d3bfc1839d112739bb18dc86686e461dd4ff2971fe"}
Oct 09 08:04:51 crc kubenswrapper[4715]: I1009 08:04:51.305687 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.873586677 podStartE2EDuration="35.305667265s" podCreationTimestamp="2025-10-09 08:04:16 +0000 UTC" firstStartedPulling="2025-10-09 08:04:17.14760542 +0000 UTC m=+1087.840409428" lastFinishedPulling="2025-10-09 08:04:50.579686008 +0000 UTC m=+1121.272490016" observedRunningTime="2025-10-09 08:04:51.298731685 +0000 UTC m=+1121.991535693" watchObservedRunningTime="2025-10-09 08:04:51.305667265 +0000 UTC m=+1121.998471273"
Oct 09 08:04:52 crc kubenswrapper[4715]: I1009 08:04:52.298160 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"740413d6-f9d3-42d6-9459-66f79ee018ed","Type":"ContainerStarted","Data":"c5a2370ccee0b27f909f8e4b6e6600314625075b13b85852fa1c16af124449a7"}
Oct 09 08:04:52 crc kubenswrapper[4715]: I1009 08:04:52.902283 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-c146-account-create-gbjbj"]
Oct 09 08:04:52 crc kubenswrapper[4715]: I1009 08:04:52.904277 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c146-account-create-gbjbj"
Oct 09 08:04:52 crc kubenswrapper[4715]: I1009 08:04:52.907029 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Oct 09 08:04:52 crc kubenswrapper[4715]: I1009 08:04:52.914137 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c146-account-create-gbjbj"]
Oct 09 08:04:52 crc kubenswrapper[4715]: I1009 08:04:52.986624 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7k8c\" (UniqueName: \"kubernetes.io/projected/9bb3c1ec-082e-4c72-b922-fee185aa0b44-kube-api-access-h7k8c\") pod \"nova-api-c146-account-create-gbjbj\" (UID: \"9bb3c1ec-082e-4c72-b922-fee185aa0b44\") " pod="openstack/nova-api-c146-account-create-gbjbj"
Oct 09 08:04:53 crc kubenswrapper[4715]: I1009 08:04:53.088466 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7k8c\" (UniqueName: \"kubernetes.io/projected/9bb3c1ec-082e-4c72-b922-fee185aa0b44-kube-api-access-h7k8c\") pod \"nova-api-c146-account-create-gbjbj\" (UID: \"9bb3c1ec-082e-4c72-b922-fee185aa0b44\") " pod="openstack/nova-api-c146-account-create-gbjbj"
Oct 09 08:04:53 crc kubenswrapper[4715]: I1009 08:04:53.091595 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-4765-account-create-49j2w"]
Oct 09 08:04:53 crc kubenswrapper[4715]: I1009 08:04:53.092984 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4765-account-create-49j2w"
Oct 09 08:04:53 crc kubenswrapper[4715]: I1009 08:04:53.100443 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Oct 09 08:04:53 crc kubenswrapper[4715]: I1009 08:04:53.118435 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7k8c\" (UniqueName: \"kubernetes.io/projected/9bb3c1ec-082e-4c72-b922-fee185aa0b44-kube-api-access-h7k8c\") pod \"nova-api-c146-account-create-gbjbj\" (UID: \"9bb3c1ec-082e-4c72-b922-fee185aa0b44\") " pod="openstack/nova-api-c146-account-create-gbjbj"
Oct 09 08:04:53 crc kubenswrapper[4715]: I1009 08:04:53.140061 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4765-account-create-49j2w"]
Oct 09 08:04:53 crc kubenswrapper[4715]: I1009 08:04:53.191452 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqbsf\" (UniqueName: \"kubernetes.io/projected/98e40342-58d1-4d38-9422-ade8716c4a55-kube-api-access-kqbsf\") pod \"nova-cell0-4765-account-create-49j2w\" (UID: \"98e40342-58d1-4d38-9422-ade8716c4a55\") " pod="openstack/nova-cell0-4765-account-create-49j2w"
Oct 09 08:04:53 crc kubenswrapper[4715]: I1009 08:04:53.226775 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c146-account-create-gbjbj"
Oct 09 08:04:53 crc kubenswrapper[4715]: I1009 08:04:53.295220 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqbsf\" (UniqueName: \"kubernetes.io/projected/98e40342-58d1-4d38-9422-ade8716c4a55-kube-api-access-kqbsf\") pod \"nova-cell0-4765-account-create-49j2w\" (UID: \"98e40342-58d1-4d38-9422-ade8716c4a55\") " pod="openstack/nova-cell0-4765-account-create-49j2w"
Oct 09 08:04:53 crc kubenswrapper[4715]: I1009 08:04:53.302402 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-af8d-account-create-4sbml"]
Oct 09 08:04:53 crc kubenswrapper[4715]: I1009 08:04:53.303635 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-af8d-account-create-4sbml"
Oct 09 08:04:53 crc kubenswrapper[4715]: I1009 08:04:53.313852 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-af8d-account-create-4sbml"]
Oct 09 08:04:53 crc kubenswrapper[4715]: I1009 08:04:53.316368 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Oct 09 08:04:53 crc kubenswrapper[4715]: I1009 08:04:53.321001 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqbsf\" (UniqueName: \"kubernetes.io/projected/98e40342-58d1-4d38-9422-ade8716c4a55-kube-api-access-kqbsf\") pod \"nova-cell0-4765-account-create-49j2w\" (UID: \"98e40342-58d1-4d38-9422-ade8716c4a55\") " pod="openstack/nova-cell0-4765-account-create-49j2w"
Oct 09 08:04:53 crc kubenswrapper[4715]: I1009 08:04:53.325450 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"740413d6-f9d3-42d6-9459-66f79ee018ed","Type":"ContainerStarted","Data":"93c5712f9c3437dd672e38e3d2920d377ebcb0d946734230f0e5cc22302378d8"}
Oct 09 08:04:53 crc kubenswrapper[4715]: I1009 08:04:53.325702 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 09 08:04:53 crc kubenswrapper[4715]: I1009 08:04:53.351910 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.106602375 podStartE2EDuration="5.351887337s" podCreationTimestamp="2025-10-09 08:04:48 +0000 UTC" firstStartedPulling="2025-10-09 08:04:49.482460955 +0000 UTC m=+1120.175264963" lastFinishedPulling="2025-10-09 08:04:52.727745917 +0000 UTC m=+1123.420549925" observedRunningTime="2025-10-09 08:04:53.351628539 +0000 UTC m=+1124.044432547" watchObservedRunningTime="2025-10-09 08:04:53.351887337 +0000 UTC m=+1124.044691355"
Oct 09 08:04:53 crc kubenswrapper[4715]: I1009 08:04:53.398111 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vp5t\" (UniqueName: \"kubernetes.io/projected/d13eea75-c5e8-49f0-99c0-def950b3e0fa-kube-api-access-8vp5t\") pod \"nova-cell1-af8d-account-create-4sbml\" (UID: \"d13eea75-c5e8-49f0-99c0-def950b3e0fa\") " pod="openstack/nova-cell1-af8d-account-create-4sbml"
Oct 09 08:04:53 crc kubenswrapper[4715]: I1009 08:04:53.424276 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4765-account-create-49j2w"
Oct 09 08:04:53 crc kubenswrapper[4715]: I1009 08:04:53.506925 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vp5t\" (UniqueName: \"kubernetes.io/projected/d13eea75-c5e8-49f0-99c0-def950b3e0fa-kube-api-access-8vp5t\") pod \"nova-cell1-af8d-account-create-4sbml\" (UID: \"d13eea75-c5e8-49f0-99c0-def950b3e0fa\") " pod="openstack/nova-cell1-af8d-account-create-4sbml"
Oct 09 08:04:53 crc kubenswrapper[4715]: I1009 08:04:53.524571 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vp5t\" (UniqueName: \"kubernetes.io/projected/d13eea75-c5e8-49f0-99c0-def950b3e0fa-kube-api-access-8vp5t\") pod \"nova-cell1-af8d-account-create-4sbml\" (UID: \"d13eea75-c5e8-49f0-99c0-def950b3e0fa\") " pod="openstack/nova-cell1-af8d-account-create-4sbml"
Oct 09 08:04:53 crc kubenswrapper[4715]: I1009 08:04:53.636025 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-af8d-account-create-4sbml" Oct 09 08:04:53 crc kubenswrapper[4715]: I1009 08:04:53.790865 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c146-account-create-gbjbj"] Oct 09 08:04:53 crc kubenswrapper[4715]: W1009 08:04:53.795903 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9bb3c1ec_082e_4c72_b922_fee185aa0b44.slice/crio-77bd1e4c7a390413f3502e837d8d29f06f2090d13e595cdaf22762fc3dfc9016 WatchSource:0}: Error finding container 77bd1e4c7a390413f3502e837d8d29f06f2090d13e595cdaf22762fc3dfc9016: Status 404 returned error can't find the container with id 77bd1e4c7a390413f3502e837d8d29f06f2090d13e595cdaf22762fc3dfc9016 Oct 09 08:04:53 crc kubenswrapper[4715]: I1009 08:04:53.955704 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4765-account-create-49j2w"] Oct 09 08:04:53 crc kubenswrapper[4715]: W1009 08:04:53.982762 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98e40342_58d1_4d38_9422_ade8716c4a55.slice/crio-6c08694bd21ed4ff71677c7f4a5e730086e584135f6a3d6a11bf407dbd19acc3 WatchSource:0}: Error finding container 6c08694bd21ed4ff71677c7f4a5e730086e584135f6a3d6a11bf407dbd19acc3: Status 404 returned error can't find the container with id 6c08694bd21ed4ff71677c7f4a5e730086e584135f6a3d6a11bf407dbd19acc3 Oct 09 08:04:54 crc kubenswrapper[4715]: I1009 08:04:54.158871 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-af8d-account-create-4sbml"] Oct 09 08:04:54 crc kubenswrapper[4715]: I1009 08:04:54.341979 4715 generic.go:334] "Generic (PLEG): container finished" podID="9bb3c1ec-082e-4c72-b922-fee185aa0b44" containerID="e99e3d7744469aa877b76e70adf0f30179857d5cdf8bb9c2392c0aade85d5d58" exitCode=0 Oct 09 08:04:54 crc kubenswrapper[4715]: I1009 08:04:54.342070 4715 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c146-account-create-gbjbj" event={"ID":"9bb3c1ec-082e-4c72-b922-fee185aa0b44","Type":"ContainerDied","Data":"e99e3d7744469aa877b76e70adf0f30179857d5cdf8bb9c2392c0aade85d5d58"} Oct 09 08:04:54 crc kubenswrapper[4715]: I1009 08:04:54.342449 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c146-account-create-gbjbj" event={"ID":"9bb3c1ec-082e-4c72-b922-fee185aa0b44","Type":"ContainerStarted","Data":"77bd1e4c7a390413f3502e837d8d29f06f2090d13e595cdaf22762fc3dfc9016"} Oct 09 08:04:54 crc kubenswrapper[4715]: I1009 08:04:54.344685 4715 generic.go:334] "Generic (PLEG): container finished" podID="98e40342-58d1-4d38-9422-ade8716c4a55" containerID="d4dfdcdbb128b61403d55abeaa594c743967e826fccde846e76205b760e1824b" exitCode=0 Oct 09 08:04:54 crc kubenswrapper[4715]: I1009 08:04:54.344761 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4765-account-create-49j2w" event={"ID":"98e40342-58d1-4d38-9422-ade8716c4a55","Type":"ContainerDied","Data":"d4dfdcdbb128b61403d55abeaa594c743967e826fccde846e76205b760e1824b"} Oct 09 08:04:54 crc kubenswrapper[4715]: I1009 08:04:54.344787 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4765-account-create-49j2w" event={"ID":"98e40342-58d1-4d38-9422-ade8716c4a55","Type":"ContainerStarted","Data":"6c08694bd21ed4ff71677c7f4a5e730086e584135f6a3d6a11bf407dbd19acc3"} Oct 09 08:04:54 crc kubenswrapper[4715]: I1009 08:04:54.346974 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-af8d-account-create-4sbml" event={"ID":"d13eea75-c5e8-49f0-99c0-def950b3e0fa","Type":"ContainerStarted","Data":"feaef52a52046c5f6a86f65074c198fc365442cade6012619b300e3742d28638"} Oct 09 08:04:54 crc kubenswrapper[4715]: I1009 08:04:54.347007 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-af8d-account-create-4sbml" 
event={"ID":"d13eea75-c5e8-49f0-99c0-def950b3e0fa","Type":"ContainerStarted","Data":"86d9f823647e0d94e4ce0d358b83d10f46a8e3b6d12064e77ae592713c4ce996"} Oct 09 08:04:54 crc kubenswrapper[4715]: I1009 08:04:54.396085 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-af8d-account-create-4sbml" podStartSLOduration=1.396061479 podStartE2EDuration="1.396061479s" podCreationTimestamp="2025-10-09 08:04:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:04:54.384391263 +0000 UTC m=+1125.077195271" watchObservedRunningTime="2025-10-09 08:04:54.396061479 +0000 UTC m=+1125.088865497" Oct 09 08:04:55 crc kubenswrapper[4715]: I1009 08:04:55.358265 4715 generic.go:334] "Generic (PLEG): container finished" podID="d13eea75-c5e8-49f0-99c0-def950b3e0fa" containerID="feaef52a52046c5f6a86f65074c198fc365442cade6012619b300e3742d28638" exitCode=0 Oct 09 08:04:55 crc kubenswrapper[4715]: I1009 08:04:55.358349 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-af8d-account-create-4sbml" event={"ID":"d13eea75-c5e8-49f0-99c0-def950b3e0fa","Type":"ContainerDied","Data":"feaef52a52046c5f6a86f65074c198fc365442cade6012619b300e3742d28638"} Oct 09 08:04:55 crc kubenswrapper[4715]: I1009 08:04:55.840920 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c146-account-create-gbjbj" Oct 09 08:04:55 crc kubenswrapper[4715]: I1009 08:04:55.848818 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-4765-account-create-49j2w" Oct 09 08:04:55 crc kubenswrapper[4715]: I1009 08:04:55.966622 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqbsf\" (UniqueName: \"kubernetes.io/projected/98e40342-58d1-4d38-9422-ade8716c4a55-kube-api-access-kqbsf\") pod \"98e40342-58d1-4d38-9422-ade8716c4a55\" (UID: \"98e40342-58d1-4d38-9422-ade8716c4a55\") " Oct 09 08:04:55 crc kubenswrapper[4715]: I1009 08:04:55.967579 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7k8c\" (UniqueName: \"kubernetes.io/projected/9bb3c1ec-082e-4c72-b922-fee185aa0b44-kube-api-access-h7k8c\") pod \"9bb3c1ec-082e-4c72-b922-fee185aa0b44\" (UID: \"9bb3c1ec-082e-4c72-b922-fee185aa0b44\") " Oct 09 08:04:55 crc kubenswrapper[4715]: I1009 08:04:55.973441 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98e40342-58d1-4d38-9422-ade8716c4a55-kube-api-access-kqbsf" (OuterVolumeSpecName: "kube-api-access-kqbsf") pod "98e40342-58d1-4d38-9422-ade8716c4a55" (UID: "98e40342-58d1-4d38-9422-ade8716c4a55"). InnerVolumeSpecName "kube-api-access-kqbsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:04:55 crc kubenswrapper[4715]: I1009 08:04:55.975207 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bb3c1ec-082e-4c72-b922-fee185aa0b44-kube-api-access-h7k8c" (OuterVolumeSpecName: "kube-api-access-h7k8c") pod "9bb3c1ec-082e-4c72-b922-fee185aa0b44" (UID: "9bb3c1ec-082e-4c72-b922-fee185aa0b44"). InnerVolumeSpecName "kube-api-access-h7k8c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:04:56 crc kubenswrapper[4715]: I1009 08:04:56.069817 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7k8c\" (UniqueName: \"kubernetes.io/projected/9bb3c1ec-082e-4c72-b922-fee185aa0b44-kube-api-access-h7k8c\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:56 crc kubenswrapper[4715]: I1009 08:04:56.069859 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqbsf\" (UniqueName: \"kubernetes.io/projected/98e40342-58d1-4d38-9422-ade8716c4a55-kube-api-access-kqbsf\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:56 crc kubenswrapper[4715]: I1009 08:04:56.367150 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4765-account-create-49j2w" event={"ID":"98e40342-58d1-4d38-9422-ade8716c4a55","Type":"ContainerDied","Data":"6c08694bd21ed4ff71677c7f4a5e730086e584135f6a3d6a11bf407dbd19acc3"} Oct 09 08:04:56 crc kubenswrapper[4715]: I1009 08:04:56.367211 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c08694bd21ed4ff71677c7f4a5e730086e584135f6a3d6a11bf407dbd19acc3" Oct 09 08:04:56 crc kubenswrapper[4715]: I1009 08:04:56.367172 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-4765-account-create-49j2w" Oct 09 08:04:56 crc kubenswrapper[4715]: I1009 08:04:56.368694 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c146-account-create-gbjbj" event={"ID":"9bb3c1ec-082e-4c72-b922-fee185aa0b44","Type":"ContainerDied","Data":"77bd1e4c7a390413f3502e837d8d29f06f2090d13e595cdaf22762fc3dfc9016"} Oct 09 08:04:56 crc kubenswrapper[4715]: I1009 08:04:56.368724 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77bd1e4c7a390413f3502e837d8d29f06f2090d13e595cdaf22762fc3dfc9016" Oct 09 08:04:56 crc kubenswrapper[4715]: I1009 08:04:56.368739 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c146-account-create-gbjbj" Oct 09 08:04:56 crc kubenswrapper[4715]: I1009 08:04:56.735901 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-af8d-account-create-4sbml" Oct 09 08:04:56 crc kubenswrapper[4715]: I1009 08:04:56.845977 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:04:56 crc kubenswrapper[4715]: I1009 08:04:56.846282 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="740413d6-f9d3-42d6-9459-66f79ee018ed" containerName="ceilometer-central-agent" containerID="cri-o://95ec11899dee865fe59ab437cbca4d90b99ea1cbf62aa17f97f17734d4fb5d8c" gracePeriod=30 Oct 09 08:04:56 crc kubenswrapper[4715]: I1009 08:04:56.846318 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="740413d6-f9d3-42d6-9459-66f79ee018ed" containerName="sg-core" containerID="cri-o://c5a2370ccee0b27f909f8e4b6e6600314625075b13b85852fa1c16af124449a7" gracePeriod=30 Oct 09 08:04:56 crc kubenswrapper[4715]: I1009 08:04:56.846385 4715 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="740413d6-f9d3-42d6-9459-66f79ee018ed" containerName="ceilometer-notification-agent" containerID="cri-o://6e523cf4d71846d177f25138762646a785958b1251a7d10079053a9bf250a35d" gracePeriod=30 Oct 09 08:04:56 crc kubenswrapper[4715]: I1009 08:04:56.846384 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="740413d6-f9d3-42d6-9459-66f79ee018ed" containerName="proxy-httpd" containerID="cri-o://93c5712f9c3437dd672e38e3d2920d377ebcb0d946734230f0e5cc22302378d8" gracePeriod=30 Oct 09 08:04:56 crc kubenswrapper[4715]: I1009 08:04:56.888113 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vp5t\" (UniqueName: \"kubernetes.io/projected/d13eea75-c5e8-49f0-99c0-def950b3e0fa-kube-api-access-8vp5t\") pod \"d13eea75-c5e8-49f0-99c0-def950b3e0fa\" (UID: \"d13eea75-c5e8-49f0-99c0-def950b3e0fa\") " Oct 09 08:04:56 crc kubenswrapper[4715]: I1009 08:04:56.894858 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d13eea75-c5e8-49f0-99c0-def950b3e0fa-kube-api-access-8vp5t" (OuterVolumeSpecName: "kube-api-access-8vp5t") pod "d13eea75-c5e8-49f0-99c0-def950b3e0fa" (UID: "d13eea75-c5e8-49f0-99c0-def950b3e0fa"). InnerVolumeSpecName "kube-api-access-8vp5t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:04:56 crc kubenswrapper[4715]: I1009 08:04:56.990098 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vp5t\" (UniqueName: \"kubernetes.io/projected/d13eea75-c5e8-49f0-99c0-def950b3e0fa-kube-api-access-8vp5t\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:57 crc kubenswrapper[4715]: I1009 08:04:57.377582 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-af8d-account-create-4sbml" event={"ID":"d13eea75-c5e8-49f0-99c0-def950b3e0fa","Type":"ContainerDied","Data":"86d9f823647e0d94e4ce0d358b83d10f46a8e3b6d12064e77ae592713c4ce996"} Oct 09 08:04:57 crc kubenswrapper[4715]: I1009 08:04:57.377632 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86d9f823647e0d94e4ce0d358b83d10f46a8e3b6d12064e77ae592713c4ce996" Oct 09 08:04:57 crc kubenswrapper[4715]: I1009 08:04:57.377651 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-af8d-account-create-4sbml" Oct 09 08:04:57 crc kubenswrapper[4715]: I1009 08:04:57.381721 4715 generic.go:334] "Generic (PLEG): container finished" podID="740413d6-f9d3-42d6-9459-66f79ee018ed" containerID="93c5712f9c3437dd672e38e3d2920d377ebcb0d946734230f0e5cc22302378d8" exitCode=0 Oct 09 08:04:57 crc kubenswrapper[4715]: I1009 08:04:57.381747 4715 generic.go:334] "Generic (PLEG): container finished" podID="740413d6-f9d3-42d6-9459-66f79ee018ed" containerID="c5a2370ccee0b27f909f8e4b6e6600314625075b13b85852fa1c16af124449a7" exitCode=2 Oct 09 08:04:57 crc kubenswrapper[4715]: I1009 08:04:57.381756 4715 generic.go:334] "Generic (PLEG): container finished" podID="740413d6-f9d3-42d6-9459-66f79ee018ed" containerID="6e523cf4d71846d177f25138762646a785958b1251a7d10079053a9bf250a35d" exitCode=0 Oct 09 08:04:57 crc kubenswrapper[4715]: I1009 08:04:57.381767 4715 generic.go:334] "Generic (PLEG): container finished" 
podID="740413d6-f9d3-42d6-9459-66f79ee018ed" containerID="95ec11899dee865fe59ab437cbca4d90b99ea1cbf62aa17f97f17734d4fb5d8c" exitCode=0 Oct 09 08:04:57 crc kubenswrapper[4715]: I1009 08:04:57.381786 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"740413d6-f9d3-42d6-9459-66f79ee018ed","Type":"ContainerDied","Data":"93c5712f9c3437dd672e38e3d2920d377ebcb0d946734230f0e5cc22302378d8"} Oct 09 08:04:57 crc kubenswrapper[4715]: I1009 08:04:57.381812 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"740413d6-f9d3-42d6-9459-66f79ee018ed","Type":"ContainerDied","Data":"c5a2370ccee0b27f909f8e4b6e6600314625075b13b85852fa1c16af124449a7"} Oct 09 08:04:57 crc kubenswrapper[4715]: I1009 08:04:57.381823 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"740413d6-f9d3-42d6-9459-66f79ee018ed","Type":"ContainerDied","Data":"6e523cf4d71846d177f25138762646a785958b1251a7d10079053a9bf250a35d"} Oct 09 08:04:57 crc kubenswrapper[4715]: I1009 08:04:57.381831 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"740413d6-f9d3-42d6-9459-66f79ee018ed","Type":"ContainerDied","Data":"95ec11899dee865fe59ab437cbca4d90b99ea1cbf62aa17f97f17734d4fb5d8c"} Oct 09 08:04:57 crc kubenswrapper[4715]: I1009 08:04:57.508616 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 08:04:57 crc kubenswrapper[4715]: I1009 08:04:57.599389 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/740413d6-f9d3-42d6-9459-66f79ee018ed-sg-core-conf-yaml\") pod \"740413d6-f9d3-42d6-9459-66f79ee018ed\" (UID: \"740413d6-f9d3-42d6-9459-66f79ee018ed\") " Oct 09 08:04:57 crc kubenswrapper[4715]: I1009 08:04:57.599478 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/740413d6-f9d3-42d6-9459-66f79ee018ed-combined-ca-bundle\") pod \"740413d6-f9d3-42d6-9459-66f79ee018ed\" (UID: \"740413d6-f9d3-42d6-9459-66f79ee018ed\") " Oct 09 08:04:57 crc kubenswrapper[4715]: I1009 08:04:57.599540 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/740413d6-f9d3-42d6-9459-66f79ee018ed-log-httpd\") pod \"740413d6-f9d3-42d6-9459-66f79ee018ed\" (UID: \"740413d6-f9d3-42d6-9459-66f79ee018ed\") " Oct 09 08:04:57 crc kubenswrapper[4715]: I1009 08:04:57.599596 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/740413d6-f9d3-42d6-9459-66f79ee018ed-run-httpd\") pod \"740413d6-f9d3-42d6-9459-66f79ee018ed\" (UID: \"740413d6-f9d3-42d6-9459-66f79ee018ed\") " Oct 09 08:04:57 crc kubenswrapper[4715]: I1009 08:04:57.599616 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/740413d6-f9d3-42d6-9459-66f79ee018ed-scripts\") pod \"740413d6-f9d3-42d6-9459-66f79ee018ed\" (UID: \"740413d6-f9d3-42d6-9459-66f79ee018ed\") " Oct 09 08:04:57 crc kubenswrapper[4715]: I1009 08:04:57.599747 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qpqk\" (UniqueName: 
\"kubernetes.io/projected/740413d6-f9d3-42d6-9459-66f79ee018ed-kube-api-access-4qpqk\") pod \"740413d6-f9d3-42d6-9459-66f79ee018ed\" (UID: \"740413d6-f9d3-42d6-9459-66f79ee018ed\") " Oct 09 08:04:57 crc kubenswrapper[4715]: I1009 08:04:57.599781 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/740413d6-f9d3-42d6-9459-66f79ee018ed-config-data\") pod \"740413d6-f9d3-42d6-9459-66f79ee018ed\" (UID: \"740413d6-f9d3-42d6-9459-66f79ee018ed\") " Oct 09 08:04:57 crc kubenswrapper[4715]: I1009 08:04:57.600148 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/740413d6-f9d3-42d6-9459-66f79ee018ed-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "740413d6-f9d3-42d6-9459-66f79ee018ed" (UID: "740413d6-f9d3-42d6-9459-66f79ee018ed"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:04:57 crc kubenswrapper[4715]: I1009 08:04:57.600529 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/740413d6-f9d3-42d6-9459-66f79ee018ed-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "740413d6-f9d3-42d6-9459-66f79ee018ed" (UID: "740413d6-f9d3-42d6-9459-66f79ee018ed"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:04:57 crc kubenswrapper[4715]: I1009 08:04:57.604131 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/740413d6-f9d3-42d6-9459-66f79ee018ed-kube-api-access-4qpqk" (OuterVolumeSpecName: "kube-api-access-4qpqk") pod "740413d6-f9d3-42d6-9459-66f79ee018ed" (UID: "740413d6-f9d3-42d6-9459-66f79ee018ed"). InnerVolumeSpecName "kube-api-access-4qpqk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:04:57 crc kubenswrapper[4715]: I1009 08:04:57.605873 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/740413d6-f9d3-42d6-9459-66f79ee018ed-scripts" (OuterVolumeSpecName: "scripts") pod "740413d6-f9d3-42d6-9459-66f79ee018ed" (UID: "740413d6-f9d3-42d6-9459-66f79ee018ed"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:57 crc kubenswrapper[4715]: I1009 08:04:57.624358 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/740413d6-f9d3-42d6-9459-66f79ee018ed-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "740413d6-f9d3-42d6-9459-66f79ee018ed" (UID: "740413d6-f9d3-42d6-9459-66f79ee018ed"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:57 crc kubenswrapper[4715]: I1009 08:04:57.679901 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/740413d6-f9d3-42d6-9459-66f79ee018ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "740413d6-f9d3-42d6-9459-66f79ee018ed" (UID: "740413d6-f9d3-42d6-9459-66f79ee018ed"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:57 crc kubenswrapper[4715]: I1009 08:04:57.701201 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qpqk\" (UniqueName: \"kubernetes.io/projected/740413d6-f9d3-42d6-9459-66f79ee018ed-kube-api-access-4qpqk\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:57 crc kubenswrapper[4715]: I1009 08:04:57.701230 4715 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/740413d6-f9d3-42d6-9459-66f79ee018ed-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:57 crc kubenswrapper[4715]: I1009 08:04:57.701239 4715 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/740413d6-f9d3-42d6-9459-66f79ee018ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:57 crc kubenswrapper[4715]: I1009 08:04:57.701247 4715 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/740413d6-f9d3-42d6-9459-66f79ee018ed-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:57 crc kubenswrapper[4715]: I1009 08:04:57.701256 4715 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/740413d6-f9d3-42d6-9459-66f79ee018ed-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:57 crc kubenswrapper[4715]: I1009 08:04:57.701264 4715 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/740413d6-f9d3-42d6-9459-66f79ee018ed-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:57 crc kubenswrapper[4715]: I1009 08:04:57.702899 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/740413d6-f9d3-42d6-9459-66f79ee018ed-config-data" (OuterVolumeSpecName: "config-data") pod "740413d6-f9d3-42d6-9459-66f79ee018ed" (UID: "740413d6-f9d3-42d6-9459-66f79ee018ed"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:04:57 crc kubenswrapper[4715]: I1009 08:04:57.802742 4715 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/740413d6-f9d3-42d6-9459-66f79ee018ed-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.281604 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4vnss"] Oct 09 08:04:58 crc kubenswrapper[4715]: E1009 08:04:58.282451 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="740413d6-f9d3-42d6-9459-66f79ee018ed" containerName="ceilometer-central-agent" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.282476 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="740413d6-f9d3-42d6-9459-66f79ee018ed" containerName="ceilometer-central-agent" Oct 09 08:04:58 crc kubenswrapper[4715]: E1009 08:04:58.282492 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="740413d6-f9d3-42d6-9459-66f79ee018ed" containerName="sg-core" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.282501 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="740413d6-f9d3-42d6-9459-66f79ee018ed" containerName="sg-core" Oct 09 08:04:58 crc kubenswrapper[4715]: E1009 08:04:58.282520 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="740413d6-f9d3-42d6-9459-66f79ee018ed" containerName="proxy-httpd" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.282528 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="740413d6-f9d3-42d6-9459-66f79ee018ed" containerName="proxy-httpd" Oct 09 08:04:58 crc kubenswrapper[4715]: E1009 08:04:58.282542 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="740413d6-f9d3-42d6-9459-66f79ee018ed" containerName="ceilometer-notification-agent" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.282550 4715 
state_mem.go:107] "Deleted CPUSet assignment" podUID="740413d6-f9d3-42d6-9459-66f79ee018ed" containerName="ceilometer-notification-agent" Oct 09 08:04:58 crc kubenswrapper[4715]: E1009 08:04:58.282567 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bb3c1ec-082e-4c72-b922-fee185aa0b44" containerName="mariadb-account-create" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.282575 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bb3c1ec-082e-4c72-b922-fee185aa0b44" containerName="mariadb-account-create" Oct 09 08:04:58 crc kubenswrapper[4715]: E1009 08:04:58.282602 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d13eea75-c5e8-49f0-99c0-def950b3e0fa" containerName="mariadb-account-create" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.282612 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="d13eea75-c5e8-49f0-99c0-def950b3e0fa" containerName="mariadb-account-create" Oct 09 08:04:58 crc kubenswrapper[4715]: E1009 08:04:58.282637 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98e40342-58d1-4d38-9422-ade8716c4a55" containerName="mariadb-account-create" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.282645 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="98e40342-58d1-4d38-9422-ade8716c4a55" containerName="mariadb-account-create" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.282861 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="740413d6-f9d3-42d6-9459-66f79ee018ed" containerName="ceilometer-notification-agent" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.282881 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="740413d6-f9d3-42d6-9459-66f79ee018ed" containerName="sg-core" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.282892 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="740413d6-f9d3-42d6-9459-66f79ee018ed" containerName="proxy-httpd" Oct 09 08:04:58 
crc kubenswrapper[4715]: I1009 08:04:58.282902 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="98e40342-58d1-4d38-9422-ade8716c4a55" containerName="mariadb-account-create" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.282915 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="d13eea75-c5e8-49f0-99c0-def950b3e0fa" containerName="mariadb-account-create" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.282928 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bb3c1ec-082e-4c72-b922-fee185aa0b44" containerName="mariadb-account-create" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.282945 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="740413d6-f9d3-42d6-9459-66f79ee018ed" containerName="ceilometer-central-agent" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.283711 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-4vnss" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.285933 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.290443 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.290449 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-j5s75" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.291246 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4vnss"] Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.392445 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"740413d6-f9d3-42d6-9459-66f79ee018ed","Type":"ContainerDied","Data":"a0ce0ea835c0a7ec86e12e51f2cbc5d2a460fa7579f21d4c9b643c56ec05fa17"} Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.392496 4715 scope.go:117] "RemoveContainer" containerID="93c5712f9c3437dd672e38e3d2920d377ebcb0d946734230f0e5cc22302378d8" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.392671 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.423127 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.425306 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz4qj\" (UniqueName: \"kubernetes.io/projected/9997aadd-2c07-463c-b694-657dbd229eaf-kube-api-access-nz4qj\") pod \"nova-cell0-conductor-db-sync-4vnss\" (UID: \"9997aadd-2c07-463c-b694-657dbd229eaf\") " pod="openstack/nova-cell0-conductor-db-sync-4vnss" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.425390 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9997aadd-2c07-463c-b694-657dbd229eaf-scripts\") pod \"nova-cell0-conductor-db-sync-4vnss\" (UID: \"9997aadd-2c07-463c-b694-657dbd229eaf\") " pod="openstack/nova-cell0-conductor-db-sync-4vnss" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.425718 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9997aadd-2c07-463c-b694-657dbd229eaf-config-data\") pod \"nova-cell0-conductor-db-sync-4vnss\" (UID: \"9997aadd-2c07-463c-b694-657dbd229eaf\") " pod="openstack/nova-cell0-conductor-db-sync-4vnss" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.425798 4715 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9997aadd-2c07-463c-b694-657dbd229eaf-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-4vnss\" (UID: \"9997aadd-2c07-463c-b694-657dbd229eaf\") " pod="openstack/nova-cell0-conductor-db-sync-4vnss" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.430430 4715 scope.go:117] "RemoveContainer" containerID="c5a2370ccee0b27f909f8e4b6e6600314625075b13b85852fa1c16af124449a7" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.454134 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.460225 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.478008 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.478194 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.484700 4715 scope.go:117] "RemoveContainer" containerID="6e523cf4d71846d177f25138762646a785958b1251a7d10079053a9bf250a35d" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.493606 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.494086 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.527716 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9997aadd-2c07-463c-b694-657dbd229eaf-config-data\") pod \"nova-cell0-conductor-db-sync-4vnss\" (UID: \"9997aadd-2c07-463c-b694-657dbd229eaf\") " pod="openstack/nova-cell0-conductor-db-sync-4vnss" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.527794 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9997aadd-2c07-463c-b694-657dbd229eaf-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-4vnss\" (UID: \"9997aadd-2c07-463c-b694-657dbd229eaf\") " pod="openstack/nova-cell0-conductor-db-sync-4vnss" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.527866 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nz4qj\" (UniqueName: \"kubernetes.io/projected/9997aadd-2c07-463c-b694-657dbd229eaf-kube-api-access-nz4qj\") pod \"nova-cell0-conductor-db-sync-4vnss\" (UID: \"9997aadd-2c07-463c-b694-657dbd229eaf\") " pod="openstack/nova-cell0-conductor-db-sync-4vnss" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.527891 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/9997aadd-2c07-463c-b694-657dbd229eaf-scripts\") pod \"nova-cell0-conductor-db-sync-4vnss\" (UID: \"9997aadd-2c07-463c-b694-657dbd229eaf\") " pod="openstack/nova-cell0-conductor-db-sync-4vnss" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.534802 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9997aadd-2c07-463c-b694-657dbd229eaf-config-data\") pod \"nova-cell0-conductor-db-sync-4vnss\" (UID: \"9997aadd-2c07-463c-b694-657dbd229eaf\") " pod="openstack/nova-cell0-conductor-db-sync-4vnss" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.540745 4715 scope.go:117] "RemoveContainer" containerID="95ec11899dee865fe59ab437cbca4d90b99ea1cbf62aa17f97f17734d4fb5d8c" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.548716 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9997aadd-2c07-463c-b694-657dbd229eaf-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-4vnss\" (UID: \"9997aadd-2c07-463c-b694-657dbd229eaf\") " pod="openstack/nova-cell0-conductor-db-sync-4vnss" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.550330 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz4qj\" (UniqueName: \"kubernetes.io/projected/9997aadd-2c07-463c-b694-657dbd229eaf-kube-api-access-nz4qj\") pod \"nova-cell0-conductor-db-sync-4vnss\" (UID: \"9997aadd-2c07-463c-b694-657dbd229eaf\") " pod="openstack/nova-cell0-conductor-db-sync-4vnss" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.559948 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9997aadd-2c07-463c-b694-657dbd229eaf-scripts\") pod \"nova-cell0-conductor-db-sync-4vnss\" (UID: \"9997aadd-2c07-463c-b694-657dbd229eaf\") " pod="openstack/nova-cell0-conductor-db-sync-4vnss" Oct 09 08:04:58 crc kubenswrapper[4715]: 
I1009 08:04:58.601392 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-4vnss" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.629971 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/406e4641-28f0-4f4e-80f0-6c98f0722560-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"406e4641-28f0-4f4e-80f0-6c98f0722560\") " pod="openstack/ceilometer-0" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.630497 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/406e4641-28f0-4f4e-80f0-6c98f0722560-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"406e4641-28f0-4f4e-80f0-6c98f0722560\") " pod="openstack/ceilometer-0" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.630652 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/406e4641-28f0-4f4e-80f0-6c98f0722560-scripts\") pod \"ceilometer-0\" (UID: \"406e4641-28f0-4f4e-80f0-6c98f0722560\") " pod="openstack/ceilometer-0" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.630811 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/406e4641-28f0-4f4e-80f0-6c98f0722560-config-data\") pod \"ceilometer-0\" (UID: \"406e4641-28f0-4f4e-80f0-6c98f0722560\") " pod="openstack/ceilometer-0" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.630948 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/406e4641-28f0-4f4e-80f0-6c98f0722560-log-httpd\") pod \"ceilometer-0\" (UID: \"406e4641-28f0-4f4e-80f0-6c98f0722560\") " 
pod="openstack/ceilometer-0" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.631044 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47qw6\" (UniqueName: \"kubernetes.io/projected/406e4641-28f0-4f4e-80f0-6c98f0722560-kube-api-access-47qw6\") pod \"ceilometer-0\" (UID: \"406e4641-28f0-4f4e-80f0-6c98f0722560\") " pod="openstack/ceilometer-0" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.631590 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/406e4641-28f0-4f4e-80f0-6c98f0722560-run-httpd\") pod \"ceilometer-0\" (UID: \"406e4641-28f0-4f4e-80f0-6c98f0722560\") " pod="openstack/ceilometer-0" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.733100 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/406e4641-28f0-4f4e-80f0-6c98f0722560-run-httpd\") pod \"ceilometer-0\" (UID: \"406e4641-28f0-4f4e-80f0-6c98f0722560\") " pod="openstack/ceilometer-0" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.733964 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/406e4641-28f0-4f4e-80f0-6c98f0722560-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"406e4641-28f0-4f4e-80f0-6c98f0722560\") " pod="openstack/ceilometer-0" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.734040 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/406e4641-28f0-4f4e-80f0-6c98f0722560-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"406e4641-28f0-4f4e-80f0-6c98f0722560\") " pod="openstack/ceilometer-0" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.734087 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/406e4641-28f0-4f4e-80f0-6c98f0722560-scripts\") pod \"ceilometer-0\" (UID: \"406e4641-28f0-4f4e-80f0-6c98f0722560\") " pod="openstack/ceilometer-0" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.734132 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/406e4641-28f0-4f4e-80f0-6c98f0722560-config-data\") pod \"ceilometer-0\" (UID: \"406e4641-28f0-4f4e-80f0-6c98f0722560\") " pod="openstack/ceilometer-0" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.734202 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/406e4641-28f0-4f4e-80f0-6c98f0722560-log-httpd\") pod \"ceilometer-0\" (UID: \"406e4641-28f0-4f4e-80f0-6c98f0722560\") " pod="openstack/ceilometer-0" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.734231 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47qw6\" (UniqueName: \"kubernetes.io/projected/406e4641-28f0-4f4e-80f0-6c98f0722560-kube-api-access-47qw6\") pod \"ceilometer-0\" (UID: \"406e4641-28f0-4f4e-80f0-6c98f0722560\") " pod="openstack/ceilometer-0" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.736729 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/406e4641-28f0-4f4e-80f0-6c98f0722560-run-httpd\") pod \"ceilometer-0\" (UID: \"406e4641-28f0-4f4e-80f0-6c98f0722560\") " pod="openstack/ceilometer-0" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.738324 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/406e4641-28f0-4f4e-80f0-6c98f0722560-log-httpd\") pod \"ceilometer-0\" (UID: \"406e4641-28f0-4f4e-80f0-6c98f0722560\") " pod="openstack/ceilometer-0" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 
08:04:58.742775 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/406e4641-28f0-4f4e-80f0-6c98f0722560-config-data\") pod \"ceilometer-0\" (UID: \"406e4641-28f0-4f4e-80f0-6c98f0722560\") " pod="openstack/ceilometer-0" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.744017 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/406e4641-28f0-4f4e-80f0-6c98f0722560-scripts\") pod \"ceilometer-0\" (UID: \"406e4641-28f0-4f4e-80f0-6c98f0722560\") " pod="openstack/ceilometer-0" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.746528 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/406e4641-28f0-4f4e-80f0-6c98f0722560-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"406e4641-28f0-4f4e-80f0-6c98f0722560\") " pod="openstack/ceilometer-0" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.748265 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/406e4641-28f0-4f4e-80f0-6c98f0722560-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"406e4641-28f0-4f4e-80f0-6c98f0722560\") " pod="openstack/ceilometer-0" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.754649 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47qw6\" (UniqueName: \"kubernetes.io/projected/406e4641-28f0-4f4e-80f0-6c98f0722560-kube-api-access-47qw6\") pod \"ceilometer-0\" (UID: \"406e4641-28f0-4f4e-80f0-6c98f0722560\") " pod="openstack/ceilometer-0" Oct 09 08:04:58 crc kubenswrapper[4715]: I1009 08:04:58.814992 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 08:04:59 crc kubenswrapper[4715]: I1009 08:04:59.066936 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4vnss"] Oct 09 08:04:59 crc kubenswrapper[4715]: I1009 08:04:59.258358 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:04:59 crc kubenswrapper[4715]: I1009 08:04:59.402813 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"406e4641-28f0-4f4e-80f0-6c98f0722560","Type":"ContainerStarted","Data":"ed7de251d25c005ab9f97ff0933f6fe8c4ec4f99235a16d6ddd82d67cb451bee"} Oct 09 08:04:59 crc kubenswrapper[4715]: I1009 08:04:59.404011 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-4vnss" event={"ID":"9997aadd-2c07-463c-b694-657dbd229eaf","Type":"ContainerStarted","Data":"3df6643151f7dd7153cfe5f3bc4d64ad91dd96020a4961c0a2bf0ed02ee14566"} Oct 09 08:05:00 crc kubenswrapper[4715]: I1009 08:05:00.150125 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="740413d6-f9d3-42d6-9459-66f79ee018ed" path="/var/lib/kubelet/pods/740413d6-f9d3-42d6-9459-66f79ee018ed/volumes" Oct 09 08:05:00 crc kubenswrapper[4715]: I1009 08:05:00.430286 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"406e4641-28f0-4f4e-80f0-6c98f0722560","Type":"ContainerStarted","Data":"b948506f023c51bca2a2b8e4d04eb0f5ae8103cd7b657406d48a8180d94f9311"} Oct 09 08:05:01 crc kubenswrapper[4715]: I1009 08:05:01.443331 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"406e4641-28f0-4f4e-80f0-6c98f0722560","Type":"ContainerStarted","Data":"2c6fc04b13f3eaab8776577a52760b1e8059fdac9a8744484a92c627cba009a3"} Oct 09 08:05:06 crc kubenswrapper[4715]: I1009 08:05:06.492708 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"406e4641-28f0-4f4e-80f0-6c98f0722560","Type":"ContainerStarted","Data":"442e9f806f52652188461db6dfb3808acbc022ae0d66b563088e35f6887747bf"} Oct 09 08:05:06 crc kubenswrapper[4715]: I1009 08:05:06.496002 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-4vnss" event={"ID":"9997aadd-2c07-463c-b694-657dbd229eaf","Type":"ContainerStarted","Data":"d6866cc7a0c40e97ff268190f7d249a154e01b5a63f5ef7e0c4ebd812f1cff2f"} Oct 09 08:05:06 crc kubenswrapper[4715]: I1009 08:05:06.518659 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-4vnss" podStartSLOduration=2.086795557 podStartE2EDuration="8.518609876s" podCreationTimestamp="2025-10-09 08:04:58 +0000 UTC" firstStartedPulling="2025-10-09 08:04:59.082516124 +0000 UTC m=+1129.775320132" lastFinishedPulling="2025-10-09 08:05:05.514330433 +0000 UTC m=+1136.207134451" observedRunningTime="2025-10-09 08:05:06.511036268 +0000 UTC m=+1137.203840276" watchObservedRunningTime="2025-10-09 08:05:06.518609876 +0000 UTC m=+1137.211413884" Oct 09 08:05:07 crc kubenswrapper[4715]: I1009 08:05:07.507041 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"406e4641-28f0-4f4e-80f0-6c98f0722560","Type":"ContainerStarted","Data":"ccc82b021da4e5b818960f5c9a5ded0c1ecad573ae5c65aa9389cb615eb23865"} Oct 09 08:05:07 crc kubenswrapper[4715]: I1009 08:05:07.507430 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 09 08:05:07 crc kubenswrapper[4715]: I1009 08:05:07.539098 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.868578223 podStartE2EDuration="9.539081736s" podCreationTimestamp="2025-10-09 08:04:58 +0000 UTC" firstStartedPulling="2025-10-09 08:04:59.268832657 +0000 UTC m=+1129.961636665" lastFinishedPulling="2025-10-09 08:05:06.93933617 +0000 UTC 
m=+1137.632140178" observedRunningTime="2025-10-09 08:05:07.534291858 +0000 UTC m=+1138.227095876" watchObservedRunningTime="2025-10-09 08:05:07.539081736 +0000 UTC m=+1138.231885744" Oct 09 08:05:08 crc kubenswrapper[4715]: I1009 08:05:08.199957 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:05:09 crc kubenswrapper[4715]: I1009 08:05:09.526285 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="406e4641-28f0-4f4e-80f0-6c98f0722560" containerName="ceilometer-central-agent" containerID="cri-o://b948506f023c51bca2a2b8e4d04eb0f5ae8103cd7b657406d48a8180d94f9311" gracePeriod=30 Oct 09 08:05:09 crc kubenswrapper[4715]: I1009 08:05:09.526345 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="406e4641-28f0-4f4e-80f0-6c98f0722560" containerName="sg-core" containerID="cri-o://442e9f806f52652188461db6dfb3808acbc022ae0d66b563088e35f6887747bf" gracePeriod=30 Oct 09 08:05:09 crc kubenswrapper[4715]: I1009 08:05:09.526374 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="406e4641-28f0-4f4e-80f0-6c98f0722560" containerName="ceilometer-notification-agent" containerID="cri-o://2c6fc04b13f3eaab8776577a52760b1e8059fdac9a8744484a92c627cba009a3" gracePeriod=30 Oct 09 08:05:09 crc kubenswrapper[4715]: I1009 08:05:09.526397 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="406e4641-28f0-4f4e-80f0-6c98f0722560" containerName="proxy-httpd" containerID="cri-o://ccc82b021da4e5b818960f5c9a5ded0c1ecad573ae5c65aa9389cb615eb23865" gracePeriod=30 Oct 09 08:05:10 crc kubenswrapper[4715]: I1009 08:05:10.537679 4715 generic.go:334] "Generic (PLEG): container finished" podID="406e4641-28f0-4f4e-80f0-6c98f0722560" containerID="ccc82b021da4e5b818960f5c9a5ded0c1ecad573ae5c65aa9389cb615eb23865" exitCode=0 Oct 09 
08:05:10 crc kubenswrapper[4715]: I1009 08:05:10.538025 4715 generic.go:334] "Generic (PLEG): container finished" podID="406e4641-28f0-4f4e-80f0-6c98f0722560" containerID="442e9f806f52652188461db6dfb3808acbc022ae0d66b563088e35f6887747bf" exitCode=2 Oct 09 08:05:10 crc kubenswrapper[4715]: I1009 08:05:10.538038 4715 generic.go:334] "Generic (PLEG): container finished" podID="406e4641-28f0-4f4e-80f0-6c98f0722560" containerID="b948506f023c51bca2a2b8e4d04eb0f5ae8103cd7b657406d48a8180d94f9311" exitCode=0 Oct 09 08:05:10 crc kubenswrapper[4715]: I1009 08:05:10.538989 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"406e4641-28f0-4f4e-80f0-6c98f0722560","Type":"ContainerDied","Data":"ccc82b021da4e5b818960f5c9a5ded0c1ecad573ae5c65aa9389cb615eb23865"} Oct 09 08:05:10 crc kubenswrapper[4715]: I1009 08:05:10.539029 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"406e4641-28f0-4f4e-80f0-6c98f0722560","Type":"ContainerDied","Data":"442e9f806f52652188461db6dfb3808acbc022ae0d66b563088e35f6887747bf"} Oct 09 08:05:10 crc kubenswrapper[4715]: I1009 08:05:10.539042 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"406e4641-28f0-4f4e-80f0-6c98f0722560","Type":"ContainerDied","Data":"b948506f023c51bca2a2b8e4d04eb0f5ae8103cd7b657406d48a8180d94f9311"} Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.085078 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.183465 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/406e4641-28f0-4f4e-80f0-6c98f0722560-log-httpd\") pod \"406e4641-28f0-4f4e-80f0-6c98f0722560\" (UID: \"406e4641-28f0-4f4e-80f0-6c98f0722560\") " Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.183612 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/406e4641-28f0-4f4e-80f0-6c98f0722560-run-httpd\") pod \"406e4641-28f0-4f4e-80f0-6c98f0722560\" (UID: \"406e4641-28f0-4f4e-80f0-6c98f0722560\") " Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.183667 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47qw6\" (UniqueName: \"kubernetes.io/projected/406e4641-28f0-4f4e-80f0-6c98f0722560-kube-api-access-47qw6\") pod \"406e4641-28f0-4f4e-80f0-6c98f0722560\" (UID: \"406e4641-28f0-4f4e-80f0-6c98f0722560\") " Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.183792 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/406e4641-28f0-4f4e-80f0-6c98f0722560-scripts\") pod \"406e4641-28f0-4f4e-80f0-6c98f0722560\" (UID: \"406e4641-28f0-4f4e-80f0-6c98f0722560\") " Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.183836 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/406e4641-28f0-4f4e-80f0-6c98f0722560-config-data\") pod \"406e4641-28f0-4f4e-80f0-6c98f0722560\" (UID: \"406e4641-28f0-4f4e-80f0-6c98f0722560\") " Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.184285 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/406e4641-28f0-4f4e-80f0-6c98f0722560-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "406e4641-28f0-4f4e-80f0-6c98f0722560" (UID: "406e4641-28f0-4f4e-80f0-6c98f0722560"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.184340 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/406e4641-28f0-4f4e-80f0-6c98f0722560-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "406e4641-28f0-4f4e-80f0-6c98f0722560" (UID: "406e4641-28f0-4f4e-80f0-6c98f0722560"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.184669 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/406e4641-28f0-4f4e-80f0-6c98f0722560-sg-core-conf-yaml\") pod \"406e4641-28f0-4f4e-80f0-6c98f0722560\" (UID: \"406e4641-28f0-4f4e-80f0-6c98f0722560\") " Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.184798 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/406e4641-28f0-4f4e-80f0-6c98f0722560-combined-ca-bundle\") pod \"406e4641-28f0-4f4e-80f0-6c98f0722560\" (UID: \"406e4641-28f0-4f4e-80f0-6c98f0722560\") " Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.185259 4715 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/406e4641-28f0-4f4e-80f0-6c98f0722560-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.185281 4715 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/406e4641-28f0-4f4e-80f0-6c98f0722560-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 
08:05:11.189218 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/406e4641-28f0-4f4e-80f0-6c98f0722560-scripts" (OuterVolumeSpecName: "scripts") pod "406e4641-28f0-4f4e-80f0-6c98f0722560" (UID: "406e4641-28f0-4f4e-80f0-6c98f0722560"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.189387 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/406e4641-28f0-4f4e-80f0-6c98f0722560-kube-api-access-47qw6" (OuterVolumeSpecName: "kube-api-access-47qw6") pod "406e4641-28f0-4f4e-80f0-6c98f0722560" (UID: "406e4641-28f0-4f4e-80f0-6c98f0722560"). InnerVolumeSpecName "kube-api-access-47qw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.223906 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/406e4641-28f0-4f4e-80f0-6c98f0722560-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "406e4641-28f0-4f4e-80f0-6c98f0722560" (UID: "406e4641-28f0-4f4e-80f0-6c98f0722560"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.271317 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/406e4641-28f0-4f4e-80f0-6c98f0722560-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "406e4641-28f0-4f4e-80f0-6c98f0722560" (UID: "406e4641-28f0-4f4e-80f0-6c98f0722560"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.287627 4715 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/406e4641-28f0-4f4e-80f0-6c98f0722560-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.287674 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47qw6\" (UniqueName: \"kubernetes.io/projected/406e4641-28f0-4f4e-80f0-6c98f0722560-kube-api-access-47qw6\") on node \"crc\" DevicePath \"\"" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.287690 4715 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/406e4641-28f0-4f4e-80f0-6c98f0722560-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.287701 4715 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/406e4641-28f0-4f4e-80f0-6c98f0722560-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.317451 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/406e4641-28f0-4f4e-80f0-6c98f0722560-config-data" (OuterVolumeSpecName: "config-data") pod "406e4641-28f0-4f4e-80f0-6c98f0722560" (UID: "406e4641-28f0-4f4e-80f0-6c98f0722560"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.388773 4715 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/406e4641-28f0-4f4e-80f0-6c98f0722560-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.550460 4715 generic.go:334] "Generic (PLEG): container finished" podID="406e4641-28f0-4f4e-80f0-6c98f0722560" containerID="2c6fc04b13f3eaab8776577a52760b1e8059fdac9a8744484a92c627cba009a3" exitCode=0 Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.550517 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"406e4641-28f0-4f4e-80f0-6c98f0722560","Type":"ContainerDied","Data":"2c6fc04b13f3eaab8776577a52760b1e8059fdac9a8744484a92c627cba009a3"} Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.550549 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"406e4641-28f0-4f4e-80f0-6c98f0722560","Type":"ContainerDied","Data":"ed7de251d25c005ab9f97ff0933f6fe8c4ec4f99235a16d6ddd82d67cb451bee"} Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.550572 4715 scope.go:117] "RemoveContainer" containerID="ccc82b021da4e5b818960f5c9a5ded0c1ecad573ae5c65aa9389cb615eb23865" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.550725 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.583927 4715 scope.go:117] "RemoveContainer" containerID="442e9f806f52652188461db6dfb3808acbc022ae0d66b563088e35f6887747bf" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.586022 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.599046 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.608900 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:05:11 crc kubenswrapper[4715]: E1009 08:05:11.609342 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="406e4641-28f0-4f4e-80f0-6c98f0722560" containerName="ceilometer-central-agent" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.609368 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="406e4641-28f0-4f4e-80f0-6c98f0722560" containerName="ceilometer-central-agent" Oct 09 08:05:11 crc kubenswrapper[4715]: E1009 08:05:11.609407 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="406e4641-28f0-4f4e-80f0-6c98f0722560" containerName="sg-core" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.609415 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="406e4641-28f0-4f4e-80f0-6c98f0722560" containerName="sg-core" Oct 09 08:05:11 crc kubenswrapper[4715]: E1009 08:05:11.609450 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="406e4641-28f0-4f4e-80f0-6c98f0722560" containerName="proxy-httpd" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.609459 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="406e4641-28f0-4f4e-80f0-6c98f0722560" containerName="proxy-httpd" Oct 09 08:05:11 crc kubenswrapper[4715]: E1009 08:05:11.609474 4715 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="406e4641-28f0-4f4e-80f0-6c98f0722560" containerName="ceilometer-notification-agent" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.609481 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="406e4641-28f0-4f4e-80f0-6c98f0722560" containerName="ceilometer-notification-agent" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.609684 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="406e4641-28f0-4f4e-80f0-6c98f0722560" containerName="sg-core" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.609705 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="406e4641-28f0-4f4e-80f0-6c98f0722560" containerName="ceilometer-central-agent" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.609728 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="406e4641-28f0-4f4e-80f0-6c98f0722560" containerName="proxy-httpd" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.609750 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="406e4641-28f0-4f4e-80f0-6c98f0722560" containerName="ceilometer-notification-agent" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.610035 4715 scope.go:117] "RemoveContainer" containerID="2c6fc04b13f3eaab8776577a52760b1e8059fdac9a8744484a92c627cba009a3" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.611864 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.615032 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.618226 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.622504 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.656543 4715 scope.go:117] "RemoveContainer" containerID="b948506f023c51bca2a2b8e4d04eb0f5ae8103cd7b657406d48a8180d94f9311" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.678466 4715 scope.go:117] "RemoveContainer" containerID="ccc82b021da4e5b818960f5c9a5ded0c1ecad573ae5c65aa9389cb615eb23865" Oct 09 08:05:11 crc kubenswrapper[4715]: E1009 08:05:11.678972 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccc82b021da4e5b818960f5c9a5ded0c1ecad573ae5c65aa9389cb615eb23865\": container with ID starting with ccc82b021da4e5b818960f5c9a5ded0c1ecad573ae5c65aa9389cb615eb23865 not found: ID does not exist" containerID="ccc82b021da4e5b818960f5c9a5ded0c1ecad573ae5c65aa9389cb615eb23865" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.679738 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccc82b021da4e5b818960f5c9a5ded0c1ecad573ae5c65aa9389cb615eb23865"} err="failed to get container status \"ccc82b021da4e5b818960f5c9a5ded0c1ecad573ae5c65aa9389cb615eb23865\": rpc error: code = NotFound desc = could not find container \"ccc82b021da4e5b818960f5c9a5ded0c1ecad573ae5c65aa9389cb615eb23865\": container with ID starting with ccc82b021da4e5b818960f5c9a5ded0c1ecad573ae5c65aa9389cb615eb23865 not found: ID does not exist" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 
08:05:11.679800 4715 scope.go:117] "RemoveContainer" containerID="442e9f806f52652188461db6dfb3808acbc022ae0d66b563088e35f6887747bf" Oct 09 08:05:11 crc kubenswrapper[4715]: E1009 08:05:11.680280 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"442e9f806f52652188461db6dfb3808acbc022ae0d66b563088e35f6887747bf\": container with ID starting with 442e9f806f52652188461db6dfb3808acbc022ae0d66b563088e35f6887747bf not found: ID does not exist" containerID="442e9f806f52652188461db6dfb3808acbc022ae0d66b563088e35f6887747bf" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.680335 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"442e9f806f52652188461db6dfb3808acbc022ae0d66b563088e35f6887747bf"} err="failed to get container status \"442e9f806f52652188461db6dfb3808acbc022ae0d66b563088e35f6887747bf\": rpc error: code = NotFound desc = could not find container \"442e9f806f52652188461db6dfb3808acbc022ae0d66b563088e35f6887747bf\": container with ID starting with 442e9f806f52652188461db6dfb3808acbc022ae0d66b563088e35f6887747bf not found: ID does not exist" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.680368 4715 scope.go:117] "RemoveContainer" containerID="2c6fc04b13f3eaab8776577a52760b1e8059fdac9a8744484a92c627cba009a3" Oct 09 08:05:11 crc kubenswrapper[4715]: E1009 08:05:11.680716 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c6fc04b13f3eaab8776577a52760b1e8059fdac9a8744484a92c627cba009a3\": container with ID starting with 2c6fc04b13f3eaab8776577a52760b1e8059fdac9a8744484a92c627cba009a3 not found: ID does not exist" containerID="2c6fc04b13f3eaab8776577a52760b1e8059fdac9a8744484a92c627cba009a3" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.680757 4715 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2c6fc04b13f3eaab8776577a52760b1e8059fdac9a8744484a92c627cba009a3"} err="failed to get container status \"2c6fc04b13f3eaab8776577a52760b1e8059fdac9a8744484a92c627cba009a3\": rpc error: code = NotFound desc = could not find container \"2c6fc04b13f3eaab8776577a52760b1e8059fdac9a8744484a92c627cba009a3\": container with ID starting with 2c6fc04b13f3eaab8776577a52760b1e8059fdac9a8744484a92c627cba009a3 not found: ID does not exist" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.680773 4715 scope.go:117] "RemoveContainer" containerID="b948506f023c51bca2a2b8e4d04eb0f5ae8103cd7b657406d48a8180d94f9311" Oct 09 08:05:11 crc kubenswrapper[4715]: E1009 08:05:11.681049 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b948506f023c51bca2a2b8e4d04eb0f5ae8103cd7b657406d48a8180d94f9311\": container with ID starting with b948506f023c51bca2a2b8e4d04eb0f5ae8103cd7b657406d48a8180d94f9311 not found: ID does not exist" containerID="b948506f023c51bca2a2b8e4d04eb0f5ae8103cd7b657406d48a8180d94f9311" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.681081 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b948506f023c51bca2a2b8e4d04eb0f5ae8103cd7b657406d48a8180d94f9311"} err="failed to get container status \"b948506f023c51bca2a2b8e4d04eb0f5ae8103cd7b657406d48a8180d94f9311\": rpc error: code = NotFound desc = could not find container \"b948506f023c51bca2a2b8e4d04eb0f5ae8103cd7b657406d48a8180d94f9311\": container with ID starting with b948506f023c51bca2a2b8e4d04eb0f5ae8103cd7b657406d48a8180d94f9311 not found: ID does not exist" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.695587 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a0601dd-993e-4be7-85cf-c087210d5d6b-scripts\") pod \"ceilometer-0\" (UID: 
\"4a0601dd-993e-4be7-85cf-c087210d5d6b\") " pod="openstack/ceilometer-0" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.695694 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a0601dd-993e-4be7-85cf-c087210d5d6b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4a0601dd-993e-4be7-85cf-c087210d5d6b\") " pod="openstack/ceilometer-0" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.695717 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4a0601dd-993e-4be7-85cf-c087210d5d6b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4a0601dd-993e-4be7-85cf-c087210d5d6b\") " pod="openstack/ceilometer-0" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.695742 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf5jm\" (UniqueName: \"kubernetes.io/projected/4a0601dd-993e-4be7-85cf-c087210d5d6b-kube-api-access-mf5jm\") pod \"ceilometer-0\" (UID: \"4a0601dd-993e-4be7-85cf-c087210d5d6b\") " pod="openstack/ceilometer-0" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.695780 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a0601dd-993e-4be7-85cf-c087210d5d6b-config-data\") pod \"ceilometer-0\" (UID: \"4a0601dd-993e-4be7-85cf-c087210d5d6b\") " pod="openstack/ceilometer-0" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.695795 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a0601dd-993e-4be7-85cf-c087210d5d6b-run-httpd\") pod \"ceilometer-0\" (UID: \"4a0601dd-993e-4be7-85cf-c087210d5d6b\") " pod="openstack/ceilometer-0" Oct 09 08:05:11 crc kubenswrapper[4715]: 
I1009 08:05:11.695811 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a0601dd-993e-4be7-85cf-c087210d5d6b-log-httpd\") pod \"ceilometer-0\" (UID: \"4a0601dd-993e-4be7-85cf-c087210d5d6b\") " pod="openstack/ceilometer-0" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.797987 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a0601dd-993e-4be7-85cf-c087210d5d6b-config-data\") pod \"ceilometer-0\" (UID: \"4a0601dd-993e-4be7-85cf-c087210d5d6b\") " pod="openstack/ceilometer-0" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.798076 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a0601dd-993e-4be7-85cf-c087210d5d6b-run-httpd\") pod \"ceilometer-0\" (UID: \"4a0601dd-993e-4be7-85cf-c087210d5d6b\") " pod="openstack/ceilometer-0" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.798127 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a0601dd-993e-4be7-85cf-c087210d5d6b-log-httpd\") pod \"ceilometer-0\" (UID: \"4a0601dd-993e-4be7-85cf-c087210d5d6b\") " pod="openstack/ceilometer-0" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.798289 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a0601dd-993e-4be7-85cf-c087210d5d6b-scripts\") pod \"ceilometer-0\" (UID: \"4a0601dd-993e-4be7-85cf-c087210d5d6b\") " pod="openstack/ceilometer-0" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.798523 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a0601dd-993e-4be7-85cf-c087210d5d6b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"4a0601dd-993e-4be7-85cf-c087210d5d6b\") " pod="openstack/ceilometer-0" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.798592 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4a0601dd-993e-4be7-85cf-c087210d5d6b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4a0601dd-993e-4be7-85cf-c087210d5d6b\") " pod="openstack/ceilometer-0" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.798642 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mf5jm\" (UniqueName: \"kubernetes.io/projected/4a0601dd-993e-4be7-85cf-c087210d5d6b-kube-api-access-mf5jm\") pod \"ceilometer-0\" (UID: \"4a0601dd-993e-4be7-85cf-c087210d5d6b\") " pod="openstack/ceilometer-0" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.798896 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a0601dd-993e-4be7-85cf-c087210d5d6b-log-httpd\") pod \"ceilometer-0\" (UID: \"4a0601dd-993e-4be7-85cf-c087210d5d6b\") " pod="openstack/ceilometer-0" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.798966 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a0601dd-993e-4be7-85cf-c087210d5d6b-run-httpd\") pod \"ceilometer-0\" (UID: \"4a0601dd-993e-4be7-85cf-c087210d5d6b\") " pod="openstack/ceilometer-0" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.803105 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a0601dd-993e-4be7-85cf-c087210d5d6b-scripts\") pod \"ceilometer-0\" (UID: \"4a0601dd-993e-4be7-85cf-c087210d5d6b\") " pod="openstack/ceilometer-0" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.804009 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/4a0601dd-993e-4be7-85cf-c087210d5d6b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4a0601dd-993e-4be7-85cf-c087210d5d6b\") " pod="openstack/ceilometer-0" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.804178 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a0601dd-993e-4be7-85cf-c087210d5d6b-config-data\") pod \"ceilometer-0\" (UID: \"4a0601dd-993e-4be7-85cf-c087210d5d6b\") " pod="openstack/ceilometer-0" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.816096 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a0601dd-993e-4be7-85cf-c087210d5d6b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4a0601dd-993e-4be7-85cf-c087210d5d6b\") " pod="openstack/ceilometer-0" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.818982 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf5jm\" (UniqueName: \"kubernetes.io/projected/4a0601dd-993e-4be7-85cf-c087210d5d6b-kube-api-access-mf5jm\") pod \"ceilometer-0\" (UID: \"4a0601dd-993e-4be7-85cf-c087210d5d6b\") " pod="openstack/ceilometer-0" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.945571 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 08:05:11 crc kubenswrapper[4715]: I1009 08:05:11.969064 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:05:12 crc kubenswrapper[4715]: I1009 08:05:12.147765 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="406e4641-28f0-4f4e-80f0-6c98f0722560" path="/var/lib/kubelet/pods/406e4641-28f0-4f4e-80f0-6c98f0722560/volumes" Oct 09 08:05:12 crc kubenswrapper[4715]: I1009 08:05:12.371796 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:05:12 crc kubenswrapper[4715]: I1009 08:05:12.564740 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a0601dd-993e-4be7-85cf-c087210d5d6b","Type":"ContainerStarted","Data":"82ec9849edb785d965c13e1063940a115f921d04c94b90bac05cd2e451236f70"} Oct 09 08:05:13 crc kubenswrapper[4715]: I1009 08:05:13.575066 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a0601dd-993e-4be7-85cf-c087210d5d6b","Type":"ContainerStarted","Data":"64333a17e24ffb5c7b0a6241db604c865a530c6a4cb2212bb19fe3625e5d1f34"} Oct 09 08:05:14 crc kubenswrapper[4715]: I1009 08:05:14.590646 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a0601dd-993e-4be7-85cf-c087210d5d6b","Type":"ContainerStarted","Data":"1228d56a8226f1b7d671e2816b59186792c606d7cbef7646fc947ae9e76214f4"} Oct 09 08:05:16 crc kubenswrapper[4715]: I1009 08:05:16.675907 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a0601dd-993e-4be7-85cf-c087210d5d6b","Type":"ContainerStarted","Data":"d930aa1f671eaf554348eb561b6b84288570e777ca1b4d2bb75a1534703b3203"} Oct 09 08:05:17 crc kubenswrapper[4715]: I1009 08:05:17.691195 4715 generic.go:334] "Generic (PLEG): container finished" podID="9997aadd-2c07-463c-b694-657dbd229eaf" 
containerID="d6866cc7a0c40e97ff268190f7d249a154e01b5a63f5ef7e0c4ebd812f1cff2f" exitCode=0 Oct 09 08:05:17 crc kubenswrapper[4715]: I1009 08:05:17.691620 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-4vnss" event={"ID":"9997aadd-2c07-463c-b694-657dbd229eaf","Type":"ContainerDied","Data":"d6866cc7a0c40e97ff268190f7d249a154e01b5a63f5ef7e0c4ebd812f1cff2f"} Oct 09 08:05:17 crc kubenswrapper[4715]: I1009 08:05:17.697484 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a0601dd-993e-4be7-85cf-c087210d5d6b","Type":"ContainerStarted","Data":"a0654435ad23c04a738142f680e40a827848201ac9c1bc1beef90be5f19da6d3"} Oct 09 08:05:17 crc kubenswrapper[4715]: I1009 08:05:17.697689 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4a0601dd-993e-4be7-85cf-c087210d5d6b" containerName="ceilometer-central-agent" containerID="cri-o://64333a17e24ffb5c7b0a6241db604c865a530c6a4cb2212bb19fe3625e5d1f34" gracePeriod=30 Oct 09 08:05:17 crc kubenswrapper[4715]: I1009 08:05:17.697923 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4a0601dd-993e-4be7-85cf-c087210d5d6b" containerName="sg-core" containerID="cri-o://d930aa1f671eaf554348eb561b6b84288570e777ca1b4d2bb75a1534703b3203" gracePeriod=30 Oct 09 08:05:17 crc kubenswrapper[4715]: I1009 08:05:17.698104 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4a0601dd-993e-4be7-85cf-c087210d5d6b" containerName="proxy-httpd" containerID="cri-o://a0654435ad23c04a738142f680e40a827848201ac9c1bc1beef90be5f19da6d3" gracePeriod=30 Oct 09 08:05:17 crc kubenswrapper[4715]: I1009 08:05:17.698102 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4a0601dd-993e-4be7-85cf-c087210d5d6b" 
containerName="ceilometer-notification-agent" containerID="cri-o://1228d56a8226f1b7d671e2816b59186792c606d7cbef7646fc947ae9e76214f4" gracePeriod=30 Oct 09 08:05:17 crc kubenswrapper[4715]: I1009 08:05:17.698163 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 09 08:05:17 crc kubenswrapper[4715]: I1009 08:05:17.748672 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.557503923 podStartE2EDuration="6.748651233s" podCreationTimestamp="2025-10-09 08:05:11 +0000 UTC" firstStartedPulling="2025-10-09 08:05:12.380120978 +0000 UTC m=+1143.072924986" lastFinishedPulling="2025-10-09 08:05:16.571268288 +0000 UTC m=+1147.264072296" observedRunningTime="2025-10-09 08:05:17.738101339 +0000 UTC m=+1148.430905367" watchObservedRunningTime="2025-10-09 08:05:17.748651233 +0000 UTC m=+1148.441455251" Oct 09 08:05:18 crc kubenswrapper[4715]: I1009 08:05:18.707381 4715 generic.go:334] "Generic (PLEG): container finished" podID="4a0601dd-993e-4be7-85cf-c087210d5d6b" containerID="a0654435ad23c04a738142f680e40a827848201ac9c1bc1beef90be5f19da6d3" exitCode=0 Oct 09 08:05:18 crc kubenswrapper[4715]: I1009 08:05:18.707704 4715 generic.go:334] "Generic (PLEG): container finished" podID="4a0601dd-993e-4be7-85cf-c087210d5d6b" containerID="d930aa1f671eaf554348eb561b6b84288570e777ca1b4d2bb75a1534703b3203" exitCode=2 Oct 09 08:05:18 crc kubenswrapper[4715]: I1009 08:05:18.707457 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a0601dd-993e-4be7-85cf-c087210d5d6b","Type":"ContainerDied","Data":"a0654435ad23c04a738142f680e40a827848201ac9c1bc1beef90be5f19da6d3"} Oct 09 08:05:18 crc kubenswrapper[4715]: I1009 08:05:18.707755 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"4a0601dd-993e-4be7-85cf-c087210d5d6b","Type":"ContainerDied","Data":"d930aa1f671eaf554348eb561b6b84288570e777ca1b4d2bb75a1534703b3203"} Oct 09 08:05:18 crc kubenswrapper[4715]: I1009 08:05:18.707770 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a0601dd-993e-4be7-85cf-c087210d5d6b","Type":"ContainerDied","Data":"1228d56a8226f1b7d671e2816b59186792c606d7cbef7646fc947ae9e76214f4"} Oct 09 08:05:18 crc kubenswrapper[4715]: I1009 08:05:18.707716 4715 generic.go:334] "Generic (PLEG): container finished" podID="4a0601dd-993e-4be7-85cf-c087210d5d6b" containerID="1228d56a8226f1b7d671e2816b59186792c606d7cbef7646fc947ae9e76214f4" exitCode=0 Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.035697 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-4vnss" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.139822 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9997aadd-2c07-463c-b694-657dbd229eaf-scripts\") pod \"9997aadd-2c07-463c-b694-657dbd229eaf\" (UID: \"9997aadd-2c07-463c-b694-657dbd229eaf\") " Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.139920 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9997aadd-2c07-463c-b694-657dbd229eaf-combined-ca-bundle\") pod \"9997aadd-2c07-463c-b694-657dbd229eaf\" (UID: \"9997aadd-2c07-463c-b694-657dbd229eaf\") " Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.139955 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nz4qj\" (UniqueName: \"kubernetes.io/projected/9997aadd-2c07-463c-b694-657dbd229eaf-kube-api-access-nz4qj\") pod \"9997aadd-2c07-463c-b694-657dbd229eaf\" (UID: \"9997aadd-2c07-463c-b694-657dbd229eaf\") " Oct 09 08:05:19 crc 
kubenswrapper[4715]: I1009 08:05:19.140022 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9997aadd-2c07-463c-b694-657dbd229eaf-config-data\") pod \"9997aadd-2c07-463c-b694-657dbd229eaf\" (UID: \"9997aadd-2c07-463c-b694-657dbd229eaf\") " Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.145757 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9997aadd-2c07-463c-b694-657dbd229eaf-kube-api-access-nz4qj" (OuterVolumeSpecName: "kube-api-access-nz4qj") pod "9997aadd-2c07-463c-b694-657dbd229eaf" (UID: "9997aadd-2c07-463c-b694-657dbd229eaf"). InnerVolumeSpecName "kube-api-access-nz4qj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.146684 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9997aadd-2c07-463c-b694-657dbd229eaf-scripts" (OuterVolumeSpecName: "scripts") pod "9997aadd-2c07-463c-b694-657dbd229eaf" (UID: "9997aadd-2c07-463c-b694-657dbd229eaf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.167565 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9997aadd-2c07-463c-b694-657dbd229eaf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9997aadd-2c07-463c-b694-657dbd229eaf" (UID: "9997aadd-2c07-463c-b694-657dbd229eaf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.184159 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9997aadd-2c07-463c-b694-657dbd229eaf-config-data" (OuterVolumeSpecName: "config-data") pod "9997aadd-2c07-463c-b694-657dbd229eaf" (UID: "9997aadd-2c07-463c-b694-657dbd229eaf"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.243206 4715 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9997aadd-2c07-463c-b694-657dbd229eaf-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.243329 4715 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9997aadd-2c07-463c-b694-657dbd229eaf-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.243353 4715 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9997aadd-2c07-463c-b694-657dbd229eaf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.243402 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nz4qj\" (UniqueName: \"kubernetes.io/projected/9997aadd-2c07-463c-b694-657dbd229eaf-kube-api-access-nz4qj\") on node \"crc\" DevicePath \"\"" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.460511 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.549867 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a0601dd-993e-4be7-85cf-c087210d5d6b-combined-ca-bundle\") pod \"4a0601dd-993e-4be7-85cf-c087210d5d6b\" (UID: \"4a0601dd-993e-4be7-85cf-c087210d5d6b\") " Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.549926 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a0601dd-993e-4be7-85cf-c087210d5d6b-run-httpd\") pod \"4a0601dd-993e-4be7-85cf-c087210d5d6b\" (UID: \"4a0601dd-993e-4be7-85cf-c087210d5d6b\") " Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.549955 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mf5jm\" (UniqueName: \"kubernetes.io/projected/4a0601dd-993e-4be7-85cf-c087210d5d6b-kube-api-access-mf5jm\") pod \"4a0601dd-993e-4be7-85cf-c087210d5d6b\" (UID: \"4a0601dd-993e-4be7-85cf-c087210d5d6b\") " Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.550015 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a0601dd-993e-4be7-85cf-c087210d5d6b-config-data\") pod \"4a0601dd-993e-4be7-85cf-c087210d5d6b\" (UID: \"4a0601dd-993e-4be7-85cf-c087210d5d6b\") " Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.550065 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4a0601dd-993e-4be7-85cf-c087210d5d6b-sg-core-conf-yaml\") pod \"4a0601dd-993e-4be7-85cf-c087210d5d6b\" (UID: \"4a0601dd-993e-4be7-85cf-c087210d5d6b\") " Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.550138 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/4a0601dd-993e-4be7-85cf-c087210d5d6b-scripts\") pod \"4a0601dd-993e-4be7-85cf-c087210d5d6b\" (UID: \"4a0601dd-993e-4be7-85cf-c087210d5d6b\") " Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.550170 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a0601dd-993e-4be7-85cf-c087210d5d6b-log-httpd\") pod \"4a0601dd-993e-4be7-85cf-c087210d5d6b\" (UID: \"4a0601dd-993e-4be7-85cf-c087210d5d6b\") " Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.550617 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a0601dd-993e-4be7-85cf-c087210d5d6b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4a0601dd-993e-4be7-85cf-c087210d5d6b" (UID: "4a0601dd-993e-4be7-85cf-c087210d5d6b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.551005 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a0601dd-993e-4be7-85cf-c087210d5d6b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4a0601dd-993e-4be7-85cf-c087210d5d6b" (UID: "4a0601dd-993e-4be7-85cf-c087210d5d6b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.554631 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a0601dd-993e-4be7-85cf-c087210d5d6b-scripts" (OuterVolumeSpecName: "scripts") pod "4a0601dd-993e-4be7-85cf-c087210d5d6b" (UID: "4a0601dd-993e-4be7-85cf-c087210d5d6b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.556033 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a0601dd-993e-4be7-85cf-c087210d5d6b-kube-api-access-mf5jm" (OuterVolumeSpecName: "kube-api-access-mf5jm") pod "4a0601dd-993e-4be7-85cf-c087210d5d6b" (UID: "4a0601dd-993e-4be7-85cf-c087210d5d6b"). InnerVolumeSpecName "kube-api-access-mf5jm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.575261 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a0601dd-993e-4be7-85cf-c087210d5d6b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4a0601dd-993e-4be7-85cf-c087210d5d6b" (UID: "4a0601dd-993e-4be7-85cf-c087210d5d6b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.625208 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a0601dd-993e-4be7-85cf-c087210d5d6b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a0601dd-993e-4be7-85cf-c087210d5d6b" (UID: "4a0601dd-993e-4be7-85cf-c087210d5d6b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.640605 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a0601dd-993e-4be7-85cf-c087210d5d6b-config-data" (OuterVolumeSpecName: "config-data") pod "4a0601dd-993e-4be7-85cf-c087210d5d6b" (UID: "4a0601dd-993e-4be7-85cf-c087210d5d6b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.651700 4715 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a0601dd-993e-4be7-85cf-c087210d5d6b-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.651727 4715 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a0601dd-993e-4be7-85cf-c087210d5d6b-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.651736 4715 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a0601dd-993e-4be7-85cf-c087210d5d6b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.651747 4715 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4a0601dd-993e-4be7-85cf-c087210d5d6b-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.651755 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mf5jm\" (UniqueName: \"kubernetes.io/projected/4a0601dd-993e-4be7-85cf-c087210d5d6b-kube-api-access-mf5jm\") on node \"crc\" DevicePath \"\"" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.651765 4715 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a0601dd-993e-4be7-85cf-c087210d5d6b-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.651774 4715 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4a0601dd-993e-4be7-85cf-c087210d5d6b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.720020 4715 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-4vnss" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.720057 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-4vnss" event={"ID":"9997aadd-2c07-463c-b694-657dbd229eaf","Type":"ContainerDied","Data":"3df6643151f7dd7153cfe5f3bc4d64ad91dd96020a4961c0a2bf0ed02ee14566"} Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.720098 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3df6643151f7dd7153cfe5f3bc4d64ad91dd96020a4961c0a2bf0ed02ee14566" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.724910 4715 generic.go:334] "Generic (PLEG): container finished" podID="4a0601dd-993e-4be7-85cf-c087210d5d6b" containerID="64333a17e24ffb5c7b0a6241db604c865a530c6a4cb2212bb19fe3625e5d1f34" exitCode=0 Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.724980 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a0601dd-993e-4be7-85cf-c087210d5d6b","Type":"ContainerDied","Data":"64333a17e24ffb5c7b0a6241db604c865a530c6a4cb2212bb19fe3625e5d1f34"} Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.725035 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4a0601dd-993e-4be7-85cf-c087210d5d6b","Type":"ContainerDied","Data":"82ec9849edb785d965c13e1063940a115f921d04c94b90bac05cd2e451236f70"} Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.725061 4715 scope.go:117] "RemoveContainer" containerID="a0654435ad23c04a738142f680e40a827848201ac9c1bc1beef90be5f19da6d3" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.726565 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.754963 4715 scope.go:117] "RemoveContainer" containerID="d930aa1f671eaf554348eb561b6b84288570e777ca1b4d2bb75a1534703b3203" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.780567 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.781894 4715 scope.go:117] "RemoveContainer" containerID="1228d56a8226f1b7d671e2816b59186792c606d7cbef7646fc947ae9e76214f4" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.792906 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.805159 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:05:19 crc kubenswrapper[4715]: E1009 08:05:19.805693 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9997aadd-2c07-463c-b694-657dbd229eaf" containerName="nova-cell0-conductor-db-sync" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.805717 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="9997aadd-2c07-463c-b694-657dbd229eaf" containerName="nova-cell0-conductor-db-sync" Oct 09 08:05:19 crc kubenswrapper[4715]: E1009 08:05:19.805737 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a0601dd-993e-4be7-85cf-c087210d5d6b" containerName="sg-core" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.805745 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a0601dd-993e-4be7-85cf-c087210d5d6b" containerName="sg-core" Oct 09 08:05:19 crc kubenswrapper[4715]: E1009 08:05:19.805758 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a0601dd-993e-4be7-85cf-c087210d5d6b" containerName="ceilometer-central-agent" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.805776 4715 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4a0601dd-993e-4be7-85cf-c087210d5d6b" containerName="ceilometer-central-agent" Oct 09 08:05:19 crc kubenswrapper[4715]: E1009 08:05:19.805806 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a0601dd-993e-4be7-85cf-c087210d5d6b" containerName="ceilometer-notification-agent" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.805814 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a0601dd-993e-4be7-85cf-c087210d5d6b" containerName="ceilometer-notification-agent" Oct 09 08:05:19 crc kubenswrapper[4715]: E1009 08:05:19.805830 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a0601dd-993e-4be7-85cf-c087210d5d6b" containerName="proxy-httpd" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.805837 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a0601dd-993e-4be7-85cf-c087210d5d6b" containerName="proxy-httpd" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.806052 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a0601dd-993e-4be7-85cf-c087210d5d6b" containerName="ceilometer-central-agent" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.806073 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a0601dd-993e-4be7-85cf-c087210d5d6b" containerName="proxy-httpd" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.806088 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a0601dd-993e-4be7-85cf-c087210d5d6b" containerName="sg-core" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.806103 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a0601dd-993e-4be7-85cf-c087210d5d6b" containerName="ceilometer-notification-agent" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.806132 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="9997aadd-2c07-463c-b694-657dbd229eaf" containerName="nova-cell0-conductor-db-sync" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 
08:05:19.808208 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.812147 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.812353 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.813744 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.822755 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.824115 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.827065 4715 scope.go:117] "RemoveContainer" containerID="64333a17e24ffb5c7b0a6241db604c865a530c6a4cb2212bb19fe3625e5d1f34" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.827395 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.827451 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-j5s75" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.865963 4715 scope.go:117] "RemoveContainer" containerID="a0654435ad23c04a738142f680e40a827848201ac9c1bc1beef90be5f19da6d3" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.866742 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b364ad0f-f451-4668-b657-23e9128a0b5f-run-httpd\") pod \"ceilometer-0\" (UID: \"b364ad0f-f451-4668-b657-23e9128a0b5f\") " 
pod="openstack/ceilometer-0" Oct 09 08:05:19 crc kubenswrapper[4715]: E1009 08:05:19.866851 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0654435ad23c04a738142f680e40a827848201ac9c1bc1beef90be5f19da6d3\": container with ID starting with a0654435ad23c04a738142f680e40a827848201ac9c1bc1beef90be5f19da6d3 not found: ID does not exist" containerID="a0654435ad23c04a738142f680e40a827848201ac9c1bc1beef90be5f19da6d3" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.866895 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0654435ad23c04a738142f680e40a827848201ac9c1bc1beef90be5f19da6d3"} err="failed to get container status \"a0654435ad23c04a738142f680e40a827848201ac9c1bc1beef90be5f19da6d3\": rpc error: code = NotFound desc = could not find container \"a0654435ad23c04a738142f680e40a827848201ac9c1bc1beef90be5f19da6d3\": container with ID starting with a0654435ad23c04a738142f680e40a827848201ac9c1bc1beef90be5f19da6d3 not found: ID does not exist" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.866928 4715 scope.go:117] "RemoveContainer" containerID="d930aa1f671eaf554348eb561b6b84288570e777ca1b4d2bb75a1534703b3203" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.868880 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b364ad0f-f451-4668-b657-23e9128a0b5f-config-data\") pod \"ceilometer-0\" (UID: \"b364ad0f-f451-4668-b657-23e9128a0b5f\") " pod="openstack/ceilometer-0" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.868994 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b364ad0f-f451-4668-b657-23e9128a0b5f-log-httpd\") pod \"ceilometer-0\" (UID: \"b364ad0f-f451-4668-b657-23e9128a0b5f\") " pod="openstack/ceilometer-0" 
Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.869238 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwmfg\" (UniqueName: \"kubernetes.io/projected/b364ad0f-f451-4668-b657-23e9128a0b5f-kube-api-access-pwmfg\") pod \"ceilometer-0\" (UID: \"b364ad0f-f451-4668-b657-23e9128a0b5f\") " pod="openstack/ceilometer-0" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.869338 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b364ad0f-f451-4668-b657-23e9128a0b5f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b364ad0f-f451-4668-b657-23e9128a0b5f\") " pod="openstack/ceilometer-0" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.869434 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b364ad0f-f451-4668-b657-23e9128a0b5f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b364ad0f-f451-4668-b657-23e9128a0b5f\") " pod="openstack/ceilometer-0" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.869496 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.869627 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b364ad0f-f451-4668-b657-23e9128a0b5f-scripts\") pod \"ceilometer-0\" (UID: \"b364ad0f-f451-4668-b657-23e9128a0b5f\") " pod="openstack/ceilometer-0" Oct 09 08:05:19 crc kubenswrapper[4715]: E1009 08:05:19.871193 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d930aa1f671eaf554348eb561b6b84288570e777ca1b4d2bb75a1534703b3203\": container with ID starting with 
d930aa1f671eaf554348eb561b6b84288570e777ca1b4d2bb75a1534703b3203 not found: ID does not exist" containerID="d930aa1f671eaf554348eb561b6b84288570e777ca1b4d2bb75a1534703b3203" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.871239 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d930aa1f671eaf554348eb561b6b84288570e777ca1b4d2bb75a1534703b3203"} err="failed to get container status \"d930aa1f671eaf554348eb561b6b84288570e777ca1b4d2bb75a1534703b3203\": rpc error: code = NotFound desc = could not find container \"d930aa1f671eaf554348eb561b6b84288570e777ca1b4d2bb75a1534703b3203\": container with ID starting with d930aa1f671eaf554348eb561b6b84288570e777ca1b4d2bb75a1534703b3203 not found: ID does not exist" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.871264 4715 scope.go:117] "RemoveContainer" containerID="1228d56a8226f1b7d671e2816b59186792c606d7cbef7646fc947ae9e76214f4" Oct 09 08:05:19 crc kubenswrapper[4715]: E1009 08:05:19.871781 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1228d56a8226f1b7d671e2816b59186792c606d7cbef7646fc947ae9e76214f4\": container with ID starting with 1228d56a8226f1b7d671e2816b59186792c606d7cbef7646fc947ae9e76214f4 not found: ID does not exist" containerID="1228d56a8226f1b7d671e2816b59186792c606d7cbef7646fc947ae9e76214f4" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.871832 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1228d56a8226f1b7d671e2816b59186792c606d7cbef7646fc947ae9e76214f4"} err="failed to get container status \"1228d56a8226f1b7d671e2816b59186792c606d7cbef7646fc947ae9e76214f4\": rpc error: code = NotFound desc = could not find container \"1228d56a8226f1b7d671e2816b59186792c606d7cbef7646fc947ae9e76214f4\": container with ID starting with 1228d56a8226f1b7d671e2816b59186792c606d7cbef7646fc947ae9e76214f4 not found: ID does not 
exist" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.871860 4715 scope.go:117] "RemoveContainer" containerID="64333a17e24ffb5c7b0a6241db604c865a530c6a4cb2212bb19fe3625e5d1f34" Oct 09 08:05:19 crc kubenswrapper[4715]: E1009 08:05:19.872334 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64333a17e24ffb5c7b0a6241db604c865a530c6a4cb2212bb19fe3625e5d1f34\": container with ID starting with 64333a17e24ffb5c7b0a6241db604c865a530c6a4cb2212bb19fe3625e5d1f34 not found: ID does not exist" containerID="64333a17e24ffb5c7b0a6241db604c865a530c6a4cb2212bb19fe3625e5d1f34" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.872363 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64333a17e24ffb5c7b0a6241db604c865a530c6a4cb2212bb19fe3625e5d1f34"} err="failed to get container status \"64333a17e24ffb5c7b0a6241db604c865a530c6a4cb2212bb19fe3625e5d1f34\": rpc error: code = NotFound desc = could not find container \"64333a17e24ffb5c7b0a6241db604c865a530c6a4cb2212bb19fe3625e5d1f34\": container with ID starting with 64333a17e24ffb5c7b0a6241db604c865a530c6a4cb2212bb19fe3625e5d1f34 not found: ID does not exist" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.971803 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwmfg\" (UniqueName: \"kubernetes.io/projected/b364ad0f-f451-4668-b657-23e9128a0b5f-kube-api-access-pwmfg\") pod \"ceilometer-0\" (UID: \"b364ad0f-f451-4668-b657-23e9128a0b5f\") " pod="openstack/ceilometer-0" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.971893 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b364ad0f-f451-4668-b657-23e9128a0b5f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b364ad0f-f451-4668-b657-23e9128a0b5f\") " pod="openstack/ceilometer-0" Oct 09 08:05:19 crc 
kubenswrapper[4715]: I1009 08:05:19.971930 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0f42f3a-98f5-442f-a169-3d7080e5fea3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b0f42f3a-98f5-442f-a169-3d7080e5fea3\") " pod="openstack/nova-cell0-conductor-0" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.971966 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b364ad0f-f451-4668-b657-23e9128a0b5f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b364ad0f-f451-4668-b657-23e9128a0b5f\") " pod="openstack/ceilometer-0" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.972000 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc4g2\" (UniqueName: \"kubernetes.io/projected/b0f42f3a-98f5-442f-a169-3d7080e5fea3-kube-api-access-xc4g2\") pod \"nova-cell0-conductor-0\" (UID: \"b0f42f3a-98f5-442f-a169-3d7080e5fea3\") " pod="openstack/nova-cell0-conductor-0" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.972061 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b364ad0f-f451-4668-b657-23e9128a0b5f-scripts\") pod \"ceilometer-0\" (UID: \"b364ad0f-f451-4668-b657-23e9128a0b5f\") " pod="openstack/ceilometer-0" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.972117 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b364ad0f-f451-4668-b657-23e9128a0b5f-run-httpd\") pod \"ceilometer-0\" (UID: \"b364ad0f-f451-4668-b657-23e9128a0b5f\") " pod="openstack/ceilometer-0" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.972136 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b364ad0f-f451-4668-b657-23e9128a0b5f-config-data\") pod \"ceilometer-0\" (UID: \"b364ad0f-f451-4668-b657-23e9128a0b5f\") " pod="openstack/ceilometer-0" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.972168 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b364ad0f-f451-4668-b657-23e9128a0b5f-log-httpd\") pod \"ceilometer-0\" (UID: \"b364ad0f-f451-4668-b657-23e9128a0b5f\") " pod="openstack/ceilometer-0" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.972268 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0f42f3a-98f5-442f-a169-3d7080e5fea3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b0f42f3a-98f5-442f-a169-3d7080e5fea3\") " pod="openstack/nova-cell0-conductor-0" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.972919 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b364ad0f-f451-4668-b657-23e9128a0b5f-log-httpd\") pod \"ceilometer-0\" (UID: \"b364ad0f-f451-4668-b657-23e9128a0b5f\") " pod="openstack/ceilometer-0" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.973190 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b364ad0f-f451-4668-b657-23e9128a0b5f-run-httpd\") pod \"ceilometer-0\" (UID: \"b364ad0f-f451-4668-b657-23e9128a0b5f\") " pod="openstack/ceilometer-0" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.976651 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b364ad0f-f451-4668-b657-23e9128a0b5f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b364ad0f-f451-4668-b657-23e9128a0b5f\") " pod="openstack/ceilometer-0" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 
08:05:19.979709 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b364ad0f-f451-4668-b657-23e9128a0b5f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b364ad0f-f451-4668-b657-23e9128a0b5f\") " pod="openstack/ceilometer-0" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.980178 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b364ad0f-f451-4668-b657-23e9128a0b5f-config-data\") pod \"ceilometer-0\" (UID: \"b364ad0f-f451-4668-b657-23e9128a0b5f\") " pod="openstack/ceilometer-0" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.980445 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b364ad0f-f451-4668-b657-23e9128a0b5f-scripts\") pod \"ceilometer-0\" (UID: \"b364ad0f-f451-4668-b657-23e9128a0b5f\") " pod="openstack/ceilometer-0" Oct 09 08:05:19 crc kubenswrapper[4715]: I1009 08:05:19.992083 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwmfg\" (UniqueName: \"kubernetes.io/projected/b364ad0f-f451-4668-b657-23e9128a0b5f-kube-api-access-pwmfg\") pod \"ceilometer-0\" (UID: \"b364ad0f-f451-4668-b657-23e9128a0b5f\") " pod="openstack/ceilometer-0" Oct 09 08:05:20 crc kubenswrapper[4715]: I1009 08:05:20.074489 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0f42f3a-98f5-442f-a169-3d7080e5fea3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b0f42f3a-98f5-442f-a169-3d7080e5fea3\") " pod="openstack/nova-cell0-conductor-0" Oct 09 08:05:20 crc kubenswrapper[4715]: I1009 08:05:20.074601 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc4g2\" (UniqueName: \"kubernetes.io/projected/b0f42f3a-98f5-442f-a169-3d7080e5fea3-kube-api-access-xc4g2\") pod 
\"nova-cell0-conductor-0\" (UID: \"b0f42f3a-98f5-442f-a169-3d7080e5fea3\") " pod="openstack/nova-cell0-conductor-0" Oct 09 08:05:20 crc kubenswrapper[4715]: I1009 08:05:20.074779 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0f42f3a-98f5-442f-a169-3d7080e5fea3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b0f42f3a-98f5-442f-a169-3d7080e5fea3\") " pod="openstack/nova-cell0-conductor-0" Oct 09 08:05:20 crc kubenswrapper[4715]: I1009 08:05:20.078184 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0f42f3a-98f5-442f-a169-3d7080e5fea3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b0f42f3a-98f5-442f-a169-3d7080e5fea3\") " pod="openstack/nova-cell0-conductor-0" Oct 09 08:05:20 crc kubenswrapper[4715]: I1009 08:05:20.079988 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0f42f3a-98f5-442f-a169-3d7080e5fea3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b0f42f3a-98f5-442f-a169-3d7080e5fea3\") " pod="openstack/nova-cell0-conductor-0" Oct 09 08:05:20 crc kubenswrapper[4715]: I1009 08:05:20.103240 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc4g2\" (UniqueName: \"kubernetes.io/projected/b0f42f3a-98f5-442f-a169-3d7080e5fea3-kube-api-access-xc4g2\") pod \"nova-cell0-conductor-0\" (UID: \"b0f42f3a-98f5-442f-a169-3d7080e5fea3\") " pod="openstack/nova-cell0-conductor-0" Oct 09 08:05:20 crc kubenswrapper[4715]: I1009 08:05:20.128684 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 08:05:20 crc kubenswrapper[4715]: I1009 08:05:20.150918 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a0601dd-993e-4be7-85cf-c087210d5d6b" path="/var/lib/kubelet/pods/4a0601dd-993e-4be7-85cf-c087210d5d6b/volumes" Oct 09 08:05:20 crc kubenswrapper[4715]: I1009 08:05:20.161771 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 09 08:05:20 crc kubenswrapper[4715]: I1009 08:05:20.584065 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:05:20 crc kubenswrapper[4715]: W1009 08:05:20.584612 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb364ad0f_f451_4668_b657_23e9128a0b5f.slice/crio-db39cf168ce987b5aa1dfcd5f51803b5deb2bcb5ff109b4f64888473fa7c959a WatchSource:0}: Error finding container db39cf168ce987b5aa1dfcd5f51803b5deb2bcb5ff109b4f64888473fa7c959a: Status 404 returned error can't find the container with id db39cf168ce987b5aa1dfcd5f51803b5deb2bcb5ff109b4f64888473fa7c959a Oct 09 08:05:20 crc kubenswrapper[4715]: I1009 08:05:20.643652 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 09 08:05:20 crc kubenswrapper[4715]: W1009 08:05:20.645167 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0f42f3a_98f5_442f_a169_3d7080e5fea3.slice/crio-1635c60738668606e0d52312c2461495baea58ffdef53643217c1d9ab8b5c007 WatchSource:0}: Error finding container 1635c60738668606e0d52312c2461495baea58ffdef53643217c1d9ab8b5c007: Status 404 returned error can't find the container with id 1635c60738668606e0d52312c2461495baea58ffdef53643217c1d9ab8b5c007 Oct 09 08:05:20 crc kubenswrapper[4715]: I1009 08:05:20.735329 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"b364ad0f-f451-4668-b657-23e9128a0b5f","Type":"ContainerStarted","Data":"db39cf168ce987b5aa1dfcd5f51803b5deb2bcb5ff109b4f64888473fa7c959a"} Oct 09 08:05:20 crc kubenswrapper[4715]: I1009 08:05:20.737294 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b0f42f3a-98f5-442f-a169-3d7080e5fea3","Type":"ContainerStarted","Data":"1635c60738668606e0d52312c2461495baea58ffdef53643217c1d9ab8b5c007"} Oct 09 08:05:21 crc kubenswrapper[4715]: I1009 08:05:21.751891 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b364ad0f-f451-4668-b657-23e9128a0b5f","Type":"ContainerStarted","Data":"ed2e7100f5b18d3d952f92059773d0dde256c44bc758ee0e0d5afcdbe157d6d8"} Oct 09 08:05:21 crc kubenswrapper[4715]: I1009 08:05:21.756276 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b0f42f3a-98f5-442f-a169-3d7080e5fea3","Type":"ContainerStarted","Data":"802ba9b1c95ec79b4bfc771ea371c4fefe45faf295dc5ebb14a5c4170f28ef81"} Oct 09 08:05:21 crc kubenswrapper[4715]: I1009 08:05:21.762138 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 09 08:05:21 crc kubenswrapper[4715]: I1009 08:05:21.800681 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.80065789 podStartE2EDuration="2.80065789s" podCreationTimestamp="2025-10-09 08:05:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:05:21.787845831 +0000 UTC m=+1152.480649839" watchObservedRunningTime="2025-10-09 08:05:21.80065789 +0000 UTC m=+1152.493461908" Oct 09 08:05:22 crc kubenswrapper[4715]: I1009 08:05:22.765993 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b364ad0f-f451-4668-b657-23e9128a0b5f","Type":"ContainerStarted","Data":"37a6d79682203669cef3366b4f04a177609251add88d72fe723b2aaa5e0013db"} Oct 09 08:05:22 crc kubenswrapper[4715]: I1009 08:05:22.766377 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b364ad0f-f451-4668-b657-23e9128a0b5f","Type":"ContainerStarted","Data":"f9bf11a64d9129988dec1367f607f8f8aaa7ed3f976a2e85f0846e407867f6bd"} Oct 09 08:05:24 crc kubenswrapper[4715]: I1009 08:05:24.784923 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b364ad0f-f451-4668-b657-23e9128a0b5f","Type":"ContainerStarted","Data":"4135466cb4a9563e69be24a3f65fc8b195d40b02bedb2667c69b924773b5be60"} Oct 09 08:05:24 crc kubenswrapper[4715]: I1009 08:05:24.785514 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 09 08:05:24 crc kubenswrapper[4715]: I1009 08:05:24.818190 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.727016447 podStartE2EDuration="5.818164703s" podCreationTimestamp="2025-10-09 08:05:19 +0000 UTC" firstStartedPulling="2025-10-09 08:05:20.586964878 +0000 UTC m=+1151.279768886" lastFinishedPulling="2025-10-09 08:05:23.678113134 +0000 UTC m=+1154.370917142" observedRunningTime="2025-10-09 08:05:24.804355435 +0000 UTC m=+1155.497159443" watchObservedRunningTime="2025-10-09 08:05:24.818164703 +0000 UTC m=+1155.510968721" Oct 09 08:05:25 crc kubenswrapper[4715]: I1009 08:05:25.190797 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 09 08:05:25 crc kubenswrapper[4715]: I1009 08:05:25.652054 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-bmknt"] Oct 09 08:05:25 crc kubenswrapper[4715]: I1009 08:05:25.653562 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-bmknt" Oct 09 08:05:25 crc kubenswrapper[4715]: I1009 08:05:25.657204 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 09 08:05:25 crc kubenswrapper[4715]: I1009 08:05:25.657362 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 09 08:05:25 crc kubenswrapper[4715]: I1009 08:05:25.664581 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-bmknt"] Oct 09 08:05:25 crc kubenswrapper[4715]: I1009 08:05:25.775981 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0884103-2ca9-41fa-94ab-19ce6ba49364-scripts\") pod \"nova-cell0-cell-mapping-bmknt\" (UID: \"a0884103-2ca9-41fa-94ab-19ce6ba49364\") " pod="openstack/nova-cell0-cell-mapping-bmknt" Oct 09 08:05:25 crc kubenswrapper[4715]: I1009 08:05:25.776035 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv55p\" (UniqueName: \"kubernetes.io/projected/a0884103-2ca9-41fa-94ab-19ce6ba49364-kube-api-access-rv55p\") pod \"nova-cell0-cell-mapping-bmknt\" (UID: \"a0884103-2ca9-41fa-94ab-19ce6ba49364\") " pod="openstack/nova-cell0-cell-mapping-bmknt" Oct 09 08:05:25 crc kubenswrapper[4715]: I1009 08:05:25.776093 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0884103-2ca9-41fa-94ab-19ce6ba49364-config-data\") pod \"nova-cell0-cell-mapping-bmknt\" (UID: \"a0884103-2ca9-41fa-94ab-19ce6ba49364\") " pod="openstack/nova-cell0-cell-mapping-bmknt" Oct 09 08:05:25 crc kubenswrapper[4715]: I1009 08:05:25.776157 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a0884103-2ca9-41fa-94ab-19ce6ba49364-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-bmknt\" (UID: \"a0884103-2ca9-41fa-94ab-19ce6ba49364\") " pod="openstack/nova-cell0-cell-mapping-bmknt" Oct 09 08:05:25 crc kubenswrapper[4715]: I1009 08:05:25.812965 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 09 08:05:25 crc kubenswrapper[4715]: I1009 08:05:25.816297 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 09 08:05:25 crc kubenswrapper[4715]: I1009 08:05:25.822956 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 09 08:05:25 crc kubenswrapper[4715]: I1009 08:05:25.837345 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 09 08:05:25 crc kubenswrapper[4715]: I1009 08:05:25.839293 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 08:05:25 crc kubenswrapper[4715]: I1009 08:05:25.841563 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 09 08:05:25 crc kubenswrapper[4715]: I1009 08:05:25.879490 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 09 08:05:25 crc kubenswrapper[4715]: I1009 08:05:25.880479 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv55p\" (UniqueName: \"kubernetes.io/projected/a0884103-2ca9-41fa-94ab-19ce6ba49364-kube-api-access-rv55p\") pod \"nova-cell0-cell-mapping-bmknt\" (UID: \"a0884103-2ca9-41fa-94ab-19ce6ba49364\") " pod="openstack/nova-cell0-cell-mapping-bmknt" Oct 09 08:05:25 crc kubenswrapper[4715]: I1009 08:05:25.880513 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b321002e-eb01-4e98-902e-7e3848db47a3-config-data\") 
pod \"nova-api-0\" (UID: \"b321002e-eb01-4e98-902e-7e3848db47a3\") " pod="openstack/nova-api-0" Oct 09 08:05:25 crc kubenswrapper[4715]: I1009 08:05:25.880573 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0884103-2ca9-41fa-94ab-19ce6ba49364-config-data\") pod \"nova-cell0-cell-mapping-bmknt\" (UID: \"a0884103-2ca9-41fa-94ab-19ce6ba49364\") " pod="openstack/nova-cell0-cell-mapping-bmknt" Oct 09 08:05:25 crc kubenswrapper[4715]: I1009 08:05:25.880591 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrqfw\" (UniqueName: \"kubernetes.io/projected/b321002e-eb01-4e98-902e-7e3848db47a3-kube-api-access-qrqfw\") pod \"nova-api-0\" (UID: \"b321002e-eb01-4e98-902e-7e3848db47a3\") " pod="openstack/nova-api-0" Oct 09 08:05:25 crc kubenswrapper[4715]: I1009 08:05:25.880613 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b321002e-eb01-4e98-902e-7e3848db47a3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b321002e-eb01-4e98-902e-7e3848db47a3\") " pod="openstack/nova-api-0" Oct 09 08:05:25 crc kubenswrapper[4715]: I1009 08:05:25.880669 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0884103-2ca9-41fa-94ab-19ce6ba49364-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-bmknt\" (UID: \"a0884103-2ca9-41fa-94ab-19ce6ba49364\") " pod="openstack/nova-cell0-cell-mapping-bmknt" Oct 09 08:05:25 crc kubenswrapper[4715]: I1009 08:05:25.880702 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b321002e-eb01-4e98-902e-7e3848db47a3-logs\") pod \"nova-api-0\" (UID: \"b321002e-eb01-4e98-902e-7e3848db47a3\") " pod="openstack/nova-api-0" Oct 09 
08:05:25 crc kubenswrapper[4715]: I1009 08:05:25.880734 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0884103-2ca9-41fa-94ab-19ce6ba49364-scripts\") pod \"nova-cell0-cell-mapping-bmknt\" (UID: \"a0884103-2ca9-41fa-94ab-19ce6ba49364\") " pod="openstack/nova-cell0-cell-mapping-bmknt" Oct 09 08:05:25 crc kubenswrapper[4715]: I1009 08:05:25.894281 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0884103-2ca9-41fa-94ab-19ce6ba49364-scripts\") pod \"nova-cell0-cell-mapping-bmknt\" (UID: \"a0884103-2ca9-41fa-94ab-19ce6ba49364\") " pod="openstack/nova-cell0-cell-mapping-bmknt" Oct 09 08:05:25 crc kubenswrapper[4715]: I1009 08:05:25.894373 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0884103-2ca9-41fa-94ab-19ce6ba49364-config-data\") pod \"nova-cell0-cell-mapping-bmknt\" (UID: \"a0884103-2ca9-41fa-94ab-19ce6ba49364\") " pod="openstack/nova-cell0-cell-mapping-bmknt" Oct 09 08:05:25 crc kubenswrapper[4715]: I1009 08:05:25.900300 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0884103-2ca9-41fa-94ab-19ce6ba49364-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-bmknt\" (UID: \"a0884103-2ca9-41fa-94ab-19ce6ba49364\") " pod="openstack/nova-cell0-cell-mapping-bmknt" Oct 09 08:05:25 crc kubenswrapper[4715]: I1009 08:05:25.945055 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 08:05:25 crc kubenswrapper[4715]: I1009 08:05:25.971724 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv55p\" (UniqueName: \"kubernetes.io/projected/a0884103-2ca9-41fa-94ab-19ce6ba49364-kube-api-access-rv55p\") pod \"nova-cell0-cell-mapping-bmknt\" (UID: \"a0884103-2ca9-41fa-94ab-19ce6ba49364\") " 
pod="openstack/nova-cell0-cell-mapping-bmknt" Oct 09 08:05:25 crc kubenswrapper[4715]: I1009 08:05:25.972314 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-bmknt" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:25.999937 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrqfw\" (UniqueName: \"kubernetes.io/projected/b321002e-eb01-4e98-902e-7e3848db47a3-kube-api-access-qrqfw\") pod \"nova-api-0\" (UID: \"b321002e-eb01-4e98-902e-7e3848db47a3\") " pod="openstack/nova-api-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.000068 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b321002e-eb01-4e98-902e-7e3848db47a3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b321002e-eb01-4e98-902e-7e3848db47a3\") " pod="openstack/nova-api-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.000116 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4edd5c7-930b-4aa0-9c81-50ba417a5c89-config-data\") pod \"nova-metadata-0\" (UID: \"b4edd5c7-930b-4aa0-9c81-50ba417a5c89\") " pod="openstack/nova-metadata-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.000356 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b321002e-eb01-4e98-902e-7e3848db47a3-logs\") pod \"nova-api-0\" (UID: \"b321002e-eb01-4e98-902e-7e3848db47a3\") " pod="openstack/nova-api-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.000564 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz55q\" (UniqueName: \"kubernetes.io/projected/b4edd5c7-930b-4aa0-9c81-50ba417a5c89-kube-api-access-bz55q\") pod \"nova-metadata-0\" (UID: 
\"b4edd5c7-930b-4aa0-9c81-50ba417a5c89\") " pod="openstack/nova-metadata-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.000607 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4edd5c7-930b-4aa0-9c81-50ba417a5c89-logs\") pod \"nova-metadata-0\" (UID: \"b4edd5c7-930b-4aa0-9c81-50ba417a5c89\") " pod="openstack/nova-metadata-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.000651 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b321002e-eb01-4e98-902e-7e3848db47a3-config-data\") pod \"nova-api-0\" (UID: \"b321002e-eb01-4e98-902e-7e3848db47a3\") " pod="openstack/nova-api-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.000690 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4edd5c7-930b-4aa0-9c81-50ba417a5c89-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b4edd5c7-930b-4aa0-9c81-50ba417a5c89\") " pod="openstack/nova-metadata-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.002232 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b321002e-eb01-4e98-902e-7e3848db47a3-logs\") pod \"nova-api-0\" (UID: \"b321002e-eb01-4e98-902e-7e3848db47a3\") " pod="openstack/nova-api-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.022498 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b321002e-eb01-4e98-902e-7e3848db47a3-config-data\") pod \"nova-api-0\" (UID: \"b321002e-eb01-4e98-902e-7e3848db47a3\") " pod="openstack/nova-api-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.028880 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/b321002e-eb01-4e98-902e-7e3848db47a3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b321002e-eb01-4e98-902e-7e3848db47a3\") " pod="openstack/nova-api-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.060734 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.062069 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.067825 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrqfw\" (UniqueName: \"kubernetes.io/projected/b321002e-eb01-4e98-902e-7e3848db47a3-kube-api-access-qrqfw\") pod \"nova-api-0\" (UID: \"b321002e-eb01-4e98-902e-7e3848db47a3\") " pod="openstack/nova-api-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.068205 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.076015 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-fp7xt"] Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.080411 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-fp7xt" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.103132 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4edd5c7-930b-4aa0-9c81-50ba417a5c89-config-data\") pod \"nova-metadata-0\" (UID: \"b4edd5c7-930b-4aa0-9c81-50ba417a5c89\") " pod="openstack/nova-metadata-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.103212 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eee7419-a80a-489f-8f06-7b4188803d03-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5eee7419-a80a-489f-8f06-7b4188803d03\") " pod="openstack/nova-scheduler-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.103321 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rltb7\" (UniqueName: \"kubernetes.io/projected/5eee7419-a80a-489f-8f06-7b4188803d03-kube-api-access-rltb7\") pod \"nova-scheduler-0\" (UID: \"5eee7419-a80a-489f-8f06-7b4188803d03\") " pod="openstack/nova-scheduler-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.103367 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz55q\" (UniqueName: \"kubernetes.io/projected/b4edd5c7-930b-4aa0-9c81-50ba417a5c89-kube-api-access-bz55q\") pod \"nova-metadata-0\" (UID: \"b4edd5c7-930b-4aa0-9c81-50ba417a5c89\") " pod="openstack/nova-metadata-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.103395 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4edd5c7-930b-4aa0-9c81-50ba417a5c89-logs\") pod \"nova-metadata-0\" (UID: \"b4edd5c7-930b-4aa0-9c81-50ba417a5c89\") " pod="openstack/nova-metadata-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.104037 4715 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4edd5c7-930b-4aa0-9c81-50ba417a5c89-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b4edd5c7-930b-4aa0-9c81-50ba417a5c89\") " pod="openstack/nova-metadata-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.104134 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eee7419-a80a-489f-8f06-7b4188803d03-config-data\") pod \"nova-scheduler-0\" (UID: \"5eee7419-a80a-489f-8f06-7b4188803d03\") " pod="openstack/nova-scheduler-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.110861 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4edd5c7-930b-4aa0-9c81-50ba417a5c89-logs\") pod \"nova-metadata-0\" (UID: \"b4edd5c7-930b-4aa0-9c81-50ba417a5c89\") " pod="openstack/nova-metadata-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.111359 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4edd5c7-930b-4aa0-9c81-50ba417a5c89-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b4edd5c7-930b-4aa0-9c81-50ba417a5c89\") " pod="openstack/nova-metadata-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.099011 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.156327 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4edd5c7-930b-4aa0-9c81-50ba417a5c89-config-data\") pod \"nova-metadata-0\" (UID: \"b4edd5c7-930b-4aa0-9c81-50ba417a5c89\") " pod="openstack/nova-metadata-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.177167 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.174630 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz55q\" (UniqueName: \"kubernetes.io/projected/b4edd5c7-930b-4aa0-9c81-50ba417a5c89-kube-api-access-bz55q\") pod \"nova-metadata-0\" (UID: \"b4edd5c7-930b-4aa0-9c81-50ba417a5c89\") " pod="openstack/nova-metadata-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.222870 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-fp7xt"] Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.247261 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70aefddd-4fff-4560-a534-52b0e9ea0f8f-config\") pod \"dnsmasq-dns-845d6d6f59-fp7xt\" (UID: \"70aefddd-4fff-4560-a534-52b0e9ea0f8f\") " pod="openstack/dnsmasq-dns-845d6d6f59-fp7xt" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.247787 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/70aefddd-4fff-4560-a534-52b0e9ea0f8f-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-fp7xt\" (UID: \"70aefddd-4fff-4560-a534-52b0e9ea0f8f\") " pod="openstack/dnsmasq-dns-845d6d6f59-fp7xt" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.247837 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70aefddd-4fff-4560-a534-52b0e9ea0f8f-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-fp7xt\" (UID: \"70aefddd-4fff-4560-a534-52b0e9ea0f8f\") " pod="openstack/dnsmasq-dns-845d6d6f59-fp7xt" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.248068 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rltb7\" (UniqueName: 
\"kubernetes.io/projected/5eee7419-a80a-489f-8f06-7b4188803d03-kube-api-access-rltb7\") pod \"nova-scheduler-0\" (UID: \"5eee7419-a80a-489f-8f06-7b4188803d03\") " pod="openstack/nova-scheduler-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.248336 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eee7419-a80a-489f-8f06-7b4188803d03-config-data\") pod \"nova-scheduler-0\" (UID: \"5eee7419-a80a-489f-8f06-7b4188803d03\") " pod="openstack/nova-scheduler-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.248480 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/70aefddd-4fff-4560-a534-52b0e9ea0f8f-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-fp7xt\" (UID: \"70aefddd-4fff-4560-a534-52b0e9ea0f8f\") " pod="openstack/dnsmasq-dns-845d6d6f59-fp7xt" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.248604 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s28lr\" (UniqueName: \"kubernetes.io/projected/70aefddd-4fff-4560-a534-52b0e9ea0f8f-kube-api-access-s28lr\") pod \"dnsmasq-dns-845d6d6f59-fp7xt\" (UID: \"70aefddd-4fff-4560-a534-52b0e9ea0f8f\") " pod="openstack/dnsmasq-dns-845d6d6f59-fp7xt" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.248676 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eee7419-a80a-489f-8f06-7b4188803d03-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5eee7419-a80a-489f-8f06-7b4188803d03\") " pod="openstack/nova-scheduler-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.248745 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/70aefddd-4fff-4560-a534-52b0e9ea0f8f-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-fp7xt\" (UID: \"70aefddd-4fff-4560-a534-52b0e9ea0f8f\") " pod="openstack/dnsmasq-dns-845d6d6f59-fp7xt" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.255743 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eee7419-a80a-489f-8f06-7b4188803d03-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5eee7419-a80a-489f-8f06-7b4188803d03\") " pod="openstack/nova-scheduler-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.255844 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eee7419-a80a-489f-8f06-7b4188803d03-config-data\") pod \"nova-scheduler-0\" (UID: \"5eee7419-a80a-489f-8f06-7b4188803d03\") " pod="openstack/nova-scheduler-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.275575 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rltb7\" (UniqueName: \"kubernetes.io/projected/5eee7419-a80a-489f-8f06-7b4188803d03-kube-api-access-rltb7\") pod \"nova-scheduler-0\" (UID: \"5eee7419-a80a-489f-8f06-7b4188803d03\") " pod="openstack/nova-scheduler-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.295600 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.297087 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.302231 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.303499 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.305183 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.351443 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/70aefddd-4fff-4560-a534-52b0e9ea0f8f-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-fp7xt\" (UID: \"70aefddd-4fff-4560-a534-52b0e9ea0f8f\") " pod="openstack/dnsmasq-dns-845d6d6f59-fp7xt" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.351492 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70aefddd-4fff-4560-a534-52b0e9ea0f8f-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-fp7xt\" (UID: \"70aefddd-4fff-4560-a534-52b0e9ea0f8f\") " pod="openstack/dnsmasq-dns-845d6d6f59-fp7xt" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.351673 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/70aefddd-4fff-4560-a534-52b0e9ea0f8f-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-fp7xt\" (UID: \"70aefddd-4fff-4560-a534-52b0e9ea0f8f\") " pod="openstack/dnsmasq-dns-845d6d6f59-fp7xt" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.351709 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s28lr\" (UniqueName: \"kubernetes.io/projected/70aefddd-4fff-4560-a534-52b0e9ea0f8f-kube-api-access-s28lr\") pod \"dnsmasq-dns-845d6d6f59-fp7xt\" (UID: \"70aefddd-4fff-4560-a534-52b0e9ea0f8f\") " pod="openstack/dnsmasq-dns-845d6d6f59-fp7xt" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.351767 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/70aefddd-4fff-4560-a534-52b0e9ea0f8f-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-fp7xt\" (UID: \"70aefddd-4fff-4560-a534-52b0e9ea0f8f\") " pod="openstack/dnsmasq-dns-845d6d6f59-fp7xt" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.351792 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70aefddd-4fff-4560-a534-52b0e9ea0f8f-config\") pod \"dnsmasq-dns-845d6d6f59-fp7xt\" (UID: \"70aefddd-4fff-4560-a534-52b0e9ea0f8f\") " pod="openstack/dnsmasq-dns-845d6d6f59-fp7xt" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.352734 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70aefddd-4fff-4560-a534-52b0e9ea0f8f-config\") pod \"dnsmasq-dns-845d6d6f59-fp7xt\" (UID: \"70aefddd-4fff-4560-a534-52b0e9ea0f8f\") " pod="openstack/dnsmasq-dns-845d6d6f59-fp7xt" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.353868 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/70aefddd-4fff-4560-a534-52b0e9ea0f8f-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-fp7xt\" (UID: \"70aefddd-4fff-4560-a534-52b0e9ea0f8f\") " pod="openstack/dnsmasq-dns-845d6d6f59-fp7xt" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.354543 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70aefddd-4fff-4560-a534-52b0e9ea0f8f-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-fp7xt\" (UID: \"70aefddd-4fff-4560-a534-52b0e9ea0f8f\") " pod="openstack/dnsmasq-dns-845d6d6f59-fp7xt" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.356460 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/70aefddd-4fff-4560-a534-52b0e9ea0f8f-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-fp7xt\" 
(UID: \"70aefddd-4fff-4560-a534-52b0e9ea0f8f\") " pod="openstack/dnsmasq-dns-845d6d6f59-fp7xt" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.358786 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/70aefddd-4fff-4560-a534-52b0e9ea0f8f-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-fp7xt\" (UID: \"70aefddd-4fff-4560-a534-52b0e9ea0f8f\") " pod="openstack/dnsmasq-dns-845d6d6f59-fp7xt" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.399692 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s28lr\" (UniqueName: \"kubernetes.io/projected/70aefddd-4fff-4560-a534-52b0e9ea0f8f-kube-api-access-s28lr\") pod \"dnsmasq-dns-845d6d6f59-fp7xt\" (UID: \"70aefddd-4fff-4560-a534-52b0e9ea0f8f\") " pod="openstack/dnsmasq-dns-845d6d6f59-fp7xt" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.453864 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9480540d-4b2a-40ea-b63b-e695c8e0a1b5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9480540d-4b2a-40ea-b63b-e695c8e0a1b5\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.453904 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9480540d-4b2a-40ea-b63b-e695c8e0a1b5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9480540d-4b2a-40ea-b63b-e695c8e0a1b5\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.453928 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phs7g\" (UniqueName: \"kubernetes.io/projected/9480540d-4b2a-40ea-b63b-e695c8e0a1b5-kube-api-access-phs7g\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"9480540d-4b2a-40ea-b63b-e695c8e0a1b5\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.507894 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.541202 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-fp7xt" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.558726 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9480540d-4b2a-40ea-b63b-e695c8e0a1b5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9480540d-4b2a-40ea-b63b-e695c8e0a1b5\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.558791 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9480540d-4b2a-40ea-b63b-e695c8e0a1b5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9480540d-4b2a-40ea-b63b-e695c8e0a1b5\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.558820 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phs7g\" (UniqueName: \"kubernetes.io/projected/9480540d-4b2a-40ea-b63b-e695c8e0a1b5-kube-api-access-phs7g\") pod \"nova-cell1-novncproxy-0\" (UID: \"9480540d-4b2a-40ea-b63b-e695c8e0a1b5\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.571080 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9480540d-4b2a-40ea-b63b-e695c8e0a1b5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9480540d-4b2a-40ea-b63b-e695c8e0a1b5\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 
08:05:26.574787 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9480540d-4b2a-40ea-b63b-e695c8e0a1b5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9480540d-4b2a-40ea-b63b-e695c8e0a1b5\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.582976 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phs7g\" (UniqueName: \"kubernetes.io/projected/9480540d-4b2a-40ea-b63b-e695c8e0a1b5-kube-api-access-phs7g\") pod \"nova-cell1-novncproxy-0\" (UID: \"9480540d-4b2a-40ea-b63b-e695c8e0a1b5\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.636470 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.691171 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-bmknt"] Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.886309 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.925825 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.933241 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-bmknt" event={"ID":"a0884103-2ca9-41fa-94ab-19ce6ba49364","Type":"ContainerStarted","Data":"b59f2b7d73a0eadbda8361a6028dcfb86e7a5b6cfe928b35750c7dadea8fd40a"} Oct 09 08:05:26 crc kubenswrapper[4715]: I1009 08:05:26.936441 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b321002e-eb01-4e98-902e-7e3848db47a3","Type":"ContainerStarted","Data":"386dbaa291f4870c32ccad6d2a0a4cbc6a8ab874c418a5dfc07ec4064223e735"} Oct 09 08:05:26 crc 
kubenswrapper[4715]: W1009 08:05:26.942922 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4edd5c7_930b_4aa0_9c81_50ba417a5c89.slice/crio-057ac49bce4171b2a1741720bf48486d81b95ac1fd789242d90dcd99b27b7b0d WatchSource:0}: Error finding container 057ac49bce4171b2a1741720bf48486d81b95ac1fd789242d90dcd99b27b7b0d: Status 404 returned error can't find the container with id 057ac49bce4171b2a1741720bf48486d81b95ac1fd789242d90dcd99b27b7b0d Oct 09 08:05:27 crc kubenswrapper[4715]: I1009 08:05:27.032316 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4tzm2"] Oct 09 08:05:27 crc kubenswrapper[4715]: I1009 08:05:27.034244 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4tzm2" Oct 09 08:05:27 crc kubenswrapper[4715]: I1009 08:05:27.036698 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 09 08:05:27 crc kubenswrapper[4715]: I1009 08:05:27.037936 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 09 08:05:27 crc kubenswrapper[4715]: I1009 08:05:27.059870 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4tzm2"] Oct 09 08:05:27 crc kubenswrapper[4715]: I1009 08:05:27.085364 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02682196-5950-427c-908a-bb791173de68-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-4tzm2\" (UID: \"02682196-5950-427c-908a-bb791173de68\") " pod="openstack/nova-cell1-conductor-db-sync-4tzm2" Oct 09 08:05:27 crc kubenswrapper[4715]: I1009 08:05:27.085948 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-h5wq2\" (UniqueName: \"kubernetes.io/projected/02682196-5950-427c-908a-bb791173de68-kube-api-access-h5wq2\") pod \"nova-cell1-conductor-db-sync-4tzm2\" (UID: \"02682196-5950-427c-908a-bb791173de68\") " pod="openstack/nova-cell1-conductor-db-sync-4tzm2" Oct 09 08:05:27 crc kubenswrapper[4715]: I1009 08:05:27.086110 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02682196-5950-427c-908a-bb791173de68-scripts\") pod \"nova-cell1-conductor-db-sync-4tzm2\" (UID: \"02682196-5950-427c-908a-bb791173de68\") " pod="openstack/nova-cell1-conductor-db-sync-4tzm2" Oct 09 08:05:27 crc kubenswrapper[4715]: I1009 08:05:27.086170 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02682196-5950-427c-908a-bb791173de68-config-data\") pod \"nova-cell1-conductor-db-sync-4tzm2\" (UID: \"02682196-5950-427c-908a-bb791173de68\") " pod="openstack/nova-cell1-conductor-db-sync-4tzm2" Oct 09 08:05:27 crc kubenswrapper[4715]: I1009 08:05:27.099180 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 08:05:27 crc kubenswrapper[4715]: I1009 08:05:27.171169 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 09 08:05:27 crc kubenswrapper[4715]: I1009 08:05:27.187785 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02682196-5950-427c-908a-bb791173de68-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-4tzm2\" (UID: \"02682196-5950-427c-908a-bb791173de68\") " pod="openstack/nova-cell1-conductor-db-sync-4tzm2" Oct 09 08:05:27 crc kubenswrapper[4715]: I1009 08:05:27.187860 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5wq2\" (UniqueName: 
\"kubernetes.io/projected/02682196-5950-427c-908a-bb791173de68-kube-api-access-h5wq2\") pod \"nova-cell1-conductor-db-sync-4tzm2\" (UID: \"02682196-5950-427c-908a-bb791173de68\") " pod="openstack/nova-cell1-conductor-db-sync-4tzm2" Oct 09 08:05:27 crc kubenswrapper[4715]: I1009 08:05:27.188548 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02682196-5950-427c-908a-bb791173de68-scripts\") pod \"nova-cell1-conductor-db-sync-4tzm2\" (UID: \"02682196-5950-427c-908a-bb791173de68\") " pod="openstack/nova-cell1-conductor-db-sync-4tzm2" Oct 09 08:05:27 crc kubenswrapper[4715]: I1009 08:05:27.188595 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02682196-5950-427c-908a-bb791173de68-config-data\") pod \"nova-cell1-conductor-db-sync-4tzm2\" (UID: \"02682196-5950-427c-908a-bb791173de68\") " pod="openstack/nova-cell1-conductor-db-sync-4tzm2" Oct 09 08:05:27 crc kubenswrapper[4715]: I1009 08:05:27.197306 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02682196-5950-427c-908a-bb791173de68-scripts\") pod \"nova-cell1-conductor-db-sync-4tzm2\" (UID: \"02682196-5950-427c-908a-bb791173de68\") " pod="openstack/nova-cell1-conductor-db-sync-4tzm2" Oct 09 08:05:27 crc kubenswrapper[4715]: I1009 08:05:27.200212 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02682196-5950-427c-908a-bb791173de68-config-data\") pod \"nova-cell1-conductor-db-sync-4tzm2\" (UID: \"02682196-5950-427c-908a-bb791173de68\") " pod="openstack/nova-cell1-conductor-db-sync-4tzm2" Oct 09 08:05:27 crc kubenswrapper[4715]: I1009 08:05:27.206591 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5wq2\" (UniqueName: 
\"kubernetes.io/projected/02682196-5950-427c-908a-bb791173de68-kube-api-access-h5wq2\") pod \"nova-cell1-conductor-db-sync-4tzm2\" (UID: \"02682196-5950-427c-908a-bb791173de68\") " pod="openstack/nova-cell1-conductor-db-sync-4tzm2" Oct 09 08:05:27 crc kubenswrapper[4715]: I1009 08:05:27.211258 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-fp7xt"] Oct 09 08:05:27 crc kubenswrapper[4715]: I1009 08:05:27.233309 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02682196-5950-427c-908a-bb791173de68-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-4tzm2\" (UID: \"02682196-5950-427c-908a-bb791173de68\") " pod="openstack/nova-cell1-conductor-db-sync-4tzm2" Oct 09 08:05:27 crc kubenswrapper[4715]: I1009 08:05:27.434996 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4tzm2" Oct 09 08:05:27 crc kubenswrapper[4715]: I1009 08:05:27.951495 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9480540d-4b2a-40ea-b63b-e695c8e0a1b5","Type":"ContainerStarted","Data":"620cf1e6e43b698cd27faa2afe51ee7ef6693fd6deff707e1cd895229d14c29f"} Oct 09 08:05:27 crc kubenswrapper[4715]: I1009 08:05:27.959430 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-bmknt" event={"ID":"a0884103-2ca9-41fa-94ab-19ce6ba49364","Type":"ContainerStarted","Data":"38e59a25e7417c90e107247d7af630d72c110fe1aebdebd65edc704ec9d7a5ba"} Oct 09 08:05:27 crc kubenswrapper[4715]: I1009 08:05:27.962504 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b4edd5c7-930b-4aa0-9c81-50ba417a5c89","Type":"ContainerStarted","Data":"057ac49bce4171b2a1741720bf48486d81b95ac1fd789242d90dcd99b27b7b0d"} Oct 09 08:05:27 crc kubenswrapper[4715]: I1009 08:05:27.965414 4715 generic.go:334] 
"Generic (PLEG): container finished" podID="70aefddd-4fff-4560-a534-52b0e9ea0f8f" containerID="e51fe84727058b5333e0abf45df1fd9a1f9abdd61d81d89a25db14d36fb3e622" exitCode=0 Oct 09 08:05:27 crc kubenswrapper[4715]: I1009 08:05:27.965541 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-fp7xt" event={"ID":"70aefddd-4fff-4560-a534-52b0e9ea0f8f","Type":"ContainerDied","Data":"e51fe84727058b5333e0abf45df1fd9a1f9abdd61d81d89a25db14d36fb3e622"} Oct 09 08:05:27 crc kubenswrapper[4715]: I1009 08:05:27.965580 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-fp7xt" event={"ID":"70aefddd-4fff-4560-a534-52b0e9ea0f8f","Type":"ContainerStarted","Data":"0c66fe91164a08c5dd0e3e59dcb6d6a6dae99e25af7658e90f8b5c0a19eb4661"} Oct 09 08:05:27 crc kubenswrapper[4715]: I1009 08:05:27.976748 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5eee7419-a80a-489f-8f06-7b4188803d03","Type":"ContainerStarted","Data":"064c1a505cf6dd1135d38ad964360da231adc7ab70710b005399a6c8dd6e98db"} Oct 09 08:05:28 crc kubenswrapper[4715]: I1009 08:05:28.005717 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4tzm2"] Oct 09 08:05:28 crc kubenswrapper[4715]: I1009 08:05:28.052239 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-bmknt" podStartSLOduration=3.05221341 podStartE2EDuration="3.05221341s" podCreationTimestamp="2025-10-09 08:05:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:05:27.9814607 +0000 UTC m=+1158.674264718" watchObservedRunningTime="2025-10-09 08:05:28.05221341 +0000 UTC m=+1158.745017418" Oct 09 08:05:28 crc kubenswrapper[4715]: I1009 08:05:28.989332 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-conductor-db-sync-4tzm2" event={"ID":"02682196-5950-427c-908a-bb791173de68","Type":"ContainerStarted","Data":"8b9912c6ccfe6d6703f155d3ed91a94aa1ea0876f4422c1628c663a4cb20ff11"} Oct 09 08:05:28 crc kubenswrapper[4715]: I1009 08:05:28.991491 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-4tzm2" event={"ID":"02682196-5950-427c-908a-bb791173de68","Type":"ContainerStarted","Data":"e80cb482810767a1581aba151874126f6f885657b3c1dfd6fc1ac4631f0e9955"} Oct 09 08:05:28 crc kubenswrapper[4715]: I1009 08:05:28.996286 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-fp7xt" event={"ID":"70aefddd-4fff-4560-a534-52b0e9ea0f8f","Type":"ContainerStarted","Data":"1bf49bee2a619d42d47eb12fd200f320e269f3a75455f02b36faa685504a1013"} Oct 09 08:05:29 crc kubenswrapper[4715]: I1009 08:05:29.018382 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-4tzm2" podStartSLOduration=3.018362634 podStartE2EDuration="3.018362634s" podCreationTimestamp="2025-10-09 08:05:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:05:29.01200105 +0000 UTC m=+1159.704805068" watchObservedRunningTime="2025-10-09 08:05:29.018362634 +0000 UTC m=+1159.711166642" Oct 09 08:05:29 crc kubenswrapper[4715]: I1009 08:05:29.041380 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-845d6d6f59-fp7xt" podStartSLOduration=3.041361307 podStartE2EDuration="3.041361307s" podCreationTimestamp="2025-10-09 08:05:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:05:29.033326205 +0000 UTC m=+1159.726130213" watchObservedRunningTime="2025-10-09 08:05:29.041361307 +0000 UTC m=+1159.734165315" Oct 09 08:05:30 
crc kubenswrapper[4715]: I1009 08:05:30.005730 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-845d6d6f59-fp7xt" Oct 09 08:05:31 crc kubenswrapper[4715]: I1009 08:05:31.038158 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9480540d-4b2a-40ea-b63b-e695c8e0a1b5","Type":"ContainerStarted","Data":"f57ca25581bd4ae4ec705d3a73279094b06babbe39783df93d6f13176b690e48"} Oct 09 08:05:31 crc kubenswrapper[4715]: I1009 08:05:31.042857 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b4edd5c7-930b-4aa0-9c81-50ba417a5c89","Type":"ContainerStarted","Data":"995c9d74a9e72ca8e7f652a17b13e173f023ab0b2984ae07cc79762c31c9afe2"} Oct 09 08:05:31 crc kubenswrapper[4715]: I1009 08:05:31.046014 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b321002e-eb01-4e98-902e-7e3848db47a3","Type":"ContainerStarted","Data":"a0ea2d1fb74d6ffa353f561d3261ce341ddec99a826e95b64e2dbbe22e3aa520"} Oct 09 08:05:31 crc kubenswrapper[4715]: I1009 08:05:31.050256 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5eee7419-a80a-489f-8f06-7b4188803d03","Type":"ContainerStarted","Data":"971460d61879299bbc0fbcf3e3eca3c3b5e18b6f138a7ec7888acff4180083a9"} Oct 09 08:05:31 crc kubenswrapper[4715]: I1009 08:05:31.062983 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.7870515230000001 podStartE2EDuration="5.062968378s" podCreationTimestamp="2025-10-09 08:05:26 +0000 UTC" firstStartedPulling="2025-10-09 08:05:27.182796047 +0000 UTC m=+1157.875600055" lastFinishedPulling="2025-10-09 08:05:30.458712902 +0000 UTC m=+1161.151516910" observedRunningTime="2025-10-09 08:05:31.053979949 +0000 UTC m=+1161.746783957" watchObservedRunningTime="2025-10-09 08:05:31.062968378 +0000 UTC 
m=+1161.755772386" Oct 09 08:05:31 crc kubenswrapper[4715]: I1009 08:05:31.081263 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.7328798810000001 podStartE2EDuration="5.081247806s" podCreationTimestamp="2025-10-09 08:05:26 +0000 UTC" firstStartedPulling="2025-10-09 08:05:27.102307305 +0000 UTC m=+1157.795111313" lastFinishedPulling="2025-10-09 08:05:30.45067523 +0000 UTC m=+1161.143479238" observedRunningTime="2025-10-09 08:05:31.075720736 +0000 UTC m=+1161.768524744" watchObservedRunningTime="2025-10-09 08:05:31.081247806 +0000 UTC m=+1161.774051814" Oct 09 08:05:31 crc kubenswrapper[4715]: I1009 08:05:31.203595 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 09 08:05:31 crc kubenswrapper[4715]: I1009 08:05:31.226736 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 08:05:31 crc kubenswrapper[4715]: I1009 08:05:31.508696 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 09 08:05:31 crc kubenswrapper[4715]: I1009 08:05:31.637579 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 09 08:05:32 crc kubenswrapper[4715]: I1009 08:05:32.060201 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b4edd5c7-930b-4aa0-9c81-50ba417a5c89","Type":"ContainerStarted","Data":"58b8a1f690e8114e30439452d9ba226fb957e5fd5f64e40a4b85a538287cec2b"} Oct 09 08:05:32 crc kubenswrapper[4715]: I1009 08:05:32.060388 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b4edd5c7-930b-4aa0-9c81-50ba417a5c89" containerName="nova-metadata-log" containerID="cri-o://995c9d74a9e72ca8e7f652a17b13e173f023ab0b2984ae07cc79762c31c9afe2" gracePeriod=30 Oct 09 08:05:32 crc kubenswrapper[4715]: I1009 
08:05:32.060466 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b4edd5c7-930b-4aa0-9c81-50ba417a5c89" containerName="nova-metadata-metadata" containerID="cri-o://58b8a1f690e8114e30439452d9ba226fb957e5fd5f64e40a4b85a538287cec2b" gracePeriod=30 Oct 09 08:05:32 crc kubenswrapper[4715]: I1009 08:05:32.064026 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b321002e-eb01-4e98-902e-7e3848db47a3","Type":"ContainerStarted","Data":"ceb324ebe44a15b3b8c6819efdd0519a27f9737ffcafbca1335183e06e183b8f"} Oct 09 08:05:32 crc kubenswrapper[4715]: I1009 08:05:32.094151 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.588654571 podStartE2EDuration="7.094125447s" podCreationTimestamp="2025-10-09 08:05:25 +0000 UTC" firstStartedPulling="2025-10-09 08:05:26.949104867 +0000 UTC m=+1157.641908875" lastFinishedPulling="2025-10-09 08:05:30.454575733 +0000 UTC m=+1161.147379751" observedRunningTime="2025-10-09 08:05:32.081992747 +0000 UTC m=+1162.774796755" watchObservedRunningTime="2025-10-09 08:05:32.094125447 +0000 UTC m=+1162.786929455" Oct 09 08:05:32 crc kubenswrapper[4715]: I1009 08:05:32.100807 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.565828223 podStartE2EDuration="7.100789889s" podCreationTimestamp="2025-10-09 08:05:25 +0000 UTC" firstStartedPulling="2025-10-09 08:05:26.918165955 +0000 UTC m=+1157.610969963" lastFinishedPulling="2025-10-09 08:05:30.453127621 +0000 UTC m=+1161.145931629" observedRunningTime="2025-10-09 08:05:32.099708227 +0000 UTC m=+1162.792512265" watchObservedRunningTime="2025-10-09 08:05:32.100789889 +0000 UTC m=+1162.793593897" Oct 09 08:05:32 crc kubenswrapper[4715]: I1009 08:05:32.709520 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 08:05:32 crc kubenswrapper[4715]: I1009 08:05:32.815067 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bz55q\" (UniqueName: \"kubernetes.io/projected/b4edd5c7-930b-4aa0-9c81-50ba417a5c89-kube-api-access-bz55q\") pod \"b4edd5c7-930b-4aa0-9c81-50ba417a5c89\" (UID: \"b4edd5c7-930b-4aa0-9c81-50ba417a5c89\") " Oct 09 08:05:32 crc kubenswrapper[4715]: I1009 08:05:32.815169 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4edd5c7-930b-4aa0-9c81-50ba417a5c89-logs\") pod \"b4edd5c7-930b-4aa0-9c81-50ba417a5c89\" (UID: \"b4edd5c7-930b-4aa0-9c81-50ba417a5c89\") " Oct 09 08:05:32 crc kubenswrapper[4715]: I1009 08:05:32.815214 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4edd5c7-930b-4aa0-9c81-50ba417a5c89-config-data\") pod \"b4edd5c7-930b-4aa0-9c81-50ba417a5c89\" (UID: \"b4edd5c7-930b-4aa0-9c81-50ba417a5c89\") " Oct 09 08:05:32 crc kubenswrapper[4715]: I1009 08:05:32.815366 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4edd5c7-930b-4aa0-9c81-50ba417a5c89-combined-ca-bundle\") pod \"b4edd5c7-930b-4aa0-9c81-50ba417a5c89\" (UID: \"b4edd5c7-930b-4aa0-9c81-50ba417a5c89\") " Oct 09 08:05:32 crc kubenswrapper[4715]: I1009 08:05:32.816258 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4edd5c7-930b-4aa0-9c81-50ba417a5c89-logs" (OuterVolumeSpecName: "logs") pod "b4edd5c7-930b-4aa0-9c81-50ba417a5c89" (UID: "b4edd5c7-930b-4aa0-9c81-50ba417a5c89"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:05:32 crc kubenswrapper[4715]: I1009 08:05:32.816375 4715 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4edd5c7-930b-4aa0-9c81-50ba417a5c89-logs\") on node \"crc\" DevicePath \"\"" Oct 09 08:05:32 crc kubenswrapper[4715]: I1009 08:05:32.835844 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4edd5c7-930b-4aa0-9c81-50ba417a5c89-kube-api-access-bz55q" (OuterVolumeSpecName: "kube-api-access-bz55q") pod "b4edd5c7-930b-4aa0-9c81-50ba417a5c89" (UID: "b4edd5c7-930b-4aa0-9c81-50ba417a5c89"). InnerVolumeSpecName "kube-api-access-bz55q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:05:32 crc kubenswrapper[4715]: I1009 08:05:32.858543 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4edd5c7-930b-4aa0-9c81-50ba417a5c89-config-data" (OuterVolumeSpecName: "config-data") pod "b4edd5c7-930b-4aa0-9c81-50ba417a5c89" (UID: "b4edd5c7-930b-4aa0-9c81-50ba417a5c89"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:05:32 crc kubenswrapper[4715]: I1009 08:05:32.864932 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4edd5c7-930b-4aa0-9c81-50ba417a5c89-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b4edd5c7-930b-4aa0-9c81-50ba417a5c89" (UID: "b4edd5c7-930b-4aa0-9c81-50ba417a5c89"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:05:32 crc kubenswrapper[4715]: I1009 08:05:32.918666 4715 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4edd5c7-930b-4aa0-9c81-50ba417a5c89-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:05:32 crc kubenswrapper[4715]: I1009 08:05:32.918716 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bz55q\" (UniqueName: \"kubernetes.io/projected/b4edd5c7-930b-4aa0-9c81-50ba417a5c89-kube-api-access-bz55q\") on node \"crc\" DevicePath \"\"" Oct 09 08:05:32 crc kubenswrapper[4715]: I1009 08:05:32.918734 4715 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4edd5c7-930b-4aa0-9c81-50ba417a5c89-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.090457 4715 generic.go:334] "Generic (PLEG): container finished" podID="b4edd5c7-930b-4aa0-9c81-50ba417a5c89" containerID="58b8a1f690e8114e30439452d9ba226fb957e5fd5f64e40a4b85a538287cec2b" exitCode=0 Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.090489 4715 generic.go:334] "Generic (PLEG): container finished" podID="b4edd5c7-930b-4aa0-9c81-50ba417a5c89" containerID="995c9d74a9e72ca8e7f652a17b13e173f023ab0b2984ae07cc79762c31c9afe2" exitCode=143 Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.090529 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.090584 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b4edd5c7-930b-4aa0-9c81-50ba417a5c89","Type":"ContainerDied","Data":"58b8a1f690e8114e30439452d9ba226fb957e5fd5f64e40a4b85a538287cec2b"} Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.090616 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b4edd5c7-930b-4aa0-9c81-50ba417a5c89","Type":"ContainerDied","Data":"995c9d74a9e72ca8e7f652a17b13e173f023ab0b2984ae07cc79762c31c9afe2"} Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.090627 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b4edd5c7-930b-4aa0-9c81-50ba417a5c89","Type":"ContainerDied","Data":"057ac49bce4171b2a1741720bf48486d81b95ac1fd789242d90dcd99b27b7b0d"} Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.090640 4715 scope.go:117] "RemoveContainer" containerID="58b8a1f690e8114e30439452d9ba226fb957e5fd5f64e40a4b85a538287cec2b" Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.090811 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="9480540d-4b2a-40ea-b63b-e695c8e0a1b5" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://f57ca25581bd4ae4ec705d3a73279094b06babbe39783df93d6f13176b690e48" gracePeriod=30 Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.125702 4715 scope.go:117] "RemoveContainer" containerID="995c9d74a9e72ca8e7f652a17b13e173f023ab0b2984ae07cc79762c31c9afe2" Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.146885 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.154033 4715 scope.go:117] "RemoveContainer" 
containerID="58b8a1f690e8114e30439452d9ba226fb957e5fd5f64e40a4b85a538287cec2b" Oct 09 08:05:33 crc kubenswrapper[4715]: E1009 08:05:33.154461 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58b8a1f690e8114e30439452d9ba226fb957e5fd5f64e40a4b85a538287cec2b\": container with ID starting with 58b8a1f690e8114e30439452d9ba226fb957e5fd5f64e40a4b85a538287cec2b not found: ID does not exist" containerID="58b8a1f690e8114e30439452d9ba226fb957e5fd5f64e40a4b85a538287cec2b" Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.154492 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58b8a1f690e8114e30439452d9ba226fb957e5fd5f64e40a4b85a538287cec2b"} err="failed to get container status \"58b8a1f690e8114e30439452d9ba226fb957e5fd5f64e40a4b85a538287cec2b\": rpc error: code = NotFound desc = could not find container \"58b8a1f690e8114e30439452d9ba226fb957e5fd5f64e40a4b85a538287cec2b\": container with ID starting with 58b8a1f690e8114e30439452d9ba226fb957e5fd5f64e40a4b85a538287cec2b not found: ID does not exist" Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.154514 4715 scope.go:117] "RemoveContainer" containerID="995c9d74a9e72ca8e7f652a17b13e173f023ab0b2984ae07cc79762c31c9afe2" Oct 09 08:05:33 crc kubenswrapper[4715]: E1009 08:05:33.155030 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"995c9d74a9e72ca8e7f652a17b13e173f023ab0b2984ae07cc79762c31c9afe2\": container with ID starting with 995c9d74a9e72ca8e7f652a17b13e173f023ab0b2984ae07cc79762c31c9afe2 not found: ID does not exist" containerID="995c9d74a9e72ca8e7f652a17b13e173f023ab0b2984ae07cc79762c31c9afe2" Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.155057 4715 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"995c9d74a9e72ca8e7f652a17b13e173f023ab0b2984ae07cc79762c31c9afe2"} err="failed to get container status \"995c9d74a9e72ca8e7f652a17b13e173f023ab0b2984ae07cc79762c31c9afe2\": rpc error: code = NotFound desc = could not find container \"995c9d74a9e72ca8e7f652a17b13e173f023ab0b2984ae07cc79762c31c9afe2\": container with ID starting with 995c9d74a9e72ca8e7f652a17b13e173f023ab0b2984ae07cc79762c31c9afe2 not found: ID does not exist" Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.155070 4715 scope.go:117] "RemoveContainer" containerID="58b8a1f690e8114e30439452d9ba226fb957e5fd5f64e40a4b85a538287cec2b" Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.155252 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58b8a1f690e8114e30439452d9ba226fb957e5fd5f64e40a4b85a538287cec2b"} err="failed to get container status \"58b8a1f690e8114e30439452d9ba226fb957e5fd5f64e40a4b85a538287cec2b\": rpc error: code = NotFound desc = could not find container \"58b8a1f690e8114e30439452d9ba226fb957e5fd5f64e40a4b85a538287cec2b\": container with ID starting with 58b8a1f690e8114e30439452d9ba226fb957e5fd5f64e40a4b85a538287cec2b not found: ID does not exist" Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.155269 4715 scope.go:117] "RemoveContainer" containerID="995c9d74a9e72ca8e7f652a17b13e173f023ab0b2984ae07cc79762c31c9afe2" Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.155603 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"995c9d74a9e72ca8e7f652a17b13e173f023ab0b2984ae07cc79762c31c9afe2"} err="failed to get container status \"995c9d74a9e72ca8e7f652a17b13e173f023ab0b2984ae07cc79762c31c9afe2\": rpc error: code = NotFound desc = could not find container \"995c9d74a9e72ca8e7f652a17b13e173f023ab0b2984ae07cc79762c31c9afe2\": container with ID starting with 995c9d74a9e72ca8e7f652a17b13e173f023ab0b2984ae07cc79762c31c9afe2 not found: ID does not 
exist" Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.170254 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.177532 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 09 08:05:33 crc kubenswrapper[4715]: E1009 08:05:33.178000 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4edd5c7-930b-4aa0-9c81-50ba417a5c89" containerName="nova-metadata-metadata" Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.178014 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4edd5c7-930b-4aa0-9c81-50ba417a5c89" containerName="nova-metadata-metadata" Oct 09 08:05:33 crc kubenswrapper[4715]: E1009 08:05:33.178049 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4edd5c7-930b-4aa0-9c81-50ba417a5c89" containerName="nova-metadata-log" Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.178056 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4edd5c7-930b-4aa0-9c81-50ba417a5c89" containerName="nova-metadata-log" Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.178232 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4edd5c7-930b-4aa0-9c81-50ba417a5c89" containerName="nova-metadata-metadata" Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.178253 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4edd5c7-930b-4aa0-9c81-50ba417a5c89" containerName="nova-metadata-log" Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.179260 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.184081 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.184375 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.191132 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.228913 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a65cbc84-6036-46d8-b2fc-37d15b7c7fa8-logs\") pod \"nova-metadata-0\" (UID: \"a65cbc84-6036-46d8-b2fc-37d15b7c7fa8\") " pod="openstack/nova-metadata-0" Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.229033 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a65cbc84-6036-46d8-b2fc-37d15b7c7fa8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a65cbc84-6036-46d8-b2fc-37d15b7c7fa8\") " pod="openstack/nova-metadata-0" Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.229057 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a65cbc84-6036-46d8-b2fc-37d15b7c7fa8-config-data\") pod \"nova-metadata-0\" (UID: \"a65cbc84-6036-46d8-b2fc-37d15b7c7fa8\") " pod="openstack/nova-metadata-0" Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.229178 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a65cbc84-6036-46d8-b2fc-37d15b7c7fa8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"a65cbc84-6036-46d8-b2fc-37d15b7c7fa8\") " pod="openstack/nova-metadata-0" Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.229288 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhckw\" (UniqueName: \"kubernetes.io/projected/a65cbc84-6036-46d8-b2fc-37d15b7c7fa8-kube-api-access-vhckw\") pod \"nova-metadata-0\" (UID: \"a65cbc84-6036-46d8-b2fc-37d15b7c7fa8\") " pod="openstack/nova-metadata-0" Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.330844 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a65cbc84-6036-46d8-b2fc-37d15b7c7fa8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a65cbc84-6036-46d8-b2fc-37d15b7c7fa8\") " pod="openstack/nova-metadata-0" Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.331175 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhckw\" (UniqueName: \"kubernetes.io/projected/a65cbc84-6036-46d8-b2fc-37d15b7c7fa8-kube-api-access-vhckw\") pod \"nova-metadata-0\" (UID: \"a65cbc84-6036-46d8-b2fc-37d15b7c7fa8\") " pod="openstack/nova-metadata-0" Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.331229 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a65cbc84-6036-46d8-b2fc-37d15b7c7fa8-logs\") pod \"nova-metadata-0\" (UID: \"a65cbc84-6036-46d8-b2fc-37d15b7c7fa8\") " pod="openstack/nova-metadata-0" Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.331265 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a65cbc84-6036-46d8-b2fc-37d15b7c7fa8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a65cbc84-6036-46d8-b2fc-37d15b7c7fa8\") " pod="openstack/nova-metadata-0" Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 
08:05:33.331280 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a65cbc84-6036-46d8-b2fc-37d15b7c7fa8-config-data\") pod \"nova-metadata-0\" (UID: \"a65cbc84-6036-46d8-b2fc-37d15b7c7fa8\") " pod="openstack/nova-metadata-0"
Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.332467 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a65cbc84-6036-46d8-b2fc-37d15b7c7fa8-logs\") pod \"nova-metadata-0\" (UID: \"a65cbc84-6036-46d8-b2fc-37d15b7c7fa8\") " pod="openstack/nova-metadata-0"
Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.337121 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a65cbc84-6036-46d8-b2fc-37d15b7c7fa8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a65cbc84-6036-46d8-b2fc-37d15b7c7fa8\") " pod="openstack/nova-metadata-0"
Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.345995 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a65cbc84-6036-46d8-b2fc-37d15b7c7fa8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a65cbc84-6036-46d8-b2fc-37d15b7c7fa8\") " pod="openstack/nova-metadata-0"
Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.346501 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a65cbc84-6036-46d8-b2fc-37d15b7c7fa8-config-data\") pod \"nova-metadata-0\" (UID: \"a65cbc84-6036-46d8-b2fc-37d15b7c7fa8\") " pod="openstack/nova-metadata-0"
Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.350086 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhckw\" (UniqueName: \"kubernetes.io/projected/a65cbc84-6036-46d8-b2fc-37d15b7c7fa8-kube-api-access-vhckw\") pod \"nova-metadata-0\" (UID: \"a65cbc84-6036-46d8-b2fc-37d15b7c7fa8\") " pod="openstack/nova-metadata-0"
Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.509520 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 09 08:05:33 crc kubenswrapper[4715]: W1009 08:05:33.972841 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda65cbc84_6036_46d8_b2fc_37d15b7c7fa8.slice/crio-c30b53df774abda8de6495ff4e5e6fb4939703efa14b4d4c9deb69ca0f403607 WatchSource:0}: Error finding container c30b53df774abda8de6495ff4e5e6fb4939703efa14b4d4c9deb69ca0f403607: Status 404 returned error can't find the container with id c30b53df774abda8de6495ff4e5e6fb4939703efa14b4d4c9deb69ca0f403607
Oct 09 08:05:33 crc kubenswrapper[4715]: I1009 08:05:33.978045 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 09 08:05:34 crc kubenswrapper[4715]: I1009 08:05:34.099947 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a65cbc84-6036-46d8-b2fc-37d15b7c7fa8","Type":"ContainerStarted","Data":"c30b53df774abda8de6495ff4e5e6fb4939703efa14b4d4c9deb69ca0f403607"}
Oct 09 08:05:34 crc kubenswrapper[4715]: I1009 08:05:34.146847 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4edd5c7-930b-4aa0-9c81-50ba417a5c89" path="/var/lib/kubelet/pods/b4edd5c7-930b-4aa0-9c81-50ba417a5c89/volumes"
Oct 09 08:05:35 crc kubenswrapper[4715]: I1009 08:05:35.113038 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a65cbc84-6036-46d8-b2fc-37d15b7c7fa8","Type":"ContainerStarted","Data":"3a2063e4f36181d28f57da779c8ffab6cd0ad598633429c6c34fd25a33e8d972"}
Oct 09 08:05:35 crc kubenswrapper[4715]: I1009 08:05:35.113841 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a65cbc84-6036-46d8-b2fc-37d15b7c7fa8","Type":"ContainerStarted","Data":"b7c33db2200908179f27754fd5aabb749fb3a77d333a829d98d91afb171e8acc"}
Oct 09 08:05:35 crc kubenswrapper[4715]: I1009 08:05:35.135384 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.135367543 podStartE2EDuration="2.135367543s" podCreationTimestamp="2025-10-09 08:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:05:35.129560886 +0000 UTC m=+1165.822364904" watchObservedRunningTime="2025-10-09 08:05:35.135367543 +0000 UTC m=+1165.828171551"
Oct 09 08:05:36 crc kubenswrapper[4715]: I1009 08:05:36.134762 4715 generic.go:334] "Generic (PLEG): container finished" podID="a0884103-2ca9-41fa-94ab-19ce6ba49364" containerID="38e59a25e7417c90e107247d7af630d72c110fe1aebdebd65edc704ec9d7a5ba" exitCode=0
Oct 09 08:05:36 crc kubenswrapper[4715]: I1009 08:05:36.136776 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-bmknt" event={"ID":"a0884103-2ca9-41fa-94ab-19ce6ba49364","Type":"ContainerDied","Data":"38e59a25e7417c90e107247d7af630d72c110fe1aebdebd65edc704ec9d7a5ba"}
Oct 09 08:05:36 crc kubenswrapper[4715]: I1009 08:05:36.143960 4715 generic.go:334] "Generic (PLEG): container finished" podID="02682196-5950-427c-908a-bb791173de68" containerID="8b9912c6ccfe6d6703f155d3ed91a94aa1ea0876f4422c1628c663a4cb20ff11" exitCode=0
Oct 09 08:05:36 crc kubenswrapper[4715]: I1009 08:05:36.164528 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-4tzm2" event={"ID":"02682196-5950-427c-908a-bb791173de68","Type":"ContainerDied","Data":"8b9912c6ccfe6d6703f155d3ed91a94aa1ea0876f4422c1628c663a4cb20ff11"}
Oct 09 08:05:36 crc kubenswrapper[4715]: I1009 08:05:36.181108 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 09 08:05:36 crc kubenswrapper[4715]: I1009 08:05:36.182066 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 09 08:05:36 crc kubenswrapper[4715]: I1009 08:05:36.509365 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Oct 09 08:05:36 crc kubenswrapper[4715]: I1009 08:05:36.541378 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Oct 09 08:05:36 crc kubenswrapper[4715]: I1009 08:05:36.543573 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-845d6d6f59-fp7xt"
Oct 09 08:05:36 crc kubenswrapper[4715]: I1009 08:05:36.637014 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-5kjqv"]
Oct 09 08:05:36 crc kubenswrapper[4715]: I1009 08:05:36.637605 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5784cf869f-5kjqv" podUID="37af9a61-ef4d-476b-978d-ca780888d042" containerName="dnsmasq-dns" containerID="cri-o://edc3751b9ca29e525c3cfbf9387b6fcbcca2c661e93118c7132634c8144fb48c" gracePeriod=10
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.156993 4715 generic.go:334] "Generic (PLEG): container finished" podID="37af9a61-ef4d-476b-978d-ca780888d042" containerID="edc3751b9ca29e525c3cfbf9387b6fcbcca2c661e93118c7132634c8144fb48c" exitCode=0
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.158124 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-5kjqv" event={"ID":"37af9a61-ef4d-476b-978d-ca780888d042","Type":"ContainerDied","Data":"edc3751b9ca29e525c3cfbf9387b6fcbcca2c661e93118c7132634c8144fb48c"}
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.158170 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-5kjqv" event={"ID":"37af9a61-ef4d-476b-978d-ca780888d042","Type":"ContainerDied","Data":"bc056ea56d8632c6e21762e19a1d549401cb76159772e85196da8a5db2a320bb"}
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.158187 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc056ea56d8632c6e21762e19a1d549401cb76159772e85196da8a5db2a320bb"
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.191597 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-5kjqv"
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.200677 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.263044 4715 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b321002e-eb01-4e98-902e-7e3848db47a3" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.187:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.263445 4715 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b321002e-eb01-4e98-902e-7e3848db47a3" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.187:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.305722 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37af9a61-ef4d-476b-978d-ca780888d042-ovsdbserver-nb\") pod \"37af9a61-ef4d-476b-978d-ca780888d042\" (UID: \"37af9a61-ef4d-476b-978d-ca780888d042\") "
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.305799 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37af9a61-ef4d-476b-978d-ca780888d042-dns-svc\") pod \"37af9a61-ef4d-476b-978d-ca780888d042\" (UID: \"37af9a61-ef4d-476b-978d-ca780888d042\") "
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.305828 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37af9a61-ef4d-476b-978d-ca780888d042-dns-swift-storage-0\") pod \"37af9a61-ef4d-476b-978d-ca780888d042\" (UID: \"37af9a61-ef4d-476b-978d-ca780888d042\") "
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.305868 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmvcn\" (UniqueName: \"kubernetes.io/projected/37af9a61-ef4d-476b-978d-ca780888d042-kube-api-access-fmvcn\") pod \"37af9a61-ef4d-476b-978d-ca780888d042\" (UID: \"37af9a61-ef4d-476b-978d-ca780888d042\") "
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.306007 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37af9a61-ef4d-476b-978d-ca780888d042-config\") pod \"37af9a61-ef4d-476b-978d-ca780888d042\" (UID: \"37af9a61-ef4d-476b-978d-ca780888d042\") "
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.306149 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37af9a61-ef4d-476b-978d-ca780888d042-ovsdbserver-sb\") pod \"37af9a61-ef4d-476b-978d-ca780888d042\" (UID: \"37af9a61-ef4d-476b-978d-ca780888d042\") "
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.365136 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37af9a61-ef4d-476b-978d-ca780888d042-kube-api-access-fmvcn" (OuterVolumeSpecName: "kube-api-access-fmvcn") pod "37af9a61-ef4d-476b-978d-ca780888d042" (UID: "37af9a61-ef4d-476b-978d-ca780888d042"). InnerVolumeSpecName "kube-api-access-fmvcn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.372585 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37af9a61-ef4d-476b-978d-ca780888d042-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "37af9a61-ef4d-476b-978d-ca780888d042" (UID: "37af9a61-ef4d-476b-978d-ca780888d042"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.409246 4715 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37af9a61-ef4d-476b-978d-ca780888d042-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.409323 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmvcn\" (UniqueName: \"kubernetes.io/projected/37af9a61-ef4d-476b-978d-ca780888d042-kube-api-access-fmvcn\") on node \"crc\" DevicePath \"\""
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.418468 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37af9a61-ef4d-476b-978d-ca780888d042-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "37af9a61-ef4d-476b-978d-ca780888d042" (UID: "37af9a61-ef4d-476b-978d-ca780888d042"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.423784 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37af9a61-ef4d-476b-978d-ca780888d042-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "37af9a61-ef4d-476b-978d-ca780888d042" (UID: "37af9a61-ef4d-476b-978d-ca780888d042"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.447219 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37af9a61-ef4d-476b-978d-ca780888d042-config" (OuterVolumeSpecName: "config") pod "37af9a61-ef4d-476b-978d-ca780888d042" (UID: "37af9a61-ef4d-476b-978d-ca780888d042"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.454179 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37af9a61-ef4d-476b-978d-ca780888d042-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "37af9a61-ef4d-476b-978d-ca780888d042" (UID: "37af9a61-ef4d-476b-978d-ca780888d042"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.511392 4715 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37af9a61-ef4d-476b-978d-ca780888d042-config\") on node \"crc\" DevicePath \"\""
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.511453 4715 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37af9a61-ef4d-476b-978d-ca780888d042-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.511475 4715 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37af9a61-ef4d-476b-978d-ca780888d042-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.511486 4715 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37af9a61-ef4d-476b-978d-ca780888d042-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.607315 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-bmknt"
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.612292 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rv55p\" (UniqueName: \"kubernetes.io/projected/a0884103-2ca9-41fa-94ab-19ce6ba49364-kube-api-access-rv55p\") pod \"a0884103-2ca9-41fa-94ab-19ce6ba49364\" (UID: \"a0884103-2ca9-41fa-94ab-19ce6ba49364\") "
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.612537 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0884103-2ca9-41fa-94ab-19ce6ba49364-combined-ca-bundle\") pod \"a0884103-2ca9-41fa-94ab-19ce6ba49364\" (UID: \"a0884103-2ca9-41fa-94ab-19ce6ba49364\") "
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.612651 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0884103-2ca9-41fa-94ab-19ce6ba49364-config-data\") pod \"a0884103-2ca9-41fa-94ab-19ce6ba49364\" (UID: \"a0884103-2ca9-41fa-94ab-19ce6ba49364\") "
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.612678 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0884103-2ca9-41fa-94ab-19ce6ba49364-scripts\") pod \"a0884103-2ca9-41fa-94ab-19ce6ba49364\" (UID: \"a0884103-2ca9-41fa-94ab-19ce6ba49364\") "
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.624050 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0884103-2ca9-41fa-94ab-19ce6ba49364-scripts" (OuterVolumeSpecName: "scripts") pod "a0884103-2ca9-41fa-94ab-19ce6ba49364" (UID: "a0884103-2ca9-41fa-94ab-19ce6ba49364"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.624066 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0884103-2ca9-41fa-94ab-19ce6ba49364-kube-api-access-rv55p" (OuterVolumeSpecName: "kube-api-access-rv55p") pod "a0884103-2ca9-41fa-94ab-19ce6ba49364" (UID: "a0884103-2ca9-41fa-94ab-19ce6ba49364"). InnerVolumeSpecName "kube-api-access-rv55p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.662752 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0884103-2ca9-41fa-94ab-19ce6ba49364-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0884103-2ca9-41fa-94ab-19ce6ba49364" (UID: "a0884103-2ca9-41fa-94ab-19ce6ba49364"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.674832 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0884103-2ca9-41fa-94ab-19ce6ba49364-config-data" (OuterVolumeSpecName: "config-data") pod "a0884103-2ca9-41fa-94ab-19ce6ba49364" (UID: "a0884103-2ca9-41fa-94ab-19ce6ba49364"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.686248 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4tzm2"
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.715576 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5wq2\" (UniqueName: \"kubernetes.io/projected/02682196-5950-427c-908a-bb791173de68-kube-api-access-h5wq2\") pod \"02682196-5950-427c-908a-bb791173de68\" (UID: \"02682196-5950-427c-908a-bb791173de68\") "
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.715744 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02682196-5950-427c-908a-bb791173de68-combined-ca-bundle\") pod \"02682196-5950-427c-908a-bb791173de68\" (UID: \"02682196-5950-427c-908a-bb791173de68\") "
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.715809 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02682196-5950-427c-908a-bb791173de68-scripts\") pod \"02682196-5950-427c-908a-bb791173de68\" (UID: \"02682196-5950-427c-908a-bb791173de68\") "
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.715850 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02682196-5950-427c-908a-bb791173de68-config-data\") pod \"02682196-5950-427c-908a-bb791173de68\" (UID: \"02682196-5950-427c-908a-bb791173de68\") "
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.716319 4715 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0884103-2ca9-41fa-94ab-19ce6ba49364-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.716337 4715 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0884103-2ca9-41fa-94ab-19ce6ba49364-config-data\") on node \"crc\" DevicePath \"\""
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.716345 4715 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0884103-2ca9-41fa-94ab-19ce6ba49364-scripts\") on node \"crc\" DevicePath \"\""
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.716355 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rv55p\" (UniqueName: \"kubernetes.io/projected/a0884103-2ca9-41fa-94ab-19ce6ba49364-kube-api-access-rv55p\") on node \"crc\" DevicePath \"\""
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.719845 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02682196-5950-427c-908a-bb791173de68-kube-api-access-h5wq2" (OuterVolumeSpecName: "kube-api-access-h5wq2") pod "02682196-5950-427c-908a-bb791173de68" (UID: "02682196-5950-427c-908a-bb791173de68"). InnerVolumeSpecName "kube-api-access-h5wq2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.734242 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02682196-5950-427c-908a-bb791173de68-scripts" (OuterVolumeSpecName: "scripts") pod "02682196-5950-427c-908a-bb791173de68" (UID: "02682196-5950-427c-908a-bb791173de68"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.759819 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02682196-5950-427c-908a-bb791173de68-config-data" (OuterVolumeSpecName: "config-data") pod "02682196-5950-427c-908a-bb791173de68" (UID: "02682196-5950-427c-908a-bb791173de68"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.763713 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02682196-5950-427c-908a-bb791173de68-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02682196-5950-427c-908a-bb791173de68" (UID: "02682196-5950-427c-908a-bb791173de68"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.818596 4715 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02682196-5950-427c-908a-bb791173de68-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.818640 4715 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02682196-5950-427c-908a-bb791173de68-scripts\") on node \"crc\" DevicePath \"\""
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.818656 4715 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02682196-5950-427c-908a-bb791173de68-config-data\") on node \"crc\" DevicePath \"\""
Oct 09 08:05:37 crc kubenswrapper[4715]: I1009 08:05:37.818670 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5wq2\" (UniqueName: \"kubernetes.io/projected/02682196-5950-427c-908a-bb791173de68-kube-api-access-h5wq2\") on node \"crc\" DevicePath \"\""
Oct 09 08:05:38 crc kubenswrapper[4715]: I1009 08:05:38.173406 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4tzm2"
Oct 09 08:05:38 crc kubenswrapper[4715]: I1009 08:05:38.174372 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-4tzm2" event={"ID":"02682196-5950-427c-908a-bb791173de68","Type":"ContainerDied","Data":"e80cb482810767a1581aba151874126f6f885657b3c1dfd6fc1ac4631f0e9955"}
Oct 09 08:05:38 crc kubenswrapper[4715]: I1009 08:05:38.174453 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e80cb482810767a1581aba151874126f6f885657b3c1dfd6fc1ac4631f0e9955"
Oct 09 08:05:38 crc kubenswrapper[4715]: I1009 08:05:38.182251 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-bmknt" event={"ID":"a0884103-2ca9-41fa-94ab-19ce6ba49364","Type":"ContainerDied","Data":"b59f2b7d73a0eadbda8361a6028dcfb86e7a5b6cfe928b35750c7dadea8fd40a"}
Oct 09 08:05:38 crc kubenswrapper[4715]: I1009 08:05:38.182293 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b59f2b7d73a0eadbda8361a6028dcfb86e7a5b6cfe928b35750c7dadea8fd40a"
Oct 09 08:05:38 crc kubenswrapper[4715]: I1009 08:05:38.182360 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-5kjqv"
Oct 09 08:05:38 crc kubenswrapper[4715]: I1009 08:05:38.182380 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-bmknt"
Oct 09 08:05:38 crc kubenswrapper[4715]: I1009 08:05:38.228897 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-5kjqv"]
Oct 09 08:05:38 crc kubenswrapper[4715]: I1009 08:05:38.300494 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-5kjqv"]
Oct 09 08:05:38 crc kubenswrapper[4715]: I1009 08:05:38.313752 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Oct 09 08:05:38 crc kubenswrapper[4715]: E1009 08:05:38.314378 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37af9a61-ef4d-476b-978d-ca780888d042" containerName="init"
Oct 09 08:05:38 crc kubenswrapper[4715]: I1009 08:05:38.314483 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="37af9a61-ef4d-476b-978d-ca780888d042" containerName="init"
Oct 09 08:05:38 crc kubenswrapper[4715]: E1009 08:05:38.314553 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0884103-2ca9-41fa-94ab-19ce6ba49364" containerName="nova-manage"
Oct 09 08:05:38 crc kubenswrapper[4715]: I1009 08:05:38.314610 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0884103-2ca9-41fa-94ab-19ce6ba49364" containerName="nova-manage"
Oct 09 08:05:38 crc kubenswrapper[4715]: E1009 08:05:38.314665 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02682196-5950-427c-908a-bb791173de68" containerName="nova-cell1-conductor-db-sync"
Oct 09 08:05:38 crc kubenswrapper[4715]: I1009 08:05:38.314714 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="02682196-5950-427c-908a-bb791173de68" containerName="nova-cell1-conductor-db-sync"
Oct 09 08:05:38 crc kubenswrapper[4715]: E1009 08:05:38.314779 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37af9a61-ef4d-476b-978d-ca780888d042" containerName="dnsmasq-dns"
Oct 09 08:05:38 crc kubenswrapper[4715]: I1009 08:05:38.314834 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="37af9a61-ef4d-476b-978d-ca780888d042" containerName="dnsmasq-dns"
Oct 09 08:05:38 crc kubenswrapper[4715]: I1009 08:05:38.315122 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0884103-2ca9-41fa-94ab-19ce6ba49364" containerName="nova-manage"
Oct 09 08:05:38 crc kubenswrapper[4715]: I1009 08:05:38.315212 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="37af9a61-ef4d-476b-978d-ca780888d042" containerName="dnsmasq-dns"
Oct 09 08:05:38 crc kubenswrapper[4715]: I1009 08:05:38.315297 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="02682196-5950-427c-908a-bb791173de68" containerName="nova-cell1-conductor-db-sync"
Oct 09 08:05:38 crc kubenswrapper[4715]: I1009 08:05:38.316137 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Oct 09 08:05:38 crc kubenswrapper[4715]: I1009 08:05:38.323376 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Oct 09 08:05:38 crc kubenswrapper[4715]: I1009 08:05:38.326122 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2c8b523-67fd-40d3-9e2b-eb68619f60bc-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f2c8b523-67fd-40d3-9e2b-eb68619f60bc\") " pod="openstack/nova-cell1-conductor-0"
Oct 09 08:05:38 crc kubenswrapper[4715]: I1009 08:05:38.326281 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2c8b523-67fd-40d3-9e2b-eb68619f60bc-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f2c8b523-67fd-40d3-9e2b-eb68619f60bc\") " pod="openstack/nova-cell1-conductor-0"
Oct 09 08:05:38 crc kubenswrapper[4715]: I1009 08:05:38.321644 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Oct 09 08:05:38 crc kubenswrapper[4715]: I1009 08:05:38.326475 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdf7p\" (UniqueName: \"kubernetes.io/projected/f2c8b523-67fd-40d3-9e2b-eb68619f60bc-kube-api-access-pdf7p\") pod \"nova-cell1-conductor-0\" (UID: \"f2c8b523-67fd-40d3-9e2b-eb68619f60bc\") " pod="openstack/nova-cell1-conductor-0"
Oct 09 08:05:38 crc kubenswrapper[4715]: I1009 08:05:38.428649 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2c8b523-67fd-40d3-9e2b-eb68619f60bc-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f2c8b523-67fd-40d3-9e2b-eb68619f60bc\") " pod="openstack/nova-cell1-conductor-0"
Oct 09 08:05:38 crc kubenswrapper[4715]: I1009 08:05:38.428741 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2c8b523-67fd-40d3-9e2b-eb68619f60bc-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f2c8b523-67fd-40d3-9e2b-eb68619f60bc\") " pod="openstack/nova-cell1-conductor-0"
Oct 09 08:05:38 crc kubenswrapper[4715]: I1009 08:05:38.428776 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdf7p\" (UniqueName: \"kubernetes.io/projected/f2c8b523-67fd-40d3-9e2b-eb68619f60bc-kube-api-access-pdf7p\") pod \"nova-cell1-conductor-0\" (UID: \"f2c8b523-67fd-40d3-9e2b-eb68619f60bc\") " pod="openstack/nova-cell1-conductor-0"
Oct 09 08:05:38 crc kubenswrapper[4715]: I1009 08:05:38.437918 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2c8b523-67fd-40d3-9e2b-eb68619f60bc-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f2c8b523-67fd-40d3-9e2b-eb68619f60bc\") " pod="openstack/nova-cell1-conductor-0"
Oct 09 08:05:38 crc kubenswrapper[4715]: I1009 08:05:38.438056 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2c8b523-67fd-40d3-9e2b-eb68619f60bc-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f2c8b523-67fd-40d3-9e2b-eb68619f60bc\") " pod="openstack/nova-cell1-conductor-0"
Oct 09 08:05:38 crc kubenswrapper[4715]: I1009 08:05:38.464736 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdf7p\" (UniqueName: \"kubernetes.io/projected/f2c8b523-67fd-40d3-9e2b-eb68619f60bc-kube-api-access-pdf7p\") pod \"nova-cell1-conductor-0\" (UID: \"f2c8b523-67fd-40d3-9e2b-eb68619f60bc\") " pod="openstack/nova-cell1-conductor-0"
Oct 09 08:05:38 crc kubenswrapper[4715]: I1009 08:05:38.471779 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 09 08:05:38 crc kubenswrapper[4715]: I1009 08:05:38.512517 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Oct 09 08:05:38 crc kubenswrapper[4715]: I1009 08:05:38.513606 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Oct 09 08:05:38 crc kubenswrapper[4715]: I1009 08:05:38.522491 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 09 08:05:38 crc kubenswrapper[4715]: I1009 08:05:38.523328 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b321002e-eb01-4e98-902e-7e3848db47a3" containerName="nova-api-api" containerID="cri-o://ceb324ebe44a15b3b8c6819efdd0519a27f9737ffcafbca1335183e06e183b8f" gracePeriod=30
Oct 09 08:05:38 crc kubenswrapper[4715]: I1009 08:05:38.522994 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b321002e-eb01-4e98-902e-7e3848db47a3" containerName="nova-api-log" containerID="cri-o://a0ea2d1fb74d6ffa353f561d3261ce341ddec99a826e95b64e2dbbe22e3aa520" gracePeriod=30
Oct 09 08:05:38 crc kubenswrapper[4715]: I1009 08:05:38.549918 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 09 08:05:38 crc kubenswrapper[4715]: I1009 08:05:38.648815 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Oct 09 08:05:39 crc kubenswrapper[4715]: I1009 08:05:39.197178 4715 generic.go:334] "Generic (PLEG): container finished" podID="b321002e-eb01-4e98-902e-7e3848db47a3" containerID="a0ea2d1fb74d6ffa353f561d3261ce341ddec99a826e95b64e2dbbe22e3aa520" exitCode=143
Oct 09 08:05:39 crc kubenswrapper[4715]: I1009 08:05:39.197363 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b321002e-eb01-4e98-902e-7e3848db47a3","Type":"ContainerDied","Data":"a0ea2d1fb74d6ffa353f561d3261ce341ddec99a826e95b64e2dbbe22e3aa520"}
Oct 09 08:05:39 crc kubenswrapper[4715]: I1009 08:05:39.197551 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="5eee7419-a80a-489f-8f06-7b4188803d03" containerName="nova-scheduler-scheduler" containerID="cri-o://971460d61879299bbc0fbcf3e3eca3c3b5e18b6f138a7ec7888acff4180083a9" gracePeriod=30
Oct 09 08:05:39 crc kubenswrapper[4715]: I1009 08:05:39.309392 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Oct 09 08:05:40 crc kubenswrapper[4715]: I1009 08:05:40.148339 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37af9a61-ef4d-476b-978d-ca780888d042" path="/var/lib/kubelet/pods/37af9a61-ef4d-476b-978d-ca780888d042/volumes"
Oct 09 08:05:40 crc kubenswrapper[4715]: I1009 08:05:40.207826 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f2c8b523-67fd-40d3-9e2b-eb68619f60bc","Type":"ContainerStarted","Data":"ccfaa6c56bb7c8593a2d819e07a3b89fad0aa5b8293928970d8eea72a12ea980"}
Oct 09 08:05:40 crc kubenswrapper[4715]: I1009 08:05:40.208647 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f2c8b523-67fd-40d3-9e2b-eb68619f60bc","Type":"ContainerStarted","Data":"1ede77122c7388b0c47fe53ef94daf36596e20f009d4e74e09f1afe051850297"}
Oct 09 08:05:40 crc kubenswrapper[4715]: I1009 08:05:40.208128 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a65cbc84-6036-46d8-b2fc-37d15b7c7fa8" containerName="nova-metadata-metadata" containerID="cri-o://3a2063e4f36181d28f57da779c8ffab6cd0ad598633429c6c34fd25a33e8d972" gracePeriod=30
Oct 09 08:05:40 crc kubenswrapper[4715]: I1009 08:05:40.207931 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a65cbc84-6036-46d8-b2fc-37d15b7c7fa8" containerName="nova-metadata-log" containerID="cri-o://b7c33db2200908179f27754fd5aabb749fb3a77d333a829d98d91afb171e8acc" gracePeriod=30
Oct 09 08:05:40 crc kubenswrapper[4715]: I1009 08:05:40.233279 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.233259773 podStartE2EDuration="2.233259773s" podCreationTimestamp="2025-10-09 08:05:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:05:40.226013274 +0000 UTC m=+1170.918817282" watchObservedRunningTime="2025-10-09 08:05:40.233259773 +0000 UTC m=+1170.926063781"
Oct 09 08:05:40 crc kubenswrapper[4715]: I1009 08:05:40.776155 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 09 08:05:40 crc kubenswrapper[4715]: I1009 08:05:40.978225 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a65cbc84-6036-46d8-b2fc-37d15b7c7fa8-config-data\") pod \"a65cbc84-6036-46d8-b2fc-37d15b7c7fa8\" (UID: \"a65cbc84-6036-46d8-b2fc-37d15b7c7fa8\") "
Oct 09 08:05:40 crc kubenswrapper[4715]: I1009 08:05:40.978979 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhckw\" (UniqueName: \"kubernetes.io/projected/a65cbc84-6036-46d8-b2fc-37d15b7c7fa8-kube-api-access-vhckw\") pod \"a65cbc84-6036-46d8-b2fc-37d15b7c7fa8\" (UID: \"a65cbc84-6036-46d8-b2fc-37d15b7c7fa8\") "
Oct 09 08:05:40 crc kubenswrapper[4715]: I1009 08:05:40.979025 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a65cbc84-6036-46d8-b2fc-37d15b7c7fa8-logs\") pod \"a65cbc84-6036-46d8-b2fc-37d15b7c7fa8\" (UID: \"a65cbc84-6036-46d8-b2fc-37d15b7c7fa8\") "
Oct 09 08:05:40 crc kubenswrapper[4715]: I1009 08:05:40.979071 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a65cbc84-6036-46d8-b2fc-37d15b7c7fa8-nova-metadata-tls-certs\") pod \"a65cbc84-6036-46d8-b2fc-37d15b7c7fa8\" (UID: \"a65cbc84-6036-46d8-b2fc-37d15b7c7fa8\") "
Oct 09 08:05:40 crc kubenswrapper[4715]: I1009 08:05:40.979248 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a65cbc84-6036-46d8-b2fc-37d15b7c7fa8-combined-ca-bundle\") pod \"a65cbc84-6036-46d8-b2fc-37d15b7c7fa8\" (UID: \"a65cbc84-6036-46d8-b2fc-37d15b7c7fa8\") "
Oct 09 08:05:40 crc kubenswrapper[4715]: I1009 08:05:40.979542 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume
"kubernetes.io/empty-dir/a65cbc84-6036-46d8-b2fc-37d15b7c7fa8-logs" (OuterVolumeSpecName: "logs") pod "a65cbc84-6036-46d8-b2fc-37d15b7c7fa8" (UID: "a65cbc84-6036-46d8-b2fc-37d15b7c7fa8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:05:40 crc kubenswrapper[4715]: I1009 08:05:40.980222 4715 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a65cbc84-6036-46d8-b2fc-37d15b7c7fa8-logs\") on node \"crc\" DevicePath \"\"" Oct 09 08:05:40 crc kubenswrapper[4715]: I1009 08:05:40.994012 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a65cbc84-6036-46d8-b2fc-37d15b7c7fa8-kube-api-access-vhckw" (OuterVolumeSpecName: "kube-api-access-vhckw") pod "a65cbc84-6036-46d8-b2fc-37d15b7c7fa8" (UID: "a65cbc84-6036-46d8-b2fc-37d15b7c7fa8"). InnerVolumeSpecName "kube-api-access-vhckw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.016890 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a65cbc84-6036-46d8-b2fc-37d15b7c7fa8-config-data" (OuterVolumeSpecName: "config-data") pod "a65cbc84-6036-46d8-b2fc-37d15b7c7fa8" (UID: "a65cbc84-6036-46d8-b2fc-37d15b7c7fa8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.027333 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a65cbc84-6036-46d8-b2fc-37d15b7c7fa8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a65cbc84-6036-46d8-b2fc-37d15b7c7fa8" (UID: "a65cbc84-6036-46d8-b2fc-37d15b7c7fa8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.051315 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a65cbc84-6036-46d8-b2fc-37d15b7c7fa8-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "a65cbc84-6036-46d8-b2fc-37d15b7c7fa8" (UID: "a65cbc84-6036-46d8-b2fc-37d15b7c7fa8"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.081580 4715 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a65cbc84-6036-46d8-b2fc-37d15b7c7fa8-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.081632 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhckw\" (UniqueName: \"kubernetes.io/projected/a65cbc84-6036-46d8-b2fc-37d15b7c7fa8-kube-api-access-vhckw\") on node \"crc\" DevicePath \"\"" Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.081646 4715 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a65cbc84-6036-46d8-b2fc-37d15b7c7fa8-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.081658 4715 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a65cbc84-6036-46d8-b2fc-37d15b7c7fa8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.220742 4715 generic.go:334] "Generic (PLEG): container finished" podID="a65cbc84-6036-46d8-b2fc-37d15b7c7fa8" containerID="3a2063e4f36181d28f57da779c8ffab6cd0ad598633429c6c34fd25a33e8d972" exitCode=0 Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.220783 4715 generic.go:334] "Generic (PLEG): container finished" 
podID="a65cbc84-6036-46d8-b2fc-37d15b7c7fa8" containerID="b7c33db2200908179f27754fd5aabb749fb3a77d333a829d98d91afb171e8acc" exitCode=143 Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.220814 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.220847 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a65cbc84-6036-46d8-b2fc-37d15b7c7fa8","Type":"ContainerDied","Data":"3a2063e4f36181d28f57da779c8ffab6cd0ad598633429c6c34fd25a33e8d972"} Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.220903 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a65cbc84-6036-46d8-b2fc-37d15b7c7fa8","Type":"ContainerDied","Data":"b7c33db2200908179f27754fd5aabb749fb3a77d333a829d98d91afb171e8acc"} Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.220918 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a65cbc84-6036-46d8-b2fc-37d15b7c7fa8","Type":"ContainerDied","Data":"c30b53df774abda8de6495ff4e5e6fb4939703efa14b4d4c9deb69ca0f403607"} Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.220938 4715 scope.go:117] "RemoveContainer" containerID="3a2063e4f36181d28f57da779c8ffab6cd0ad598633429c6c34fd25a33e8d972" Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.221714 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.257000 4715 scope.go:117] "RemoveContainer" containerID="b7c33db2200908179f27754fd5aabb749fb3a77d333a829d98d91afb171e8acc" Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.262512 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.272101 4715 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/nova-metadata-0"] Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.282816 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 09 08:05:41 crc kubenswrapper[4715]: E1009 08:05:41.283244 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a65cbc84-6036-46d8-b2fc-37d15b7c7fa8" containerName="nova-metadata-metadata" Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.283262 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="a65cbc84-6036-46d8-b2fc-37d15b7c7fa8" containerName="nova-metadata-metadata" Oct 09 08:05:41 crc kubenswrapper[4715]: E1009 08:05:41.283296 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a65cbc84-6036-46d8-b2fc-37d15b7c7fa8" containerName="nova-metadata-log" Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.283304 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="a65cbc84-6036-46d8-b2fc-37d15b7c7fa8" containerName="nova-metadata-log" Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.283492 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="a65cbc84-6036-46d8-b2fc-37d15b7c7fa8" containerName="nova-metadata-log" Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.283508 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="a65cbc84-6036-46d8-b2fc-37d15b7c7fa8" containerName="nova-metadata-metadata" Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.284582 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.289989 4715 scope.go:117] "RemoveContainer" containerID="3a2063e4f36181d28f57da779c8ffab6cd0ad598633429c6c34fd25a33e8d972" Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.290345 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 09 08:05:41 crc kubenswrapper[4715]: E1009 08:05:41.290562 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a2063e4f36181d28f57da779c8ffab6cd0ad598633429c6c34fd25a33e8d972\": container with ID starting with 3a2063e4f36181d28f57da779c8ffab6cd0ad598633429c6c34fd25a33e8d972 not found: ID does not exist" containerID="3a2063e4f36181d28f57da779c8ffab6cd0ad598633429c6c34fd25a33e8d972" Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.290623 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a2063e4f36181d28f57da779c8ffab6cd0ad598633429c6c34fd25a33e8d972"} err="failed to get container status \"3a2063e4f36181d28f57da779c8ffab6cd0ad598633429c6c34fd25a33e8d972\": rpc error: code = NotFound desc = could not find container \"3a2063e4f36181d28f57da779c8ffab6cd0ad598633429c6c34fd25a33e8d972\": container with ID starting with 3a2063e4f36181d28f57da779c8ffab6cd0ad598633429c6c34fd25a33e8d972 not found: ID does not exist" Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.290647 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.290656 4715 scope.go:117] "RemoveContainer" containerID="b7c33db2200908179f27754fd5aabb749fb3a77d333a829d98d91afb171e8acc" Oct 09 08:05:41 crc kubenswrapper[4715]: E1009 08:05:41.291045 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"b7c33db2200908179f27754fd5aabb749fb3a77d333a829d98d91afb171e8acc\": container with ID starting with b7c33db2200908179f27754fd5aabb749fb3a77d333a829d98d91afb171e8acc not found: ID does not exist" containerID="b7c33db2200908179f27754fd5aabb749fb3a77d333a829d98d91afb171e8acc" Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.291086 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7c33db2200908179f27754fd5aabb749fb3a77d333a829d98d91afb171e8acc"} err="failed to get container status \"b7c33db2200908179f27754fd5aabb749fb3a77d333a829d98d91afb171e8acc\": rpc error: code = NotFound desc = could not find container \"b7c33db2200908179f27754fd5aabb749fb3a77d333a829d98d91afb171e8acc\": container with ID starting with b7c33db2200908179f27754fd5aabb749fb3a77d333a829d98d91afb171e8acc not found: ID does not exist" Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.291106 4715 scope.go:117] "RemoveContainer" containerID="3a2063e4f36181d28f57da779c8ffab6cd0ad598633429c6c34fd25a33e8d972" Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.291338 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a2063e4f36181d28f57da779c8ffab6cd0ad598633429c6c34fd25a33e8d972"} err="failed to get container status \"3a2063e4f36181d28f57da779c8ffab6cd0ad598633429c6c34fd25a33e8d972\": rpc error: code = NotFound desc = could not find container \"3a2063e4f36181d28f57da779c8ffab6cd0ad598633429c6c34fd25a33e8d972\": container with ID starting with 3a2063e4f36181d28f57da779c8ffab6cd0ad598633429c6c34fd25a33e8d972 not found: ID does not exist" Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.291368 4715 scope.go:117] "RemoveContainer" containerID="b7c33db2200908179f27754fd5aabb749fb3a77d333a829d98d91afb171e8acc" Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.291606 4715 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b7c33db2200908179f27754fd5aabb749fb3a77d333a829d98d91afb171e8acc"} err="failed to get container status \"b7c33db2200908179f27754fd5aabb749fb3a77d333a829d98d91afb171e8acc\": rpc error: code = NotFound desc = could not find container \"b7c33db2200908179f27754fd5aabb749fb3a77d333a829d98d91afb171e8acc\": container with ID starting with b7c33db2200908179f27754fd5aabb749fb3a77d333a829d98d91afb171e8acc not found: ID does not exist" Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.301671 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.386240 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv5n2\" (UniqueName: \"kubernetes.io/projected/377c1e43-3538-413a-9144-85708016acca-kube-api-access-kv5n2\") pod \"nova-metadata-0\" (UID: \"377c1e43-3538-413a-9144-85708016acca\") " pod="openstack/nova-metadata-0" Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.386341 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/377c1e43-3538-413a-9144-85708016acca-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"377c1e43-3538-413a-9144-85708016acca\") " pod="openstack/nova-metadata-0" Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.386496 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/377c1e43-3538-413a-9144-85708016acca-config-data\") pod \"nova-metadata-0\" (UID: \"377c1e43-3538-413a-9144-85708016acca\") " pod="openstack/nova-metadata-0" Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.386609 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/377c1e43-3538-413a-9144-85708016acca-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"377c1e43-3538-413a-9144-85708016acca\") " pod="openstack/nova-metadata-0" Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.386767 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/377c1e43-3538-413a-9144-85708016acca-logs\") pod \"nova-metadata-0\" (UID: \"377c1e43-3538-413a-9144-85708016acca\") " pod="openstack/nova-metadata-0" Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.488519 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/377c1e43-3538-413a-9144-85708016acca-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"377c1e43-3538-413a-9144-85708016acca\") " pod="openstack/nova-metadata-0" Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.488582 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/377c1e43-3538-413a-9144-85708016acca-config-data\") pod \"nova-metadata-0\" (UID: \"377c1e43-3538-413a-9144-85708016acca\") " pod="openstack/nova-metadata-0" Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.488617 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/377c1e43-3538-413a-9144-85708016acca-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"377c1e43-3538-413a-9144-85708016acca\") " pod="openstack/nova-metadata-0" Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.488669 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/377c1e43-3538-413a-9144-85708016acca-logs\") pod \"nova-metadata-0\" (UID: \"377c1e43-3538-413a-9144-85708016acca\") " pod="openstack/nova-metadata-0" 
Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.488703 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv5n2\" (UniqueName: \"kubernetes.io/projected/377c1e43-3538-413a-9144-85708016acca-kube-api-access-kv5n2\") pod \"nova-metadata-0\" (UID: \"377c1e43-3538-413a-9144-85708016acca\") " pod="openstack/nova-metadata-0" Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.489560 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/377c1e43-3538-413a-9144-85708016acca-logs\") pod \"nova-metadata-0\" (UID: \"377c1e43-3538-413a-9144-85708016acca\") " pod="openstack/nova-metadata-0" Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.494402 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/377c1e43-3538-413a-9144-85708016acca-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"377c1e43-3538-413a-9144-85708016acca\") " pod="openstack/nova-metadata-0" Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.494478 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/377c1e43-3538-413a-9144-85708016acca-config-data\") pod \"nova-metadata-0\" (UID: \"377c1e43-3538-413a-9144-85708016acca\") " pod="openstack/nova-metadata-0" Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.494629 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/377c1e43-3538-413a-9144-85708016acca-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"377c1e43-3538-413a-9144-85708016acca\") " pod="openstack/nova-metadata-0" Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.504979 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv5n2\" (UniqueName: 
\"kubernetes.io/projected/377c1e43-3538-413a-9144-85708016acca-kube-api-access-kv5n2\") pod \"nova-metadata-0\" (UID: \"377c1e43-3538-413a-9144-85708016acca\") " pod="openstack/nova-metadata-0" Oct 09 08:05:41 crc kubenswrapper[4715]: E1009 08:05:41.514191 4715 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="971460d61879299bbc0fbcf3e3eca3c3b5e18b6f138a7ec7888acff4180083a9" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 09 08:05:41 crc kubenswrapper[4715]: E1009 08:05:41.515840 4715 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="971460d61879299bbc0fbcf3e3eca3c3b5e18b6f138a7ec7888acff4180083a9" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 09 08:05:41 crc kubenswrapper[4715]: E1009 08:05:41.517377 4715 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="971460d61879299bbc0fbcf3e3eca3c3b5e18b6f138a7ec7888acff4180083a9" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 09 08:05:41 crc kubenswrapper[4715]: E1009 08:05:41.517411 4715 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="5eee7419-a80a-489f-8f06-7b4188803d03" containerName="nova-scheduler-scheduler" Oct 09 08:05:41 crc kubenswrapper[4715]: I1009 08:05:41.610638 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 08:05:42 crc kubenswrapper[4715]: I1009 08:05:42.108367 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 08:05:42 crc kubenswrapper[4715]: I1009 08:05:42.148410 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a65cbc84-6036-46d8-b2fc-37d15b7c7fa8" path="/var/lib/kubelet/pods/a65cbc84-6036-46d8-b2fc-37d15b7c7fa8/volumes" Oct 09 08:05:42 crc kubenswrapper[4715]: I1009 08:05:42.236261 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"377c1e43-3538-413a-9144-85708016acca","Type":"ContainerStarted","Data":"656cfdd1759beac29f25cd577a904da4f2144cbd0a77ae52506738ce3d4bbd7b"} Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.150904 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.247576 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"377c1e43-3538-413a-9144-85708016acca","Type":"ContainerStarted","Data":"31a196ecb9a173d9bf5b6e1b74be6a671c02066e244a9ec03203da2a418a7125"} Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.248681 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"377c1e43-3538-413a-9144-85708016acca","Type":"ContainerStarted","Data":"325df8c6bd97ac42f43bdca1af25e4217b0820abd4078d6e324420c50bda4fec"} Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.249477 4715 generic.go:334] "Generic (PLEG): container finished" podID="5eee7419-a80a-489f-8f06-7b4188803d03" containerID="971460d61879299bbc0fbcf3e3eca3c3b5e18b6f138a7ec7888acff4180083a9" exitCode=0 Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.249710 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"5eee7419-a80a-489f-8f06-7b4188803d03","Type":"ContainerDied","Data":"971460d61879299bbc0fbcf3e3eca3c3b5e18b6f138a7ec7888acff4180083a9"} Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.249807 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5eee7419-a80a-489f-8f06-7b4188803d03","Type":"ContainerDied","Data":"064c1a505cf6dd1135d38ad964360da231adc7ab70710b005399a6c8dd6e98db"} Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.249903 4715 scope.go:117] "RemoveContainer" containerID="971460d61879299bbc0fbcf3e3eca3c3b5e18b6f138a7ec7888acff4180083a9" Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.250070 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.257655 4715 generic.go:334] "Generic (PLEG): container finished" podID="b321002e-eb01-4e98-902e-7e3848db47a3" containerID="ceb324ebe44a15b3b8c6819efdd0519a27f9737ffcafbca1335183e06e183b8f" exitCode=0 Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.257695 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b321002e-eb01-4e98-902e-7e3848db47a3","Type":"ContainerDied","Data":"ceb324ebe44a15b3b8c6819efdd0519a27f9737ffcafbca1335183e06e183b8f"} Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.280981 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.281733 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.281718659 podStartE2EDuration="2.281718659s" podCreationTimestamp="2025-10-09 08:05:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:05:43.264908474 +0000 UTC m=+1173.957712492" watchObservedRunningTime="2025-10-09 08:05:43.281718659 +0000 UTC m=+1173.974522657" Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.286187 4715 scope.go:117] "RemoveContainer" containerID="971460d61879299bbc0fbcf3e3eca3c3b5e18b6f138a7ec7888acff4180083a9" Oct 09 08:05:43 crc kubenswrapper[4715]: E1009 08:05:43.286630 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"971460d61879299bbc0fbcf3e3eca3c3b5e18b6f138a7ec7888acff4180083a9\": container with ID starting with 971460d61879299bbc0fbcf3e3eca3c3b5e18b6f138a7ec7888acff4180083a9 not found: ID does not exist" containerID="971460d61879299bbc0fbcf3e3eca3c3b5e18b6f138a7ec7888acff4180083a9" Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.286663 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"971460d61879299bbc0fbcf3e3eca3c3b5e18b6f138a7ec7888acff4180083a9"} err="failed to get container status \"971460d61879299bbc0fbcf3e3eca3c3b5e18b6f138a7ec7888acff4180083a9\": rpc error: code = NotFound desc = could not find container \"971460d61879299bbc0fbcf3e3eca3c3b5e18b6f138a7ec7888acff4180083a9\": container with ID starting with 971460d61879299bbc0fbcf3e3eca3c3b5e18b6f138a7ec7888acff4180083a9 not found: ID does not exist" Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.323474 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-rltb7\" (UniqueName: \"kubernetes.io/projected/5eee7419-a80a-489f-8f06-7b4188803d03-kube-api-access-rltb7\") pod \"5eee7419-a80a-489f-8f06-7b4188803d03\" (UID: \"5eee7419-a80a-489f-8f06-7b4188803d03\") " Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.324342 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eee7419-a80a-489f-8f06-7b4188803d03-combined-ca-bundle\") pod \"5eee7419-a80a-489f-8f06-7b4188803d03\" (UID: \"5eee7419-a80a-489f-8f06-7b4188803d03\") " Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.324624 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b321002e-eb01-4e98-902e-7e3848db47a3-combined-ca-bundle\") pod \"b321002e-eb01-4e98-902e-7e3848db47a3\" (UID: \"b321002e-eb01-4e98-902e-7e3848db47a3\") " Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.324668 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eee7419-a80a-489f-8f06-7b4188803d03-config-data\") pod \"5eee7419-a80a-489f-8f06-7b4188803d03\" (UID: \"5eee7419-a80a-489f-8f06-7b4188803d03\") " Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.330349 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5eee7419-a80a-489f-8f06-7b4188803d03-kube-api-access-rltb7" (OuterVolumeSpecName: "kube-api-access-rltb7") pod "5eee7419-a80a-489f-8f06-7b4188803d03" (UID: "5eee7419-a80a-489f-8f06-7b4188803d03"). InnerVolumeSpecName "kube-api-access-rltb7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.349884 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5eee7419-a80a-489f-8f06-7b4188803d03-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5eee7419-a80a-489f-8f06-7b4188803d03" (UID: "5eee7419-a80a-489f-8f06-7b4188803d03"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.354671 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5eee7419-a80a-489f-8f06-7b4188803d03-config-data" (OuterVolumeSpecName: "config-data") pod "5eee7419-a80a-489f-8f06-7b4188803d03" (UID: "5eee7419-a80a-489f-8f06-7b4188803d03"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.361230 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b321002e-eb01-4e98-902e-7e3848db47a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b321002e-eb01-4e98-902e-7e3848db47a3" (UID: "b321002e-eb01-4e98-902e-7e3848db47a3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.426852 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b321002e-eb01-4e98-902e-7e3848db47a3-config-data\") pod \"b321002e-eb01-4e98-902e-7e3848db47a3\" (UID: \"b321002e-eb01-4e98-902e-7e3848db47a3\") " Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.426930 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrqfw\" (UniqueName: \"kubernetes.io/projected/b321002e-eb01-4e98-902e-7e3848db47a3-kube-api-access-qrqfw\") pod \"b321002e-eb01-4e98-902e-7e3848db47a3\" (UID: \"b321002e-eb01-4e98-902e-7e3848db47a3\") " Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.427228 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b321002e-eb01-4e98-902e-7e3848db47a3-logs\") pod \"b321002e-eb01-4e98-902e-7e3848db47a3\" (UID: \"b321002e-eb01-4e98-902e-7e3848db47a3\") " Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.427695 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b321002e-eb01-4e98-902e-7e3848db47a3-logs" (OuterVolumeSpecName: "logs") pod "b321002e-eb01-4e98-902e-7e3848db47a3" (UID: "b321002e-eb01-4e98-902e-7e3848db47a3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.427768 4715 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b321002e-eb01-4e98-902e-7e3848db47a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.427890 4715 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eee7419-a80a-489f-8f06-7b4188803d03-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.427967 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rltb7\" (UniqueName: \"kubernetes.io/projected/5eee7419-a80a-489f-8f06-7b4188803d03-kube-api-access-rltb7\") on node \"crc\" DevicePath \"\"" Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.428043 4715 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eee7419-a80a-489f-8f06-7b4188803d03-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.431484 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b321002e-eb01-4e98-902e-7e3848db47a3-kube-api-access-qrqfw" (OuterVolumeSpecName: "kube-api-access-qrqfw") pod "b321002e-eb01-4e98-902e-7e3848db47a3" (UID: "b321002e-eb01-4e98-902e-7e3848db47a3"). InnerVolumeSpecName "kube-api-access-qrqfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.454684 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b321002e-eb01-4e98-902e-7e3848db47a3-config-data" (OuterVolumeSpecName: "config-data") pod "b321002e-eb01-4e98-902e-7e3848db47a3" (UID: "b321002e-eb01-4e98-902e-7e3848db47a3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.529641 4715 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b321002e-eb01-4e98-902e-7e3848db47a3-logs\") on node \"crc\" DevicePath \"\"" Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.529682 4715 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b321002e-eb01-4e98-902e-7e3848db47a3-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.529692 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrqfw\" (UniqueName: \"kubernetes.io/projected/b321002e-eb01-4e98-902e-7e3848db47a3-kube-api-access-qrqfw\") on node \"crc\" DevicePath \"\"" Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.586502 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.596332 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.610779 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 08:05:43 crc kubenswrapper[4715]: E1009 08:05:43.611250 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b321002e-eb01-4e98-902e-7e3848db47a3" containerName="nova-api-api" Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.611275 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="b321002e-eb01-4e98-902e-7e3848db47a3" containerName="nova-api-api" Oct 09 08:05:43 crc kubenswrapper[4715]: E1009 08:05:43.611293 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eee7419-a80a-489f-8f06-7b4188803d03" containerName="nova-scheduler-scheduler" Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.611302 4715 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="5eee7419-a80a-489f-8f06-7b4188803d03" containerName="nova-scheduler-scheduler" Oct 09 08:05:43 crc kubenswrapper[4715]: E1009 08:05:43.611348 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b321002e-eb01-4e98-902e-7e3848db47a3" containerName="nova-api-log" Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.611356 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="b321002e-eb01-4e98-902e-7e3848db47a3" containerName="nova-api-log" Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.611580 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="b321002e-eb01-4e98-902e-7e3848db47a3" containerName="nova-api-api" Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.611604 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="b321002e-eb01-4e98-902e-7e3848db47a3" containerName="nova-api-log" Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.611621 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eee7419-a80a-489f-8f06-7b4188803d03" containerName="nova-scheduler-scheduler" Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.612386 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.615484 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.622219 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.640331 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6f4acd0-b524-4d1c-a245-6683e14aec4c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e6f4acd0-b524-4d1c-a245-6683e14aec4c\") " pod="openstack/nova-scheduler-0" Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.640898 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6f4acd0-b524-4d1c-a245-6683e14aec4c-config-data\") pod \"nova-scheduler-0\" (UID: \"e6f4acd0-b524-4d1c-a245-6683e14aec4c\") " pod="openstack/nova-scheduler-0" Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.640930 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86p4d\" (UniqueName: \"kubernetes.io/projected/e6f4acd0-b524-4d1c-a245-6683e14aec4c-kube-api-access-86p4d\") pod \"nova-scheduler-0\" (UID: \"e6f4acd0-b524-4d1c-a245-6683e14aec4c\") " pod="openstack/nova-scheduler-0" Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.743567 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6f4acd0-b524-4d1c-a245-6683e14aec4c-config-data\") pod \"nova-scheduler-0\" (UID: \"e6f4acd0-b524-4d1c-a245-6683e14aec4c\") " pod="openstack/nova-scheduler-0" Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.743688 4715 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-86p4d\" (UniqueName: \"kubernetes.io/projected/e6f4acd0-b524-4d1c-a245-6683e14aec4c-kube-api-access-86p4d\") pod \"nova-scheduler-0\" (UID: \"e6f4acd0-b524-4d1c-a245-6683e14aec4c\") " pod="openstack/nova-scheduler-0" Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.744010 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6f4acd0-b524-4d1c-a245-6683e14aec4c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e6f4acd0-b524-4d1c-a245-6683e14aec4c\") " pod="openstack/nova-scheduler-0" Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.749036 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6f4acd0-b524-4d1c-a245-6683e14aec4c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e6f4acd0-b524-4d1c-a245-6683e14aec4c\") " pod="openstack/nova-scheduler-0" Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.749051 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6f4acd0-b524-4d1c-a245-6683e14aec4c-config-data\") pod \"nova-scheduler-0\" (UID: \"e6f4acd0-b524-4d1c-a245-6683e14aec4c\") " pod="openstack/nova-scheduler-0" Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.763860 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86p4d\" (UniqueName: \"kubernetes.io/projected/e6f4acd0-b524-4d1c-a245-6683e14aec4c-kube-api-access-86p4d\") pod \"nova-scheduler-0\" (UID: \"e6f4acd0-b524-4d1c-a245-6683e14aec4c\") " pod="openstack/nova-scheduler-0" Oct 09 08:05:43 crc kubenswrapper[4715]: I1009 08:05:43.935522 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 08:05:44 crc kubenswrapper[4715]: I1009 08:05:44.158733 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5eee7419-a80a-489f-8f06-7b4188803d03" path="/var/lib/kubelet/pods/5eee7419-a80a-489f-8f06-7b4188803d03/volumes" Oct 09 08:05:44 crc kubenswrapper[4715]: I1009 08:05:44.268835 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 09 08:05:44 crc kubenswrapper[4715]: I1009 08:05:44.268822 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b321002e-eb01-4e98-902e-7e3848db47a3","Type":"ContainerDied","Data":"386dbaa291f4870c32ccad6d2a0a4cbc6a8ab874c418a5dfc07ec4064223e735"} Oct 09 08:05:44 crc kubenswrapper[4715]: I1009 08:05:44.269014 4715 scope.go:117] "RemoveContainer" containerID="ceb324ebe44a15b3b8c6819efdd0519a27f9737ffcafbca1335183e06e183b8f" Oct 09 08:05:44 crc kubenswrapper[4715]: I1009 08:05:44.294480 4715 scope.go:117] "RemoveContainer" containerID="a0ea2d1fb74d6ffa353f561d3261ce341ddec99a826e95b64e2dbbe22e3aa520" Oct 09 08:05:44 crc kubenswrapper[4715]: I1009 08:05:44.301155 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 09 08:05:44 crc kubenswrapper[4715]: I1009 08:05:44.312723 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 09 08:05:44 crc kubenswrapper[4715]: I1009 08:05:44.320570 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 09 08:05:44 crc kubenswrapper[4715]: I1009 08:05:44.325001 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 09 08:05:44 crc kubenswrapper[4715]: I1009 08:05:44.327731 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 09 08:05:44 crc kubenswrapper[4715]: I1009 08:05:44.338890 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 09 08:05:44 crc kubenswrapper[4715]: I1009 08:05:44.359204 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05e9f66e-c78b-4506-b4d1-93c82402efdc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"05e9f66e-c78b-4506-b4d1-93c82402efdc\") " pod="openstack/nova-api-0" Oct 09 08:05:44 crc kubenswrapper[4715]: I1009 08:05:44.359273 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mksdf\" (UniqueName: \"kubernetes.io/projected/05e9f66e-c78b-4506-b4d1-93c82402efdc-kube-api-access-mksdf\") pod \"nova-api-0\" (UID: \"05e9f66e-c78b-4506-b4d1-93c82402efdc\") " pod="openstack/nova-api-0" Oct 09 08:05:44 crc kubenswrapper[4715]: I1009 08:05:44.359392 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05e9f66e-c78b-4506-b4d1-93c82402efdc-config-data\") pod \"nova-api-0\" (UID: \"05e9f66e-c78b-4506-b4d1-93c82402efdc\") " pod="openstack/nova-api-0" Oct 09 08:05:44 crc kubenswrapper[4715]: I1009 08:05:44.359547 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05e9f66e-c78b-4506-b4d1-93c82402efdc-logs\") pod \"nova-api-0\" (UID: \"05e9f66e-c78b-4506-b4d1-93c82402efdc\") " pod="openstack/nova-api-0" Oct 09 08:05:44 crc kubenswrapper[4715]: W1009 08:05:44.381721 4715 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6f4acd0_b524_4d1c_a245_6683e14aec4c.slice/crio-c6ae8ceb02c321c068fbee9480e434a44e606b4cc616d2116af9092cc444a0fd WatchSource:0}: Error finding container c6ae8ceb02c321c068fbee9480e434a44e606b4cc616d2116af9092cc444a0fd: Status 404 returned error can't find the container with id c6ae8ceb02c321c068fbee9480e434a44e606b4cc616d2116af9092cc444a0fd Oct 09 08:05:44 crc kubenswrapper[4715]: I1009 08:05:44.386961 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 08:05:44 crc kubenswrapper[4715]: I1009 08:05:44.460069 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05e9f66e-c78b-4506-b4d1-93c82402efdc-config-data\") pod \"nova-api-0\" (UID: \"05e9f66e-c78b-4506-b4d1-93c82402efdc\") " pod="openstack/nova-api-0" Oct 09 08:05:44 crc kubenswrapper[4715]: I1009 08:05:44.460138 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05e9f66e-c78b-4506-b4d1-93c82402efdc-logs\") pod \"nova-api-0\" (UID: \"05e9f66e-c78b-4506-b4d1-93c82402efdc\") " pod="openstack/nova-api-0" Oct 09 08:05:44 crc kubenswrapper[4715]: I1009 08:05:44.460204 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05e9f66e-c78b-4506-b4d1-93c82402efdc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"05e9f66e-c78b-4506-b4d1-93c82402efdc\") " pod="openstack/nova-api-0" Oct 09 08:05:44 crc kubenswrapper[4715]: I1009 08:05:44.460232 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mksdf\" (UniqueName: \"kubernetes.io/projected/05e9f66e-c78b-4506-b4d1-93c82402efdc-kube-api-access-mksdf\") pod \"nova-api-0\" (UID: \"05e9f66e-c78b-4506-b4d1-93c82402efdc\") " pod="openstack/nova-api-0" Oct 09 08:05:44 crc 
kubenswrapper[4715]: I1009 08:05:44.461008 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05e9f66e-c78b-4506-b4d1-93c82402efdc-logs\") pod \"nova-api-0\" (UID: \"05e9f66e-c78b-4506-b4d1-93c82402efdc\") " pod="openstack/nova-api-0" Oct 09 08:05:44 crc kubenswrapper[4715]: I1009 08:05:44.465963 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05e9f66e-c78b-4506-b4d1-93c82402efdc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"05e9f66e-c78b-4506-b4d1-93c82402efdc\") " pod="openstack/nova-api-0" Oct 09 08:05:44 crc kubenswrapper[4715]: I1009 08:05:44.466000 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05e9f66e-c78b-4506-b4d1-93c82402efdc-config-data\") pod \"nova-api-0\" (UID: \"05e9f66e-c78b-4506-b4d1-93c82402efdc\") " pod="openstack/nova-api-0" Oct 09 08:05:44 crc kubenswrapper[4715]: I1009 08:05:44.477178 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mksdf\" (UniqueName: \"kubernetes.io/projected/05e9f66e-c78b-4506-b4d1-93c82402efdc-kube-api-access-mksdf\") pod \"nova-api-0\" (UID: \"05e9f66e-c78b-4506-b4d1-93c82402efdc\") " pod="openstack/nova-api-0" Oct 09 08:05:44 crc kubenswrapper[4715]: I1009 08:05:44.564011 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 09 08:05:45 crc kubenswrapper[4715]: W1009 08:05:45.019835 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05e9f66e_c78b_4506_b4d1_93c82402efdc.slice/crio-0431e6f11bae145045463e3a59df81f14b8994e473b3d88d68b9ca691242cc6c WatchSource:0}: Error finding container 0431e6f11bae145045463e3a59df81f14b8994e473b3d88d68b9ca691242cc6c: Status 404 returned error can't find the container with id 0431e6f11bae145045463e3a59df81f14b8994e473b3d88d68b9ca691242cc6c Oct 09 08:05:45 crc kubenswrapper[4715]: I1009 08:05:45.020681 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 09 08:05:45 crc kubenswrapper[4715]: I1009 08:05:45.280624 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e6f4acd0-b524-4d1c-a245-6683e14aec4c","Type":"ContainerStarted","Data":"424f0e7059f488402a98735f472450d172266c66091300b7c178e878de33a243"} Oct 09 08:05:45 crc kubenswrapper[4715]: I1009 08:05:45.280671 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e6f4acd0-b524-4d1c-a245-6683e14aec4c","Type":"ContainerStarted","Data":"c6ae8ceb02c321c068fbee9480e434a44e606b4cc616d2116af9092cc444a0fd"} Oct 09 08:05:45 crc kubenswrapper[4715]: I1009 08:05:45.284168 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"05e9f66e-c78b-4506-b4d1-93c82402efdc","Type":"ContainerStarted","Data":"f119d75b141b5624fec2caa5dc8783e4b9ada11d759df8c696737c6a5e27fce0"} Oct 09 08:05:45 crc kubenswrapper[4715]: I1009 08:05:45.284210 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"05e9f66e-c78b-4506-b4d1-93c82402efdc","Type":"ContainerStarted","Data":"0431e6f11bae145045463e3a59df81f14b8994e473b3d88d68b9ca691242cc6c"} Oct 09 08:05:45 crc kubenswrapper[4715]: I1009 
08:05:45.300119 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.300096387 podStartE2EDuration="2.300096387s" podCreationTimestamp="2025-10-09 08:05:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:05:45.29570635 +0000 UTC m=+1175.988510358" watchObservedRunningTime="2025-10-09 08:05:45.300096387 +0000 UTC m=+1175.992900395" Oct 09 08:05:46 crc kubenswrapper[4715]: I1009 08:05:46.149625 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b321002e-eb01-4e98-902e-7e3848db47a3" path="/var/lib/kubelet/pods/b321002e-eb01-4e98-902e-7e3848db47a3/volumes" Oct 09 08:05:46 crc kubenswrapper[4715]: I1009 08:05:46.293562 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"05e9f66e-c78b-4506-b4d1-93c82402efdc","Type":"ContainerStarted","Data":"7557437e362bec4f79a115a069822c356291ca9963ddbab97f253d8440e168dc"} Oct 09 08:05:46 crc kubenswrapper[4715]: I1009 08:05:46.312080 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.312060142 podStartE2EDuration="2.312060142s" podCreationTimestamp="2025-10-09 08:05:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:05:46.309451286 +0000 UTC m=+1177.002255294" watchObservedRunningTime="2025-10-09 08:05:46.312060142 +0000 UTC m=+1177.004864160" Oct 09 08:05:46 crc kubenswrapper[4715]: I1009 08:05:46.611636 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 09 08:05:46 crc kubenswrapper[4715]: I1009 08:05:46.611705 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 09 08:05:46 crc kubenswrapper[4715]: I1009 08:05:46.753892 
4715 patch_prober.go:28] interesting pod/machine-config-daemon-k7vwx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 08:05:46 crc kubenswrapper[4715]: I1009 08:05:46.753972 4715 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 08:05:48 crc kubenswrapper[4715]: I1009 08:05:48.679729 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 09 08:05:48 crc kubenswrapper[4715]: I1009 08:05:48.935925 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 09 08:05:50 crc kubenswrapper[4715]: I1009 08:05:50.160597 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 09 08:05:51 crc kubenswrapper[4715]: I1009 08:05:51.611541 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 09 08:05:51 crc kubenswrapper[4715]: I1009 08:05:51.611622 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 09 08:05:52 crc kubenswrapper[4715]: I1009 08:05:52.624640 4715 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="377c1e43-3538-413a-9144-85708016acca" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 09 08:05:52 crc kubenswrapper[4715]: I1009 08:05:52.624669 4715 prober.go:107] "Probe 
failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="377c1e43-3538-413a-9144-85708016acca" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 09 08:05:53 crc kubenswrapper[4715]: I1009 08:05:53.568549 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 08:05:53 crc kubenswrapper[4715]: I1009 08:05:53.569090 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="c264ae36-c2f5-46f7-ae4f-9d4464b2a57f" containerName="kube-state-metrics" containerID="cri-o://dc51de515e8939922eba285d210b67814e95a1d30d3ad2733c0e3a2e7881d56c" gracePeriod=30 Oct 09 08:05:53 crc kubenswrapper[4715]: I1009 08:05:53.936581 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 09 08:05:53 crc kubenswrapper[4715]: I1009 08:05:53.971332 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 09 08:05:54 crc kubenswrapper[4715]: I1009 08:05:54.072462 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 09 08:05:54 crc kubenswrapper[4715]: I1009 08:05:54.234538 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7f7t\" (UniqueName: \"kubernetes.io/projected/c264ae36-c2f5-46f7-ae4f-9d4464b2a57f-kube-api-access-r7f7t\") pod \"c264ae36-c2f5-46f7-ae4f-9d4464b2a57f\" (UID: \"c264ae36-c2f5-46f7-ae4f-9d4464b2a57f\") " Oct 09 08:05:54 crc kubenswrapper[4715]: I1009 08:05:54.240691 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c264ae36-c2f5-46f7-ae4f-9d4464b2a57f-kube-api-access-r7f7t" (OuterVolumeSpecName: "kube-api-access-r7f7t") pod "c264ae36-c2f5-46f7-ae4f-9d4464b2a57f" (UID: "c264ae36-c2f5-46f7-ae4f-9d4464b2a57f"). InnerVolumeSpecName "kube-api-access-r7f7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:05:54 crc kubenswrapper[4715]: I1009 08:05:54.337399 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7f7t\" (UniqueName: \"kubernetes.io/projected/c264ae36-c2f5-46f7-ae4f-9d4464b2a57f-kube-api-access-r7f7t\") on node \"crc\" DevicePath \"\"" Oct 09 08:05:54 crc kubenswrapper[4715]: I1009 08:05:54.369628 4715 generic.go:334] "Generic (PLEG): container finished" podID="c264ae36-c2f5-46f7-ae4f-9d4464b2a57f" containerID="dc51de515e8939922eba285d210b67814e95a1d30d3ad2733c0e3a2e7881d56c" exitCode=2 Oct 09 08:05:54 crc kubenswrapper[4715]: I1009 08:05:54.369679 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c264ae36-c2f5-46f7-ae4f-9d4464b2a57f","Type":"ContainerDied","Data":"dc51de515e8939922eba285d210b67814e95a1d30d3ad2733c0e3a2e7881d56c"} Oct 09 08:05:54 crc kubenswrapper[4715]: I1009 08:05:54.369716 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 09 08:05:54 crc kubenswrapper[4715]: I1009 08:05:54.369743 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c264ae36-c2f5-46f7-ae4f-9d4464b2a57f","Type":"ContainerDied","Data":"6ce20347f60d0fe26642e9d88b4c04f3dad04b1a0dd9e7db258e360a5ac59907"} Oct 09 08:05:54 crc kubenswrapper[4715]: I1009 08:05:54.369772 4715 scope.go:117] "RemoveContainer" containerID="dc51de515e8939922eba285d210b67814e95a1d30d3ad2733c0e3a2e7881d56c" Oct 09 08:05:54 crc kubenswrapper[4715]: I1009 08:05:54.392766 4715 scope.go:117] "RemoveContainer" containerID="dc51de515e8939922eba285d210b67814e95a1d30d3ad2733c0e3a2e7881d56c" Oct 09 08:05:54 crc kubenswrapper[4715]: E1009 08:05:54.396323 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc51de515e8939922eba285d210b67814e95a1d30d3ad2733c0e3a2e7881d56c\": container with ID starting with dc51de515e8939922eba285d210b67814e95a1d30d3ad2733c0e3a2e7881d56c not found: ID does not exist" containerID="dc51de515e8939922eba285d210b67814e95a1d30d3ad2733c0e3a2e7881d56c" Oct 09 08:05:54 crc kubenswrapper[4715]: I1009 08:05:54.396458 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc51de515e8939922eba285d210b67814e95a1d30d3ad2733c0e3a2e7881d56c"} err="failed to get container status \"dc51de515e8939922eba285d210b67814e95a1d30d3ad2733c0e3a2e7881d56c\": rpc error: code = NotFound desc = could not find container \"dc51de515e8939922eba285d210b67814e95a1d30d3ad2733c0e3a2e7881d56c\": container with ID starting with dc51de515e8939922eba285d210b67814e95a1d30d3ad2733c0e3a2e7881d56c not found: ID does not exist" Oct 09 08:05:54 crc kubenswrapper[4715]: I1009 08:05:54.401968 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 09 08:05:54 crc kubenswrapper[4715]: I1009 
08:05:54.444534 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 08:05:54 crc kubenswrapper[4715]: I1009 08:05:54.471000 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 08:05:54 crc kubenswrapper[4715]: I1009 08:05:54.485523 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 08:05:54 crc kubenswrapper[4715]: E1009 08:05:54.486004 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c264ae36-c2f5-46f7-ae4f-9d4464b2a57f" containerName="kube-state-metrics" Oct 09 08:05:54 crc kubenswrapper[4715]: I1009 08:05:54.486020 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="c264ae36-c2f5-46f7-ae4f-9d4464b2a57f" containerName="kube-state-metrics" Oct 09 08:05:54 crc kubenswrapper[4715]: I1009 08:05:54.486214 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="c264ae36-c2f5-46f7-ae4f-9d4464b2a57f" containerName="kube-state-metrics" Oct 09 08:05:54 crc kubenswrapper[4715]: I1009 08:05:54.486985 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 09 08:05:54 crc kubenswrapper[4715]: I1009 08:05:54.489472 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 09 08:05:54 crc kubenswrapper[4715]: I1009 08:05:54.489999 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 09 08:05:54 crc kubenswrapper[4715]: I1009 08:05:54.494339 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 08:05:54 crc kubenswrapper[4715]: I1009 08:05:54.569743 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 09 08:05:54 crc kubenswrapper[4715]: I1009 08:05:54.569793 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 09 08:05:54 crc kubenswrapper[4715]: I1009 08:05:54.643702 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f8c92457-ab26-4a48-b7e1-094eac8532c7-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f8c92457-ab26-4a48-b7e1-094eac8532c7\") " pod="openstack/kube-state-metrics-0" Oct 09 08:05:54 crc kubenswrapper[4715]: I1009 08:05:54.643775 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8c92457-ab26-4a48-b7e1-094eac8532c7-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f8c92457-ab26-4a48-b7e1-094eac8532c7\") " pod="openstack/kube-state-metrics-0" Oct 09 08:05:54 crc kubenswrapper[4715]: I1009 08:05:54.643901 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nt99\" (UniqueName: 
\"kubernetes.io/projected/f8c92457-ab26-4a48-b7e1-094eac8532c7-kube-api-access-9nt99\") pod \"kube-state-metrics-0\" (UID: \"f8c92457-ab26-4a48-b7e1-094eac8532c7\") " pod="openstack/kube-state-metrics-0" Oct 09 08:05:54 crc kubenswrapper[4715]: I1009 08:05:54.644040 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8c92457-ab26-4a48-b7e1-094eac8532c7-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f8c92457-ab26-4a48-b7e1-094eac8532c7\") " pod="openstack/kube-state-metrics-0" Oct 09 08:05:54 crc kubenswrapper[4715]: I1009 08:05:54.746007 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f8c92457-ab26-4a48-b7e1-094eac8532c7-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f8c92457-ab26-4a48-b7e1-094eac8532c7\") " pod="openstack/kube-state-metrics-0" Oct 09 08:05:54 crc kubenswrapper[4715]: I1009 08:05:54.746092 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8c92457-ab26-4a48-b7e1-094eac8532c7-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f8c92457-ab26-4a48-b7e1-094eac8532c7\") " pod="openstack/kube-state-metrics-0" Oct 09 08:05:54 crc kubenswrapper[4715]: I1009 08:05:54.746156 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nt99\" (UniqueName: \"kubernetes.io/projected/f8c92457-ab26-4a48-b7e1-094eac8532c7-kube-api-access-9nt99\") pod \"kube-state-metrics-0\" (UID: \"f8c92457-ab26-4a48-b7e1-094eac8532c7\") " pod="openstack/kube-state-metrics-0" Oct 09 08:05:54 crc kubenswrapper[4715]: I1009 08:05:54.746226 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f8c92457-ab26-4a48-b7e1-094eac8532c7-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f8c92457-ab26-4a48-b7e1-094eac8532c7\") " pod="openstack/kube-state-metrics-0" Oct 09 08:05:54 crc kubenswrapper[4715]: I1009 08:05:54.750218 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f8c92457-ab26-4a48-b7e1-094eac8532c7-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f8c92457-ab26-4a48-b7e1-094eac8532c7\") " pod="openstack/kube-state-metrics-0" Oct 09 08:05:54 crc kubenswrapper[4715]: I1009 08:05:54.750386 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8c92457-ab26-4a48-b7e1-094eac8532c7-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f8c92457-ab26-4a48-b7e1-094eac8532c7\") " pod="openstack/kube-state-metrics-0" Oct 09 08:05:54 crc kubenswrapper[4715]: I1009 08:05:54.751044 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8c92457-ab26-4a48-b7e1-094eac8532c7-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f8c92457-ab26-4a48-b7e1-094eac8532c7\") " pod="openstack/kube-state-metrics-0" Oct 09 08:05:54 crc kubenswrapper[4715]: I1009 08:05:54.768676 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nt99\" (UniqueName: \"kubernetes.io/projected/f8c92457-ab26-4a48-b7e1-094eac8532c7-kube-api-access-9nt99\") pod \"kube-state-metrics-0\" (UID: \"f8c92457-ab26-4a48-b7e1-094eac8532c7\") " pod="openstack/kube-state-metrics-0" Oct 09 08:05:54 crc kubenswrapper[4715]: I1009 08:05:54.808403 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 09 08:05:55 crc kubenswrapper[4715]: I1009 08:05:55.298870 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 09 08:05:55 crc kubenswrapper[4715]: W1009 08:05:55.308143 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8c92457_ab26_4a48_b7e1_094eac8532c7.slice/crio-e7b097f128d139cb6b95e6914002fb2addba158b275fc44dd30e675f2d37dc72 WatchSource:0}: Error finding container e7b097f128d139cb6b95e6914002fb2addba158b275fc44dd30e675f2d37dc72: Status 404 returned error can't find the container with id e7b097f128d139cb6b95e6914002fb2addba158b275fc44dd30e675f2d37dc72 Oct 09 08:05:55 crc kubenswrapper[4715]: I1009 08:05:55.382188 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f8c92457-ab26-4a48-b7e1-094eac8532c7","Type":"ContainerStarted","Data":"e7b097f128d139cb6b95e6914002fb2addba158b275fc44dd30e675f2d37dc72"} Oct 09 08:05:55 crc kubenswrapper[4715]: I1009 08:05:55.477598 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:05:55 crc kubenswrapper[4715]: I1009 08:05:55.478318 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b364ad0f-f451-4668-b657-23e9128a0b5f" containerName="sg-core" containerID="cri-o://37a6d79682203669cef3366b4f04a177609251add88d72fe723b2aaa5e0013db" gracePeriod=30 Oct 09 08:05:55 crc kubenswrapper[4715]: I1009 08:05:55.478369 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b364ad0f-f451-4668-b657-23e9128a0b5f" containerName="ceilometer-notification-agent" containerID="cri-o://f9bf11a64d9129988dec1367f607f8f8aaa7ed3f976a2e85f0846e407867f6bd" gracePeriod=30 Oct 09 08:05:55 crc kubenswrapper[4715]: I1009 08:05:55.478369 4715 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b364ad0f-f451-4668-b657-23e9128a0b5f" containerName="proxy-httpd" containerID="cri-o://4135466cb4a9563e69be24a3f65fc8b195d40b02bedb2667c69b924773b5be60" gracePeriod=30 Oct 09 08:05:55 crc kubenswrapper[4715]: I1009 08:05:55.479023 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b364ad0f-f451-4668-b657-23e9128a0b5f" containerName="ceilometer-central-agent" containerID="cri-o://ed2e7100f5b18d3d952f92059773d0dde256c44bc758ee0e0d5afcdbe157d6d8" gracePeriod=30 Oct 09 08:05:55 crc kubenswrapper[4715]: I1009 08:05:55.611638 4715 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="05e9f66e-c78b-4506-b4d1-93c82402efdc" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.197:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 09 08:05:55 crc kubenswrapper[4715]: I1009 08:05:55.612070 4715 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="05e9f66e-c78b-4506-b4d1-93c82402efdc" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.197:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 09 08:05:56 crc kubenswrapper[4715]: I1009 08:05:56.147297 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c264ae36-c2f5-46f7-ae4f-9d4464b2a57f" path="/var/lib/kubelet/pods/c264ae36-c2f5-46f7-ae4f-9d4464b2a57f/volumes" Oct 09 08:05:56 crc kubenswrapper[4715]: I1009 08:05:56.396040 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f8c92457-ab26-4a48-b7e1-094eac8532c7","Type":"ContainerStarted","Data":"5cf273fc3e882a725ed63e2572fc1c7cda613ce90feb08a2a7be2d9e92ed8f15"} Oct 09 08:05:56 crc kubenswrapper[4715]: I1009 08:05:56.399538 4715 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 09 08:05:56 crc kubenswrapper[4715]: I1009 08:05:56.404031 4715 generic.go:334] "Generic (PLEG): container finished" podID="b364ad0f-f451-4668-b657-23e9128a0b5f" containerID="4135466cb4a9563e69be24a3f65fc8b195d40b02bedb2667c69b924773b5be60" exitCode=0 Oct 09 08:05:56 crc kubenswrapper[4715]: I1009 08:05:56.404059 4715 generic.go:334] "Generic (PLEG): container finished" podID="b364ad0f-f451-4668-b657-23e9128a0b5f" containerID="37a6d79682203669cef3366b4f04a177609251add88d72fe723b2aaa5e0013db" exitCode=2 Oct 09 08:05:56 crc kubenswrapper[4715]: I1009 08:05:56.404068 4715 generic.go:334] "Generic (PLEG): container finished" podID="b364ad0f-f451-4668-b657-23e9128a0b5f" containerID="ed2e7100f5b18d3d952f92059773d0dde256c44bc758ee0e0d5afcdbe157d6d8" exitCode=0 Oct 09 08:05:56 crc kubenswrapper[4715]: I1009 08:05:56.404088 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b364ad0f-f451-4668-b657-23e9128a0b5f","Type":"ContainerDied","Data":"4135466cb4a9563e69be24a3f65fc8b195d40b02bedb2667c69b924773b5be60"} Oct 09 08:05:56 crc kubenswrapper[4715]: I1009 08:05:56.404121 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b364ad0f-f451-4668-b657-23e9128a0b5f","Type":"ContainerDied","Data":"37a6d79682203669cef3366b4f04a177609251add88d72fe723b2aaa5e0013db"} Oct 09 08:05:56 crc kubenswrapper[4715]: I1009 08:05:56.404131 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b364ad0f-f451-4668-b657-23e9128a0b5f","Type":"ContainerDied","Data":"ed2e7100f5b18d3d952f92059773d0dde256c44bc758ee0e0d5afcdbe157d6d8"} Oct 09 08:05:56 crc kubenswrapper[4715]: I1009 08:05:56.422544 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.9826676540000001 podStartE2EDuration="2.42252503s" 
podCreationTimestamp="2025-10-09 08:05:54 +0000 UTC" firstStartedPulling="2025-10-09 08:05:55.31051586 +0000 UTC m=+1186.003319868" lastFinishedPulling="2025-10-09 08:05:55.750373246 +0000 UTC m=+1186.443177244" observedRunningTime="2025-10-09 08:05:56.41662665 +0000 UTC m=+1187.109430658" watchObservedRunningTime="2025-10-09 08:05:56.42252503 +0000 UTC m=+1187.115329038" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.323706 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.442896 4715 generic.go:334] "Generic (PLEG): container finished" podID="b364ad0f-f451-4668-b657-23e9128a0b5f" containerID="f9bf11a64d9129988dec1367f607f8f8aaa7ed3f976a2e85f0846e407867f6bd" exitCode=0 Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.443173 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b364ad0f-f451-4668-b657-23e9128a0b5f","Type":"ContainerDied","Data":"f9bf11a64d9129988dec1367f607f8f8aaa7ed3f976a2e85f0846e407867f6bd"} Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.443198 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b364ad0f-f451-4668-b657-23e9128a0b5f","Type":"ContainerDied","Data":"db39cf168ce987b5aa1dfcd5f51803b5deb2bcb5ff109b4f64888473fa7c959a"} Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.443215 4715 scope.go:117] "RemoveContainer" containerID="4135466cb4a9563e69be24a3f65fc8b195d40b02bedb2667c69b924773b5be60" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.443341 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.446087 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b364ad0f-f451-4668-b657-23e9128a0b5f-scripts\") pod \"b364ad0f-f451-4668-b657-23e9128a0b5f\" (UID: \"b364ad0f-f451-4668-b657-23e9128a0b5f\") " Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.446338 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b364ad0f-f451-4668-b657-23e9128a0b5f-sg-core-conf-yaml\") pod \"b364ad0f-f451-4668-b657-23e9128a0b5f\" (UID: \"b364ad0f-f451-4668-b657-23e9128a0b5f\") " Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.446381 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b364ad0f-f451-4668-b657-23e9128a0b5f-config-data\") pod \"b364ad0f-f451-4668-b657-23e9128a0b5f\" (UID: \"b364ad0f-f451-4668-b657-23e9128a0b5f\") " Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.446539 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwmfg\" (UniqueName: \"kubernetes.io/projected/b364ad0f-f451-4668-b657-23e9128a0b5f-kube-api-access-pwmfg\") pod \"b364ad0f-f451-4668-b657-23e9128a0b5f\" (UID: \"b364ad0f-f451-4668-b657-23e9128a0b5f\") " Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.446574 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b364ad0f-f451-4668-b657-23e9128a0b5f-run-httpd\") pod \"b364ad0f-f451-4668-b657-23e9128a0b5f\" (UID: \"b364ad0f-f451-4668-b657-23e9128a0b5f\") " Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.446595 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/b364ad0f-f451-4668-b657-23e9128a0b5f-log-httpd\") pod \"b364ad0f-f451-4668-b657-23e9128a0b5f\" (UID: \"b364ad0f-f451-4668-b657-23e9128a0b5f\") " Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.446623 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b364ad0f-f451-4668-b657-23e9128a0b5f-combined-ca-bundle\") pod \"b364ad0f-f451-4668-b657-23e9128a0b5f\" (UID: \"b364ad0f-f451-4668-b657-23e9128a0b5f\") " Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.448502 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b364ad0f-f451-4668-b657-23e9128a0b5f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b364ad0f-f451-4668-b657-23e9128a0b5f" (UID: "b364ad0f-f451-4668-b657-23e9128a0b5f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.448715 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b364ad0f-f451-4668-b657-23e9128a0b5f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b364ad0f-f451-4668-b657-23e9128a0b5f" (UID: "b364ad0f-f451-4668-b657-23e9128a0b5f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.452760 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b364ad0f-f451-4668-b657-23e9128a0b5f-scripts" (OuterVolumeSpecName: "scripts") pod "b364ad0f-f451-4668-b657-23e9128a0b5f" (UID: "b364ad0f-f451-4668-b657-23e9128a0b5f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.456138 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b364ad0f-f451-4668-b657-23e9128a0b5f-kube-api-access-pwmfg" (OuterVolumeSpecName: "kube-api-access-pwmfg") pod "b364ad0f-f451-4668-b657-23e9128a0b5f" (UID: "b364ad0f-f451-4668-b657-23e9128a0b5f"). InnerVolumeSpecName "kube-api-access-pwmfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.464512 4715 scope.go:117] "RemoveContainer" containerID="37a6d79682203669cef3366b4f04a177609251add88d72fe723b2aaa5e0013db" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.482057 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b364ad0f-f451-4668-b657-23e9128a0b5f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b364ad0f-f451-4668-b657-23e9128a0b5f" (UID: "b364ad0f-f451-4668-b657-23e9128a0b5f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.539498 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b364ad0f-f451-4668-b657-23e9128a0b5f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b364ad0f-f451-4668-b657-23e9128a0b5f" (UID: "b364ad0f-f451-4668-b657-23e9128a0b5f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.549784 4715 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b364ad0f-f451-4668-b657-23e9128a0b5f-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.549817 4715 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b364ad0f-f451-4668-b657-23e9128a0b5f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.549828 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwmfg\" (UniqueName: \"kubernetes.io/projected/b364ad0f-f451-4668-b657-23e9128a0b5f-kube-api-access-pwmfg\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.549837 4715 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b364ad0f-f451-4668-b657-23e9128a0b5f-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.549848 4715 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b364ad0f-f451-4668-b657-23e9128a0b5f-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.549855 4715 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b364ad0f-f451-4668-b657-23e9128a0b5f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.551982 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b364ad0f-f451-4668-b657-23e9128a0b5f-config-data" (OuterVolumeSpecName: "config-data") pod "b364ad0f-f451-4668-b657-23e9128a0b5f" (UID: "b364ad0f-f451-4668-b657-23e9128a0b5f"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.568513 4715 scope.go:117] "RemoveContainer" containerID="f9bf11a64d9129988dec1367f607f8f8aaa7ed3f976a2e85f0846e407867f6bd" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.587284 4715 scope.go:117] "RemoveContainer" containerID="ed2e7100f5b18d3d952f92059773d0dde256c44bc758ee0e0d5afcdbe157d6d8" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.606276 4715 scope.go:117] "RemoveContainer" containerID="4135466cb4a9563e69be24a3f65fc8b195d40b02bedb2667c69b924773b5be60" Oct 09 08:06:00 crc kubenswrapper[4715]: E1009 08:06:00.606958 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4135466cb4a9563e69be24a3f65fc8b195d40b02bedb2667c69b924773b5be60\": container with ID starting with 4135466cb4a9563e69be24a3f65fc8b195d40b02bedb2667c69b924773b5be60 not found: ID does not exist" containerID="4135466cb4a9563e69be24a3f65fc8b195d40b02bedb2667c69b924773b5be60" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.606995 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4135466cb4a9563e69be24a3f65fc8b195d40b02bedb2667c69b924773b5be60"} err="failed to get container status \"4135466cb4a9563e69be24a3f65fc8b195d40b02bedb2667c69b924773b5be60\": rpc error: code = NotFound desc = could not find container \"4135466cb4a9563e69be24a3f65fc8b195d40b02bedb2667c69b924773b5be60\": container with ID starting with 4135466cb4a9563e69be24a3f65fc8b195d40b02bedb2667c69b924773b5be60 not found: ID does not exist" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.607016 4715 scope.go:117] "RemoveContainer" containerID="37a6d79682203669cef3366b4f04a177609251add88d72fe723b2aaa5e0013db" Oct 09 08:06:00 crc kubenswrapper[4715]: E1009 08:06:00.607297 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"37a6d79682203669cef3366b4f04a177609251add88d72fe723b2aaa5e0013db\": container with ID starting with 37a6d79682203669cef3366b4f04a177609251add88d72fe723b2aaa5e0013db not found: ID does not exist" containerID="37a6d79682203669cef3366b4f04a177609251add88d72fe723b2aaa5e0013db" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.607338 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37a6d79682203669cef3366b4f04a177609251add88d72fe723b2aaa5e0013db"} err="failed to get container status \"37a6d79682203669cef3366b4f04a177609251add88d72fe723b2aaa5e0013db\": rpc error: code = NotFound desc = could not find container \"37a6d79682203669cef3366b4f04a177609251add88d72fe723b2aaa5e0013db\": container with ID starting with 37a6d79682203669cef3366b4f04a177609251add88d72fe723b2aaa5e0013db not found: ID does not exist" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.607366 4715 scope.go:117] "RemoveContainer" containerID="f9bf11a64d9129988dec1367f607f8f8aaa7ed3f976a2e85f0846e407867f6bd" Oct 09 08:06:00 crc kubenswrapper[4715]: E1009 08:06:00.607733 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9bf11a64d9129988dec1367f607f8f8aaa7ed3f976a2e85f0846e407867f6bd\": container with ID starting with f9bf11a64d9129988dec1367f607f8f8aaa7ed3f976a2e85f0846e407867f6bd not found: ID does not exist" containerID="f9bf11a64d9129988dec1367f607f8f8aaa7ed3f976a2e85f0846e407867f6bd" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.607759 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9bf11a64d9129988dec1367f607f8f8aaa7ed3f976a2e85f0846e407867f6bd"} err="failed to get container status \"f9bf11a64d9129988dec1367f607f8f8aaa7ed3f976a2e85f0846e407867f6bd\": rpc error: code = NotFound desc = could not find container 
\"f9bf11a64d9129988dec1367f607f8f8aaa7ed3f976a2e85f0846e407867f6bd\": container with ID starting with f9bf11a64d9129988dec1367f607f8f8aaa7ed3f976a2e85f0846e407867f6bd not found: ID does not exist" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.607773 4715 scope.go:117] "RemoveContainer" containerID="ed2e7100f5b18d3d952f92059773d0dde256c44bc758ee0e0d5afcdbe157d6d8" Oct 09 08:06:00 crc kubenswrapper[4715]: E1009 08:06:00.608122 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed2e7100f5b18d3d952f92059773d0dde256c44bc758ee0e0d5afcdbe157d6d8\": container with ID starting with ed2e7100f5b18d3d952f92059773d0dde256c44bc758ee0e0d5afcdbe157d6d8 not found: ID does not exist" containerID="ed2e7100f5b18d3d952f92059773d0dde256c44bc758ee0e0d5afcdbe157d6d8" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.608148 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed2e7100f5b18d3d952f92059773d0dde256c44bc758ee0e0d5afcdbe157d6d8"} err="failed to get container status \"ed2e7100f5b18d3d952f92059773d0dde256c44bc758ee0e0d5afcdbe157d6d8\": rpc error: code = NotFound desc = could not find container \"ed2e7100f5b18d3d952f92059773d0dde256c44bc758ee0e0d5afcdbe157d6d8\": container with ID starting with ed2e7100f5b18d3d952f92059773d0dde256c44bc758ee0e0d5afcdbe157d6d8 not found: ID does not exist" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.652022 4715 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b364ad0f-f451-4668-b657-23e9128a0b5f-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.782581 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.804029 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 09 
08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.814309 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:06:00 crc kubenswrapper[4715]: E1009 08:06:00.814752 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b364ad0f-f451-4668-b657-23e9128a0b5f" containerName="proxy-httpd" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.814772 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="b364ad0f-f451-4668-b657-23e9128a0b5f" containerName="proxy-httpd" Oct 09 08:06:00 crc kubenswrapper[4715]: E1009 08:06:00.814793 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b364ad0f-f451-4668-b657-23e9128a0b5f" containerName="sg-core" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.814799 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="b364ad0f-f451-4668-b657-23e9128a0b5f" containerName="sg-core" Oct 09 08:06:00 crc kubenswrapper[4715]: E1009 08:06:00.814812 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b364ad0f-f451-4668-b657-23e9128a0b5f" containerName="ceilometer-notification-agent" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.814819 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="b364ad0f-f451-4668-b657-23e9128a0b5f" containerName="ceilometer-notification-agent" Oct 09 08:06:00 crc kubenswrapper[4715]: E1009 08:06:00.814849 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b364ad0f-f451-4668-b657-23e9128a0b5f" containerName="ceilometer-central-agent" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.814855 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="b364ad0f-f451-4668-b657-23e9128a0b5f" containerName="ceilometer-central-agent" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.815044 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="b364ad0f-f451-4668-b657-23e9128a0b5f" containerName="ceilometer-notification-agent" Oct 09 08:06:00 crc 
kubenswrapper[4715]: I1009 08:06:00.815068 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="b364ad0f-f451-4668-b657-23e9128a0b5f" containerName="sg-core" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.815084 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="b364ad0f-f451-4668-b657-23e9128a0b5f" containerName="ceilometer-central-agent" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.815100 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="b364ad0f-f451-4668-b657-23e9128a0b5f" containerName="proxy-httpd" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.817546 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.821351 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.821458 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.821604 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.821989 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.959590 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1\") " pod="openstack/ceilometer-0" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.959726 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1\") " pod="openstack/ceilometer-0" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.959778 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-scripts\") pod \"ceilometer-0\" (UID: \"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1\") " pod="openstack/ceilometer-0" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.959797 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-run-httpd\") pod \"ceilometer-0\" (UID: \"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1\") " pod="openstack/ceilometer-0" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.959814 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-config-data\") pod \"ceilometer-0\" (UID: \"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1\") " pod="openstack/ceilometer-0" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.959858 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1\") " pod="openstack/ceilometer-0" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.959985 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvhfn\" (UniqueName: \"kubernetes.io/projected/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-kube-api-access-pvhfn\") pod \"ceilometer-0\" (UID: 
\"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1\") " pod="openstack/ceilometer-0" Oct 09 08:06:00 crc kubenswrapper[4715]: I1009 08:06:00.960048 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-log-httpd\") pod \"ceilometer-0\" (UID: \"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1\") " pod="openstack/ceilometer-0" Oct 09 08:06:01 crc kubenswrapper[4715]: I1009 08:06:01.061991 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1\") " pod="openstack/ceilometer-0" Oct 09 08:06:01 crc kubenswrapper[4715]: I1009 08:06:01.062040 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvhfn\" (UniqueName: \"kubernetes.io/projected/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-kube-api-access-pvhfn\") pod \"ceilometer-0\" (UID: \"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1\") " pod="openstack/ceilometer-0" Oct 09 08:06:01 crc kubenswrapper[4715]: I1009 08:06:01.062074 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-log-httpd\") pod \"ceilometer-0\" (UID: \"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1\") " pod="openstack/ceilometer-0" Oct 09 08:06:01 crc kubenswrapper[4715]: I1009 08:06:01.062165 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1\") " pod="openstack/ceilometer-0" Oct 09 08:06:01 crc kubenswrapper[4715]: I1009 08:06:01.062225 4715 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1\") " pod="openstack/ceilometer-0" Oct 09 08:06:01 crc kubenswrapper[4715]: I1009 08:06:01.062270 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-scripts\") pod \"ceilometer-0\" (UID: \"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1\") " pod="openstack/ceilometer-0" Oct 09 08:06:01 crc kubenswrapper[4715]: I1009 08:06:01.062289 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-run-httpd\") pod \"ceilometer-0\" (UID: \"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1\") " pod="openstack/ceilometer-0" Oct 09 08:06:01 crc kubenswrapper[4715]: I1009 08:06:01.062310 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-config-data\") pod \"ceilometer-0\" (UID: \"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1\") " pod="openstack/ceilometer-0" Oct 09 08:06:01 crc kubenswrapper[4715]: I1009 08:06:01.062767 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-log-httpd\") pod \"ceilometer-0\" (UID: \"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1\") " pod="openstack/ceilometer-0" Oct 09 08:06:01 crc kubenswrapper[4715]: I1009 08:06:01.062790 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-run-httpd\") pod \"ceilometer-0\" (UID: \"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1\") " pod="openstack/ceilometer-0" Oct 09 08:06:01 crc 
kubenswrapper[4715]: I1009 08:06:01.066808 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-config-data\") pod \"ceilometer-0\" (UID: \"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1\") " pod="openstack/ceilometer-0" Oct 09 08:06:01 crc kubenswrapper[4715]: I1009 08:06:01.067028 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1\") " pod="openstack/ceilometer-0" Oct 09 08:06:01 crc kubenswrapper[4715]: I1009 08:06:01.067357 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1\") " pod="openstack/ceilometer-0" Oct 09 08:06:01 crc kubenswrapper[4715]: I1009 08:06:01.067391 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-scripts\") pod \"ceilometer-0\" (UID: \"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1\") " pod="openstack/ceilometer-0" Oct 09 08:06:01 crc kubenswrapper[4715]: I1009 08:06:01.067744 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1\") " pod="openstack/ceilometer-0" Oct 09 08:06:01 crc kubenswrapper[4715]: I1009 08:06:01.090308 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvhfn\" (UniqueName: \"kubernetes.io/projected/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-kube-api-access-pvhfn\") pod \"ceilometer-0\" 
(UID: \"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1\") " pod="openstack/ceilometer-0" Oct 09 08:06:01 crc kubenswrapper[4715]: I1009 08:06:01.150557 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 08:06:01 crc kubenswrapper[4715]: I1009 08:06:01.588368 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:06:01 crc kubenswrapper[4715]: I1009 08:06:01.617991 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 09 08:06:01 crc kubenswrapper[4715]: I1009 08:06:01.619448 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 09 08:06:01 crc kubenswrapper[4715]: I1009 08:06:01.626971 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 09 08:06:02 crc kubenswrapper[4715]: I1009 08:06:02.152260 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b364ad0f-f451-4668-b657-23e9128a0b5f" path="/var/lib/kubelet/pods/b364ad0f-f451-4668-b657-23e9128a0b5f/volumes" Oct 09 08:06:02 crc kubenswrapper[4715]: I1009 08:06:02.475308 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1","Type":"ContainerStarted","Data":"dcfd223b439e80daed64930d14ff70ccaa8b0a9dbac78d5dab97e3dc394bf8e1"} Oct 09 08:06:02 crc kubenswrapper[4715]: I1009 08:06:02.475395 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1","Type":"ContainerStarted","Data":"662a2f87a92cff032a2c1d4cd21a69eda1b0b05d6421a8d347e60547cdd11b63"} Oct 09 08:06:02 crc kubenswrapper[4715]: I1009 08:06:02.482892 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 09 08:06:03 crc kubenswrapper[4715]: I1009 08:06:03.487159 4715 
generic.go:334] "Generic (PLEG): container finished" podID="9480540d-4b2a-40ea-b63b-e695c8e0a1b5" containerID="f57ca25581bd4ae4ec705d3a73279094b06babbe39783df93d6f13176b690e48" exitCode=137 Oct 09 08:06:03 crc kubenswrapper[4715]: I1009 08:06:03.487232 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9480540d-4b2a-40ea-b63b-e695c8e0a1b5","Type":"ContainerDied","Data":"f57ca25581bd4ae4ec705d3a73279094b06babbe39783df93d6f13176b690e48"} Oct 09 08:06:03 crc kubenswrapper[4715]: I1009 08:06:03.487767 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9480540d-4b2a-40ea-b63b-e695c8e0a1b5","Type":"ContainerDied","Data":"620cf1e6e43b698cd27faa2afe51ee7ef6693fd6deff707e1cd895229d14c29f"} Oct 09 08:06:03 crc kubenswrapper[4715]: I1009 08:06:03.487785 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="620cf1e6e43b698cd27faa2afe51ee7ef6693fd6deff707e1cd895229d14c29f" Oct 09 08:06:03 crc kubenswrapper[4715]: I1009 08:06:03.490048 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1","Type":"ContainerStarted","Data":"9a47931665bda3f43bb419917d365dcf21285aa403475254f1f61d5d57a6c783"} Oct 09 08:06:03 crc kubenswrapper[4715]: I1009 08:06:03.543492 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 09 08:06:03 crc kubenswrapper[4715]: I1009 08:06:03.716891 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9480540d-4b2a-40ea-b63b-e695c8e0a1b5-config-data\") pod \"9480540d-4b2a-40ea-b63b-e695c8e0a1b5\" (UID: \"9480540d-4b2a-40ea-b63b-e695c8e0a1b5\") " Oct 09 08:06:03 crc kubenswrapper[4715]: I1009 08:06:03.717406 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phs7g\" (UniqueName: \"kubernetes.io/projected/9480540d-4b2a-40ea-b63b-e695c8e0a1b5-kube-api-access-phs7g\") pod \"9480540d-4b2a-40ea-b63b-e695c8e0a1b5\" (UID: \"9480540d-4b2a-40ea-b63b-e695c8e0a1b5\") " Oct 09 08:06:03 crc kubenswrapper[4715]: I1009 08:06:03.717482 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9480540d-4b2a-40ea-b63b-e695c8e0a1b5-combined-ca-bundle\") pod \"9480540d-4b2a-40ea-b63b-e695c8e0a1b5\" (UID: \"9480540d-4b2a-40ea-b63b-e695c8e0a1b5\") " Oct 09 08:06:03 crc kubenswrapper[4715]: I1009 08:06:03.720923 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9480540d-4b2a-40ea-b63b-e695c8e0a1b5-kube-api-access-phs7g" (OuterVolumeSpecName: "kube-api-access-phs7g") pod "9480540d-4b2a-40ea-b63b-e695c8e0a1b5" (UID: "9480540d-4b2a-40ea-b63b-e695c8e0a1b5"). InnerVolumeSpecName "kube-api-access-phs7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:06:03 crc kubenswrapper[4715]: I1009 08:06:03.742787 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9480540d-4b2a-40ea-b63b-e695c8e0a1b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9480540d-4b2a-40ea-b63b-e695c8e0a1b5" (UID: "9480540d-4b2a-40ea-b63b-e695c8e0a1b5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:06:03 crc kubenswrapper[4715]: I1009 08:06:03.743637 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9480540d-4b2a-40ea-b63b-e695c8e0a1b5-config-data" (OuterVolumeSpecName: "config-data") pod "9480540d-4b2a-40ea-b63b-e695c8e0a1b5" (UID: "9480540d-4b2a-40ea-b63b-e695c8e0a1b5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:06:03 crc kubenswrapper[4715]: I1009 08:06:03.819683 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phs7g\" (UniqueName: \"kubernetes.io/projected/9480540d-4b2a-40ea-b63b-e695c8e0a1b5-kube-api-access-phs7g\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:03 crc kubenswrapper[4715]: I1009 08:06:03.819732 4715 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9480540d-4b2a-40ea-b63b-e695c8e0a1b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:03 crc kubenswrapper[4715]: I1009 08:06:03.819743 4715 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9480540d-4b2a-40ea-b63b-e695c8e0a1b5-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:04 crc kubenswrapper[4715]: I1009 08:06:04.512562 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1","Type":"ContainerStarted","Data":"eef61d87bc30379006bffd0d1569889ebdfc58f6b917c1ad128a06b0d3c50fe7"} Oct 09 08:06:04 crc kubenswrapper[4715]: I1009 08:06:04.512956 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 09 08:06:04 crc kubenswrapper[4715]: I1009 08:06:04.537511 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 09 08:06:04 crc kubenswrapper[4715]: I1009 08:06:04.551766 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 09 08:06:04 crc kubenswrapper[4715]: I1009 08:06:04.567773 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 09 08:06:04 crc kubenswrapper[4715]: I1009 08:06:04.569356 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 09 08:06:04 crc kubenswrapper[4715]: I1009 08:06:04.569484 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 09 08:06:04 crc kubenswrapper[4715]: I1009 08:06:04.571736 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 09 08:06:04 crc kubenswrapper[4715]: E1009 08:06:04.572160 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9480540d-4b2a-40ea-b63b-e695c8e0a1b5" containerName="nova-cell1-novncproxy-novncproxy" Oct 09 08:06:04 crc kubenswrapper[4715]: I1009 08:06:04.572182 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="9480540d-4b2a-40ea-b63b-e695c8e0a1b5" containerName="nova-cell1-novncproxy-novncproxy" Oct 09 08:06:04 crc kubenswrapper[4715]: I1009 08:06:04.572368 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="9480540d-4b2a-40ea-b63b-e695c8e0a1b5" containerName="nova-cell1-novncproxy-novncproxy" Oct 09 08:06:04 crc kubenswrapper[4715]: I1009 08:06:04.573039 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 09 08:06:04 crc kubenswrapper[4715]: I1009 08:06:04.577260 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 09 08:06:04 crc kubenswrapper[4715]: I1009 08:06:04.582206 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 09 08:06:04 crc kubenswrapper[4715]: I1009 08:06:04.585365 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 09 08:06:04 crc kubenswrapper[4715]: I1009 08:06:04.586437 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 09 08:06:04 crc kubenswrapper[4715]: I1009 08:06:04.596666 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 09 08:06:04 crc kubenswrapper[4715]: I1009 08:06:04.736995 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/84ca6d97-8374-4d3e-a5e7-e475bd7f89ce-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"84ca6d97-8374-4d3e-a5e7-e475bd7f89ce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 08:06:04 crc kubenswrapper[4715]: I1009 08:06:04.737147 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84ca6d97-8374-4d3e-a5e7-e475bd7f89ce-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"84ca6d97-8374-4d3e-a5e7-e475bd7f89ce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 08:06:04 crc kubenswrapper[4715]: I1009 08:06:04.737192 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/84ca6d97-8374-4d3e-a5e7-e475bd7f89ce-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"84ca6d97-8374-4d3e-a5e7-e475bd7f89ce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 08:06:04 crc kubenswrapper[4715]: I1009 08:06:04.737250 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84ca6d97-8374-4d3e-a5e7-e475bd7f89ce-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"84ca6d97-8374-4d3e-a5e7-e475bd7f89ce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 08:06:04 crc kubenswrapper[4715]: I1009 08:06:04.737378 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7grd\" (UniqueName: \"kubernetes.io/projected/84ca6d97-8374-4d3e-a5e7-e475bd7f89ce-kube-api-access-t7grd\") pod \"nova-cell1-novncproxy-0\" (UID: \"84ca6d97-8374-4d3e-a5e7-e475bd7f89ce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 08:06:04 crc kubenswrapper[4715]: I1009 08:06:04.824172 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 09 08:06:04 crc kubenswrapper[4715]: I1009 08:06:04.838816 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7grd\" (UniqueName: \"kubernetes.io/projected/84ca6d97-8374-4d3e-a5e7-e475bd7f89ce-kube-api-access-t7grd\") pod \"nova-cell1-novncproxy-0\" (UID: \"84ca6d97-8374-4d3e-a5e7-e475bd7f89ce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 08:06:04 crc kubenswrapper[4715]: I1009 08:06:04.838887 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/84ca6d97-8374-4d3e-a5e7-e475bd7f89ce-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"84ca6d97-8374-4d3e-a5e7-e475bd7f89ce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 08:06:04 crc 
kubenswrapper[4715]: I1009 08:06:04.838937 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84ca6d97-8374-4d3e-a5e7-e475bd7f89ce-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"84ca6d97-8374-4d3e-a5e7-e475bd7f89ce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 08:06:04 crc kubenswrapper[4715]: I1009 08:06:04.838967 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/84ca6d97-8374-4d3e-a5e7-e475bd7f89ce-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"84ca6d97-8374-4d3e-a5e7-e475bd7f89ce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 08:06:04 crc kubenswrapper[4715]: I1009 08:06:04.839187 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84ca6d97-8374-4d3e-a5e7-e475bd7f89ce-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"84ca6d97-8374-4d3e-a5e7-e475bd7f89ce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 08:06:04 crc kubenswrapper[4715]: I1009 08:06:04.844634 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/84ca6d97-8374-4d3e-a5e7-e475bd7f89ce-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"84ca6d97-8374-4d3e-a5e7-e475bd7f89ce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 08:06:04 crc kubenswrapper[4715]: I1009 08:06:04.846345 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84ca6d97-8374-4d3e-a5e7-e475bd7f89ce-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"84ca6d97-8374-4d3e-a5e7-e475bd7f89ce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 08:06:04 crc kubenswrapper[4715]: I1009 08:06:04.846780 4715 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84ca6d97-8374-4d3e-a5e7-e475bd7f89ce-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"84ca6d97-8374-4d3e-a5e7-e475bd7f89ce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 08:06:04 crc kubenswrapper[4715]: I1009 08:06:04.849243 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/84ca6d97-8374-4d3e-a5e7-e475bd7f89ce-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"84ca6d97-8374-4d3e-a5e7-e475bd7f89ce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 08:06:04 crc kubenswrapper[4715]: I1009 08:06:04.863002 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7grd\" (UniqueName: \"kubernetes.io/projected/84ca6d97-8374-4d3e-a5e7-e475bd7f89ce-kube-api-access-t7grd\") pod \"nova-cell1-novncproxy-0\" (UID: \"84ca6d97-8374-4d3e-a5e7-e475bd7f89ce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 09 08:06:04 crc kubenswrapper[4715]: I1009 08:06:04.893476 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 09 08:06:05 crc kubenswrapper[4715]: W1009 08:06:05.379742 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84ca6d97_8374_4d3e_a5e7_e475bd7f89ce.slice/crio-35e9f4e888207db8770681ee677a046004ed39f745350b6577dc789740090089 WatchSource:0}: Error finding container 35e9f4e888207db8770681ee677a046004ed39f745350b6577dc789740090089: Status 404 returned error can't find the container with id 35e9f4e888207db8770681ee677a046004ed39f745350b6577dc789740090089 Oct 09 08:06:05 crc kubenswrapper[4715]: I1009 08:06:05.380362 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 09 08:06:05 crc kubenswrapper[4715]: I1009 08:06:05.530464 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"84ca6d97-8374-4d3e-a5e7-e475bd7f89ce","Type":"ContainerStarted","Data":"35e9f4e888207db8770681ee677a046004ed39f745350b6577dc789740090089"} Oct 09 08:06:05 crc kubenswrapper[4715]: I1009 08:06:05.533670 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1","Type":"ContainerStarted","Data":"1f99ac7c3f88f07ed9a777389b9b45290ecbf68f71b4d64f954ff070dd7c2cdf"} Oct 09 08:06:05 crc kubenswrapper[4715]: I1009 08:06:05.534143 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 09 08:06:05 crc kubenswrapper[4715]: I1009 08:06:05.534469 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 09 08:06:05 crc kubenswrapper[4715]: I1009 08:06:05.538083 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 09 08:06:05 crc kubenswrapper[4715]: I1009 08:06:05.567610 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ceilometer-0" podStartSLOduration=2.134442657 podStartE2EDuration="5.567589807s" podCreationTimestamp="2025-10-09 08:06:00 +0000 UTC" firstStartedPulling="2025-10-09 08:06:01.593288502 +0000 UTC m=+1192.286092520" lastFinishedPulling="2025-10-09 08:06:05.026435662 +0000 UTC m=+1195.719239670" observedRunningTime="2025-10-09 08:06:05.559822463 +0000 UTC m=+1196.252626481" watchObservedRunningTime="2025-10-09 08:06:05.567589807 +0000 UTC m=+1196.260393815" Oct 09 08:06:05 crc kubenswrapper[4715]: I1009 08:06:05.740259 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-gf67l"] Oct 09 08:06:05 crc kubenswrapper[4715]: I1009 08:06:05.741902 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-gf67l" Oct 09 08:06:05 crc kubenswrapper[4715]: I1009 08:06:05.788437 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-gf67l"] Oct 09 08:06:05 crc kubenswrapper[4715]: I1009 08:06:05.860674 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07d5a57f-6ce3-4572-849c-baebf00831f1-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-gf67l\" (UID: \"07d5a57f-6ce3-4572-849c-baebf00831f1\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gf67l" Oct 09 08:06:05 crc kubenswrapper[4715]: I1009 08:06:05.861033 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/07d5a57f-6ce3-4572-849c-baebf00831f1-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-gf67l\" (UID: \"07d5a57f-6ce3-4572-849c-baebf00831f1\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gf67l" Oct 09 08:06:05 crc kubenswrapper[4715]: I1009 08:06:05.861112 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/07d5a57f-6ce3-4572-849c-baebf00831f1-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-gf67l\" (UID: \"07d5a57f-6ce3-4572-849c-baebf00831f1\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gf67l" Oct 09 08:06:05 crc kubenswrapper[4715]: I1009 08:06:05.861144 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07d5a57f-6ce3-4572-849c-baebf00831f1-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-gf67l\" (UID: \"07d5a57f-6ce3-4572-849c-baebf00831f1\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gf67l" Oct 09 08:06:05 crc kubenswrapper[4715]: I1009 08:06:05.861221 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07d5a57f-6ce3-4572-849c-baebf00831f1-config\") pod \"dnsmasq-dns-59cf4bdb65-gf67l\" (UID: \"07d5a57f-6ce3-4572-849c-baebf00831f1\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gf67l" Oct 09 08:06:05 crc kubenswrapper[4715]: I1009 08:06:05.861277 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trb5t\" (UniqueName: \"kubernetes.io/projected/07d5a57f-6ce3-4572-849c-baebf00831f1-kube-api-access-trb5t\") pod \"dnsmasq-dns-59cf4bdb65-gf67l\" (UID: \"07d5a57f-6ce3-4572-849c-baebf00831f1\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gf67l" Oct 09 08:06:05 crc kubenswrapper[4715]: I1009 08:06:05.963186 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07d5a57f-6ce3-4572-849c-baebf00831f1-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-gf67l\" (UID: \"07d5a57f-6ce3-4572-849c-baebf00831f1\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gf67l" Oct 09 08:06:05 crc kubenswrapper[4715]: I1009 08:06:05.963259 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/07d5a57f-6ce3-4572-849c-baebf00831f1-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-gf67l\" (UID: \"07d5a57f-6ce3-4572-849c-baebf00831f1\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gf67l" Oct 09 08:06:05 crc kubenswrapper[4715]: I1009 08:06:05.963339 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07d5a57f-6ce3-4572-849c-baebf00831f1-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-gf67l\" (UID: \"07d5a57f-6ce3-4572-849c-baebf00831f1\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gf67l" Oct 09 08:06:05 crc kubenswrapper[4715]: I1009 08:06:05.963369 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07d5a57f-6ce3-4572-849c-baebf00831f1-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-gf67l\" (UID: \"07d5a57f-6ce3-4572-849c-baebf00831f1\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gf67l" Oct 09 08:06:05 crc kubenswrapper[4715]: I1009 08:06:05.963464 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07d5a57f-6ce3-4572-849c-baebf00831f1-config\") pod \"dnsmasq-dns-59cf4bdb65-gf67l\" (UID: \"07d5a57f-6ce3-4572-849c-baebf00831f1\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gf67l" Oct 09 08:06:05 crc kubenswrapper[4715]: I1009 08:06:05.963514 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trb5t\" (UniqueName: \"kubernetes.io/projected/07d5a57f-6ce3-4572-849c-baebf00831f1-kube-api-access-trb5t\") pod \"dnsmasq-dns-59cf4bdb65-gf67l\" (UID: \"07d5a57f-6ce3-4572-849c-baebf00831f1\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gf67l" Oct 09 08:06:05 crc kubenswrapper[4715]: I1009 08:06:05.964496 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/07d5a57f-6ce3-4572-849c-baebf00831f1-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-gf67l\" (UID: \"07d5a57f-6ce3-4572-849c-baebf00831f1\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gf67l" Oct 09 08:06:05 crc kubenswrapper[4715]: I1009 08:06:05.964618 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07d5a57f-6ce3-4572-849c-baebf00831f1-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-gf67l\" (UID: \"07d5a57f-6ce3-4572-849c-baebf00831f1\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gf67l" Oct 09 08:06:05 crc kubenswrapper[4715]: I1009 08:06:05.964673 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07d5a57f-6ce3-4572-849c-baebf00831f1-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-gf67l\" (UID: \"07d5a57f-6ce3-4572-849c-baebf00831f1\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gf67l" Oct 09 08:06:05 crc kubenswrapper[4715]: I1009 08:06:05.965108 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07d5a57f-6ce3-4572-849c-baebf00831f1-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-gf67l\" (UID: \"07d5a57f-6ce3-4572-849c-baebf00831f1\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gf67l" Oct 09 08:06:05 crc kubenswrapper[4715]: I1009 08:06:05.965265 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07d5a57f-6ce3-4572-849c-baebf00831f1-config\") pod \"dnsmasq-dns-59cf4bdb65-gf67l\" (UID: \"07d5a57f-6ce3-4572-849c-baebf00831f1\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gf67l" Oct 09 08:06:05 crc kubenswrapper[4715]: I1009 08:06:05.989062 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trb5t\" (UniqueName: \"kubernetes.io/projected/07d5a57f-6ce3-4572-849c-baebf00831f1-kube-api-access-trb5t\") pod 
\"dnsmasq-dns-59cf4bdb65-gf67l\" (UID: \"07d5a57f-6ce3-4572-849c-baebf00831f1\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gf67l" Oct 09 08:06:06 crc kubenswrapper[4715]: I1009 08:06:06.066482 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-gf67l" Oct 09 08:06:06 crc kubenswrapper[4715]: I1009 08:06:06.148160 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9480540d-4b2a-40ea-b63b-e695c8e0a1b5" path="/var/lib/kubelet/pods/9480540d-4b2a-40ea-b63b-e695c8e0a1b5/volumes" Oct 09 08:06:06 crc kubenswrapper[4715]: I1009 08:06:06.534042 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-gf67l"] Oct 09 08:06:06 crc kubenswrapper[4715]: I1009 08:06:06.545255 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"84ca6d97-8374-4d3e-a5e7-e475bd7f89ce","Type":"ContainerStarted","Data":"dae80907a3ecf988dcdb4b4bedbca462c53ff7cb9560c567a45cf74e19c90fd5"} Oct 09 08:06:06 crc kubenswrapper[4715]: I1009 08:06:06.547477 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-gf67l" event={"ID":"07d5a57f-6ce3-4572-849c-baebf00831f1","Type":"ContainerStarted","Data":"a11ff4ed1a429d435aafe8d26f57fdf83a212f402bcc7adebbb6f3df5bf53a32"} Oct 09 08:06:06 crc kubenswrapper[4715]: I1009 08:06:06.567592 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.5675684260000002 podStartE2EDuration="2.567568426s" podCreationTimestamp="2025-10-09 08:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:06:06.563369495 +0000 UTC m=+1197.256173503" watchObservedRunningTime="2025-10-09 08:06:06.567568426 +0000 UTC m=+1197.260372444" Oct 09 08:06:07 crc kubenswrapper[4715]: I1009 08:06:07.565775 4715 
generic.go:334] "Generic (PLEG): container finished" podID="07d5a57f-6ce3-4572-849c-baebf00831f1" containerID="8969294b93cd32dc814e9f8666314f7fad7090821c8773994867ebe225fb5b82" exitCode=0 Oct 09 08:06:07 crc kubenswrapper[4715]: I1009 08:06:07.567506 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-gf67l" event={"ID":"07d5a57f-6ce3-4572-849c-baebf00831f1","Type":"ContainerDied","Data":"8969294b93cd32dc814e9f8666314f7fad7090821c8773994867ebe225fb5b82"} Oct 09 08:06:08 crc kubenswrapper[4715]: I1009 08:06:08.521780 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:06:08 crc kubenswrapper[4715]: I1009 08:06:08.522336 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1" containerName="ceilometer-central-agent" containerID="cri-o://dcfd223b439e80daed64930d14ff70ccaa8b0a9dbac78d5dab97e3dc394bf8e1" gracePeriod=30 Oct 09 08:06:08 crc kubenswrapper[4715]: I1009 08:06:08.522820 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1" containerName="proxy-httpd" containerID="cri-o://1f99ac7c3f88f07ed9a777389b9b45290ecbf68f71b4d64f954ff070dd7c2cdf" gracePeriod=30 Oct 09 08:06:08 crc kubenswrapper[4715]: I1009 08:06:08.522876 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1" containerName="sg-core" containerID="cri-o://eef61d87bc30379006bffd0d1569889ebdfc58f6b917c1ad128a06b0d3c50fe7" gracePeriod=30 Oct 09 08:06:08 crc kubenswrapper[4715]: I1009 08:06:08.522912 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1" containerName="ceilometer-notification-agent" 
containerID="cri-o://9a47931665bda3f43bb419917d365dcf21285aa403475254f1f61d5d57a6c783" gracePeriod=30 Oct 09 08:06:08 crc kubenswrapper[4715]: I1009 08:06:08.578347 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-gf67l" event={"ID":"07d5a57f-6ce3-4572-849c-baebf00831f1","Type":"ContainerStarted","Data":"84a8c94e7032c9daa84e7b4f56bed17e716c809e6e95ce1c341e0de3785cfad5"} Oct 09 08:06:08 crc kubenswrapper[4715]: I1009 08:06:08.578974 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59cf4bdb65-gf67l" Oct 09 08:06:08 crc kubenswrapper[4715]: I1009 08:06:08.621905 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59cf4bdb65-gf67l" podStartSLOduration=3.621883472 podStartE2EDuration="3.621883472s" podCreationTimestamp="2025-10-09 08:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:06:08.608122065 +0000 UTC m=+1199.300926083" watchObservedRunningTime="2025-10-09 08:06:08.621883472 +0000 UTC m=+1199.314687480" Oct 09 08:06:08 crc kubenswrapper[4715]: I1009 08:06:08.871340 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 09 08:06:08 crc kubenswrapper[4715]: I1009 08:06:08.871550 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="05e9f66e-c78b-4506-b4d1-93c82402efdc" containerName="nova-api-log" containerID="cri-o://f119d75b141b5624fec2caa5dc8783e4b9ada11d759df8c696737c6a5e27fce0" gracePeriod=30 Oct 09 08:06:08 crc kubenswrapper[4715]: I1009 08:06:08.871968 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="05e9f66e-c78b-4506-b4d1-93c82402efdc" containerName="nova-api-api" containerID="cri-o://7557437e362bec4f79a115a069822c356291ca9963ddbab97f253d8440e168dc" gracePeriod=30 Oct 
09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.528983 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.594496 4715 generic.go:334] "Generic (PLEG): container finished" podID="7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1" containerID="1f99ac7c3f88f07ed9a777389b9b45290ecbf68f71b4d64f954ff070dd7c2cdf" exitCode=0 Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.594536 4715 generic.go:334] "Generic (PLEG): container finished" podID="7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1" containerID="eef61d87bc30379006bffd0d1569889ebdfc58f6b917c1ad128a06b0d3c50fe7" exitCode=2 Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.594546 4715 generic.go:334] "Generic (PLEG): container finished" podID="7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1" containerID="9a47931665bda3f43bb419917d365dcf21285aa403475254f1f61d5d57a6c783" exitCode=0 Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.594557 4715 generic.go:334] "Generic (PLEG): container finished" podID="7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1" containerID="dcfd223b439e80daed64930d14ff70ccaa8b0a9dbac78d5dab97e3dc394bf8e1" exitCode=0 Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.594545 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1","Type":"ContainerDied","Data":"1f99ac7c3f88f07ed9a777389b9b45290ecbf68f71b4d64f954ff070dd7c2cdf"} Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.594588 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.594620 4715 scope.go:117] "RemoveContainer" containerID="1f99ac7c3f88f07ed9a777389b9b45290ecbf68f71b4d64f954ff070dd7c2cdf" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.594607 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1","Type":"ContainerDied","Data":"eef61d87bc30379006bffd0d1569889ebdfc58f6b917c1ad128a06b0d3c50fe7"} Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.594728 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1","Type":"ContainerDied","Data":"9a47931665bda3f43bb419917d365dcf21285aa403475254f1f61d5d57a6c783"} Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.594749 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1","Type":"ContainerDied","Data":"dcfd223b439e80daed64930d14ff70ccaa8b0a9dbac78d5dab97e3dc394bf8e1"} Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.594763 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1","Type":"ContainerDied","Data":"662a2f87a92cff032a2c1d4cd21a69eda1b0b05d6421a8d347e60547cdd11b63"} Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.599091 4715 generic.go:334] "Generic (PLEG): container finished" podID="05e9f66e-c78b-4506-b4d1-93c82402efdc" containerID="f119d75b141b5624fec2caa5dc8783e4b9ada11d759df8c696737c6a5e27fce0" exitCode=143 Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.599132 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"05e9f66e-c78b-4506-b4d1-93c82402efdc","Type":"ContainerDied","Data":"f119d75b141b5624fec2caa5dc8783e4b9ada11d759df8c696737c6a5e27fce0"} Oct 09 08:06:09 crc 
kubenswrapper[4715]: I1009 08:06:09.614645 4715 scope.go:117] "RemoveContainer" containerID="eef61d87bc30379006bffd0d1569889ebdfc58f6b917c1ad128a06b0d3c50fe7" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.634029 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-scripts\") pod \"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1\" (UID: \"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1\") " Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.634130 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-run-httpd\") pod \"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1\" (UID: \"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1\") " Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.634158 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-ceilometer-tls-certs\") pod \"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1\" (UID: \"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1\") " Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.634199 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-combined-ca-bundle\") pod \"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1\" (UID: \"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1\") " Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.634243 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-sg-core-conf-yaml\") pod \"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1\" (UID: \"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1\") " Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.634300 4715 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-config-data\") pod \"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1\" (UID: \"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1\") " Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.634402 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvhfn\" (UniqueName: \"kubernetes.io/projected/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-kube-api-access-pvhfn\") pod \"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1\" (UID: \"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1\") " Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.634580 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-log-httpd\") pod \"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1\" (UID: \"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1\") " Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.636229 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1" (UID: "7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.637075 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1" (UID: "7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.656533 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-scripts" (OuterVolumeSpecName: "scripts") pod "7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1" (UID: "7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.656747 4715 scope.go:117] "RemoveContainer" containerID="9a47931665bda3f43bb419917d365dcf21285aa403475254f1f61d5d57a6c783" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.663526 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-kube-api-access-pvhfn" (OuterVolumeSpecName: "kube-api-access-pvhfn") pod "7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1" (UID: "7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1"). InnerVolumeSpecName "kube-api-access-pvhfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.694097 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1" (UID: "7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.704877 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1" (UID: "7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.740875 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvhfn\" (UniqueName: \"kubernetes.io/projected/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-kube-api-access-pvhfn\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.740908 4715 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.740921 4715 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.740932 4715 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.740944 4715 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.740956 4715 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.756576 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1" (UID: 
"7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.791357 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-config-data" (OuterVolumeSpecName: "config-data") pod "7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1" (UID: "7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.843011 4715 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.843058 4715 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.872710 4715 scope.go:117] "RemoveContainer" containerID="dcfd223b439e80daed64930d14ff70ccaa8b0a9dbac78d5dab97e3dc394bf8e1" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.894152 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.896814 4715 scope.go:117] "RemoveContainer" containerID="1f99ac7c3f88f07ed9a777389b9b45290ecbf68f71b4d64f954ff070dd7c2cdf" Oct 09 08:06:09 crc kubenswrapper[4715]: E1009 08:06:09.897344 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f99ac7c3f88f07ed9a777389b9b45290ecbf68f71b4d64f954ff070dd7c2cdf\": container with ID starting with 1f99ac7c3f88f07ed9a777389b9b45290ecbf68f71b4d64f954ff070dd7c2cdf not found: ID does 
not exist" containerID="1f99ac7c3f88f07ed9a777389b9b45290ecbf68f71b4d64f954ff070dd7c2cdf" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.897450 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f99ac7c3f88f07ed9a777389b9b45290ecbf68f71b4d64f954ff070dd7c2cdf"} err="failed to get container status \"1f99ac7c3f88f07ed9a777389b9b45290ecbf68f71b4d64f954ff070dd7c2cdf\": rpc error: code = NotFound desc = could not find container \"1f99ac7c3f88f07ed9a777389b9b45290ecbf68f71b4d64f954ff070dd7c2cdf\": container with ID starting with 1f99ac7c3f88f07ed9a777389b9b45290ecbf68f71b4d64f954ff070dd7c2cdf not found: ID does not exist" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.897542 4715 scope.go:117] "RemoveContainer" containerID="eef61d87bc30379006bffd0d1569889ebdfc58f6b917c1ad128a06b0d3c50fe7" Oct 09 08:06:09 crc kubenswrapper[4715]: E1009 08:06:09.898115 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eef61d87bc30379006bffd0d1569889ebdfc58f6b917c1ad128a06b0d3c50fe7\": container with ID starting with eef61d87bc30379006bffd0d1569889ebdfc58f6b917c1ad128a06b0d3c50fe7 not found: ID does not exist" containerID="eef61d87bc30379006bffd0d1569889ebdfc58f6b917c1ad128a06b0d3c50fe7" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.898163 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eef61d87bc30379006bffd0d1569889ebdfc58f6b917c1ad128a06b0d3c50fe7"} err="failed to get container status \"eef61d87bc30379006bffd0d1569889ebdfc58f6b917c1ad128a06b0d3c50fe7\": rpc error: code = NotFound desc = could not find container \"eef61d87bc30379006bffd0d1569889ebdfc58f6b917c1ad128a06b0d3c50fe7\": container with ID starting with eef61d87bc30379006bffd0d1569889ebdfc58f6b917c1ad128a06b0d3c50fe7 not found: ID does not exist" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.898193 4715 
scope.go:117] "RemoveContainer" containerID="9a47931665bda3f43bb419917d365dcf21285aa403475254f1f61d5d57a6c783" Oct 09 08:06:09 crc kubenswrapper[4715]: E1009 08:06:09.898615 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a47931665bda3f43bb419917d365dcf21285aa403475254f1f61d5d57a6c783\": container with ID starting with 9a47931665bda3f43bb419917d365dcf21285aa403475254f1f61d5d57a6c783 not found: ID does not exist" containerID="9a47931665bda3f43bb419917d365dcf21285aa403475254f1f61d5d57a6c783" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.898693 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a47931665bda3f43bb419917d365dcf21285aa403475254f1f61d5d57a6c783"} err="failed to get container status \"9a47931665bda3f43bb419917d365dcf21285aa403475254f1f61d5d57a6c783\": rpc error: code = NotFound desc = could not find container \"9a47931665bda3f43bb419917d365dcf21285aa403475254f1f61d5d57a6c783\": container with ID starting with 9a47931665bda3f43bb419917d365dcf21285aa403475254f1f61d5d57a6c783 not found: ID does not exist" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.898776 4715 scope.go:117] "RemoveContainer" containerID="dcfd223b439e80daed64930d14ff70ccaa8b0a9dbac78d5dab97e3dc394bf8e1" Oct 09 08:06:09 crc kubenswrapper[4715]: E1009 08:06:09.899201 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcfd223b439e80daed64930d14ff70ccaa8b0a9dbac78d5dab97e3dc394bf8e1\": container with ID starting with dcfd223b439e80daed64930d14ff70ccaa8b0a9dbac78d5dab97e3dc394bf8e1 not found: ID does not exist" containerID="dcfd223b439e80daed64930d14ff70ccaa8b0a9dbac78d5dab97e3dc394bf8e1" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.899291 4715 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"dcfd223b439e80daed64930d14ff70ccaa8b0a9dbac78d5dab97e3dc394bf8e1"} err="failed to get container status \"dcfd223b439e80daed64930d14ff70ccaa8b0a9dbac78d5dab97e3dc394bf8e1\": rpc error: code = NotFound desc = could not find container \"dcfd223b439e80daed64930d14ff70ccaa8b0a9dbac78d5dab97e3dc394bf8e1\": container with ID starting with dcfd223b439e80daed64930d14ff70ccaa8b0a9dbac78d5dab97e3dc394bf8e1 not found: ID does not exist" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.899353 4715 scope.go:117] "RemoveContainer" containerID="1f99ac7c3f88f07ed9a777389b9b45290ecbf68f71b4d64f954ff070dd7c2cdf" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.899743 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f99ac7c3f88f07ed9a777389b9b45290ecbf68f71b4d64f954ff070dd7c2cdf"} err="failed to get container status \"1f99ac7c3f88f07ed9a777389b9b45290ecbf68f71b4d64f954ff070dd7c2cdf\": rpc error: code = NotFound desc = could not find container \"1f99ac7c3f88f07ed9a777389b9b45290ecbf68f71b4d64f954ff070dd7c2cdf\": container with ID starting with 1f99ac7c3f88f07ed9a777389b9b45290ecbf68f71b4d64f954ff070dd7c2cdf not found: ID does not exist" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.899841 4715 scope.go:117] "RemoveContainer" containerID="eef61d87bc30379006bffd0d1569889ebdfc58f6b917c1ad128a06b0d3c50fe7" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.900295 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eef61d87bc30379006bffd0d1569889ebdfc58f6b917c1ad128a06b0d3c50fe7"} err="failed to get container status \"eef61d87bc30379006bffd0d1569889ebdfc58f6b917c1ad128a06b0d3c50fe7\": rpc error: code = NotFound desc = could not find container \"eef61d87bc30379006bffd0d1569889ebdfc58f6b917c1ad128a06b0d3c50fe7\": container with ID starting with eef61d87bc30379006bffd0d1569889ebdfc58f6b917c1ad128a06b0d3c50fe7 not found: ID does not 
exist" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.900319 4715 scope.go:117] "RemoveContainer" containerID="9a47931665bda3f43bb419917d365dcf21285aa403475254f1f61d5d57a6c783" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.900565 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a47931665bda3f43bb419917d365dcf21285aa403475254f1f61d5d57a6c783"} err="failed to get container status \"9a47931665bda3f43bb419917d365dcf21285aa403475254f1f61d5d57a6c783\": rpc error: code = NotFound desc = could not find container \"9a47931665bda3f43bb419917d365dcf21285aa403475254f1f61d5d57a6c783\": container with ID starting with 9a47931665bda3f43bb419917d365dcf21285aa403475254f1f61d5d57a6c783 not found: ID does not exist" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.900641 4715 scope.go:117] "RemoveContainer" containerID="dcfd223b439e80daed64930d14ff70ccaa8b0a9dbac78d5dab97e3dc394bf8e1" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.901021 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcfd223b439e80daed64930d14ff70ccaa8b0a9dbac78d5dab97e3dc394bf8e1"} err="failed to get container status \"dcfd223b439e80daed64930d14ff70ccaa8b0a9dbac78d5dab97e3dc394bf8e1\": rpc error: code = NotFound desc = could not find container \"dcfd223b439e80daed64930d14ff70ccaa8b0a9dbac78d5dab97e3dc394bf8e1\": container with ID starting with dcfd223b439e80daed64930d14ff70ccaa8b0a9dbac78d5dab97e3dc394bf8e1 not found: ID does not exist" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.901043 4715 scope.go:117] "RemoveContainer" containerID="1f99ac7c3f88f07ed9a777389b9b45290ecbf68f71b4d64f954ff070dd7c2cdf" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.901378 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f99ac7c3f88f07ed9a777389b9b45290ecbf68f71b4d64f954ff070dd7c2cdf"} err="failed to get container status 
\"1f99ac7c3f88f07ed9a777389b9b45290ecbf68f71b4d64f954ff070dd7c2cdf\": rpc error: code = NotFound desc = could not find container \"1f99ac7c3f88f07ed9a777389b9b45290ecbf68f71b4d64f954ff070dd7c2cdf\": container with ID starting with 1f99ac7c3f88f07ed9a777389b9b45290ecbf68f71b4d64f954ff070dd7c2cdf not found: ID does not exist" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.901480 4715 scope.go:117] "RemoveContainer" containerID="eef61d87bc30379006bffd0d1569889ebdfc58f6b917c1ad128a06b0d3c50fe7" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.901811 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eef61d87bc30379006bffd0d1569889ebdfc58f6b917c1ad128a06b0d3c50fe7"} err="failed to get container status \"eef61d87bc30379006bffd0d1569889ebdfc58f6b917c1ad128a06b0d3c50fe7\": rpc error: code = NotFound desc = could not find container \"eef61d87bc30379006bffd0d1569889ebdfc58f6b917c1ad128a06b0d3c50fe7\": container with ID starting with eef61d87bc30379006bffd0d1569889ebdfc58f6b917c1ad128a06b0d3c50fe7 not found: ID does not exist" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.901844 4715 scope.go:117] "RemoveContainer" containerID="9a47931665bda3f43bb419917d365dcf21285aa403475254f1f61d5d57a6c783" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.902105 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a47931665bda3f43bb419917d365dcf21285aa403475254f1f61d5d57a6c783"} err="failed to get container status \"9a47931665bda3f43bb419917d365dcf21285aa403475254f1f61d5d57a6c783\": rpc error: code = NotFound desc = could not find container \"9a47931665bda3f43bb419917d365dcf21285aa403475254f1f61d5d57a6c783\": container with ID starting with 9a47931665bda3f43bb419917d365dcf21285aa403475254f1f61d5d57a6c783 not found: ID does not exist" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.902186 4715 scope.go:117] "RemoveContainer" 
containerID="dcfd223b439e80daed64930d14ff70ccaa8b0a9dbac78d5dab97e3dc394bf8e1" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.902576 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcfd223b439e80daed64930d14ff70ccaa8b0a9dbac78d5dab97e3dc394bf8e1"} err="failed to get container status \"dcfd223b439e80daed64930d14ff70ccaa8b0a9dbac78d5dab97e3dc394bf8e1\": rpc error: code = NotFound desc = could not find container \"dcfd223b439e80daed64930d14ff70ccaa8b0a9dbac78d5dab97e3dc394bf8e1\": container with ID starting with dcfd223b439e80daed64930d14ff70ccaa8b0a9dbac78d5dab97e3dc394bf8e1 not found: ID does not exist" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.902656 4715 scope.go:117] "RemoveContainer" containerID="1f99ac7c3f88f07ed9a777389b9b45290ecbf68f71b4d64f954ff070dd7c2cdf" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.902990 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f99ac7c3f88f07ed9a777389b9b45290ecbf68f71b4d64f954ff070dd7c2cdf"} err="failed to get container status \"1f99ac7c3f88f07ed9a777389b9b45290ecbf68f71b4d64f954ff070dd7c2cdf\": rpc error: code = NotFound desc = could not find container \"1f99ac7c3f88f07ed9a777389b9b45290ecbf68f71b4d64f954ff070dd7c2cdf\": container with ID starting with 1f99ac7c3f88f07ed9a777389b9b45290ecbf68f71b4d64f954ff070dd7c2cdf not found: ID does not exist" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.903073 4715 scope.go:117] "RemoveContainer" containerID="eef61d87bc30379006bffd0d1569889ebdfc58f6b917c1ad128a06b0d3c50fe7" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.903369 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eef61d87bc30379006bffd0d1569889ebdfc58f6b917c1ad128a06b0d3c50fe7"} err="failed to get container status \"eef61d87bc30379006bffd0d1569889ebdfc58f6b917c1ad128a06b0d3c50fe7\": rpc error: code = NotFound desc = could 
not find container \"eef61d87bc30379006bffd0d1569889ebdfc58f6b917c1ad128a06b0d3c50fe7\": container with ID starting with eef61d87bc30379006bffd0d1569889ebdfc58f6b917c1ad128a06b0d3c50fe7 not found: ID does not exist" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.903404 4715 scope.go:117] "RemoveContainer" containerID="9a47931665bda3f43bb419917d365dcf21285aa403475254f1f61d5d57a6c783" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.903781 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a47931665bda3f43bb419917d365dcf21285aa403475254f1f61d5d57a6c783"} err="failed to get container status \"9a47931665bda3f43bb419917d365dcf21285aa403475254f1f61d5d57a6c783\": rpc error: code = NotFound desc = could not find container \"9a47931665bda3f43bb419917d365dcf21285aa403475254f1f61d5d57a6c783\": container with ID starting with 9a47931665bda3f43bb419917d365dcf21285aa403475254f1f61d5d57a6c783 not found: ID does not exist" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.903885 4715 scope.go:117] "RemoveContainer" containerID="dcfd223b439e80daed64930d14ff70ccaa8b0a9dbac78d5dab97e3dc394bf8e1" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.904154 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcfd223b439e80daed64930d14ff70ccaa8b0a9dbac78d5dab97e3dc394bf8e1"} err="failed to get container status \"dcfd223b439e80daed64930d14ff70ccaa8b0a9dbac78d5dab97e3dc394bf8e1\": rpc error: code = NotFound desc = could not find container \"dcfd223b439e80daed64930d14ff70ccaa8b0a9dbac78d5dab97e3dc394bf8e1\": container with ID starting with dcfd223b439e80daed64930d14ff70ccaa8b0a9dbac78d5dab97e3dc394bf8e1 not found: ID does not exist" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.929135 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.937559 4715 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.952805 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:06:09 crc kubenswrapper[4715]: E1009 08:06:09.953188 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1" containerName="ceilometer-central-agent" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.953207 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1" containerName="ceilometer-central-agent" Oct 09 08:06:09 crc kubenswrapper[4715]: E1009 08:06:09.953234 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1" containerName="sg-core" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.953244 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1" containerName="sg-core" Oct 09 08:06:09 crc kubenswrapper[4715]: E1009 08:06:09.953261 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1" containerName="proxy-httpd" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.953267 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1" containerName="proxy-httpd" Oct 09 08:06:09 crc kubenswrapper[4715]: E1009 08:06:09.953286 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1" containerName="ceilometer-notification-agent" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.953293 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1" containerName="ceilometer-notification-agent" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.953464 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1" 
containerName="ceilometer-notification-agent" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.953492 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1" containerName="proxy-httpd" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.953502 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1" containerName="sg-core" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.953512 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1" containerName="ceilometer-central-agent" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.961177 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.990382 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.994018 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 09 08:06:09 crc kubenswrapper[4715]: I1009 08:06:09.994386 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 09 08:06:10 crc kubenswrapper[4715]: I1009 08:06:10.022714 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:06:10 crc kubenswrapper[4715]: I1009 08:06:10.046708 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eee7d70-200a-41e8-a3c9-b261a7fec4da-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7eee7d70-200a-41e8-a3c9-b261a7fec4da\") " pod="openstack/ceilometer-0" Oct 09 08:06:10 crc kubenswrapper[4715]: I1009 08:06:10.047678 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7eee7d70-200a-41e8-a3c9-b261a7fec4da-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7eee7d70-200a-41e8-a3c9-b261a7fec4da\") " pod="openstack/ceilometer-0" Oct 09 08:06:10 crc kubenswrapper[4715]: I1009 08:06:10.047863 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7eee7d70-200a-41e8-a3c9-b261a7fec4da-run-httpd\") pod \"ceilometer-0\" (UID: \"7eee7d70-200a-41e8-a3c9-b261a7fec4da\") " pod="openstack/ceilometer-0" Oct 09 08:06:10 crc kubenswrapper[4715]: I1009 08:06:10.047947 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7eee7d70-200a-41e8-a3c9-b261a7fec4da-scripts\") pod \"ceilometer-0\" (UID: \"7eee7d70-200a-41e8-a3c9-b261a7fec4da\") " pod="openstack/ceilometer-0" Oct 09 08:06:10 crc kubenswrapper[4715]: I1009 08:06:10.047975 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eee7d70-200a-41e8-a3c9-b261a7fec4da-config-data\") pod \"ceilometer-0\" (UID: \"7eee7d70-200a-41e8-a3c9-b261a7fec4da\") " pod="openstack/ceilometer-0" Oct 09 08:06:10 crc kubenswrapper[4715]: I1009 08:06:10.048030 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cm88\" (UniqueName: \"kubernetes.io/projected/7eee7d70-200a-41e8-a3c9-b261a7fec4da-kube-api-access-5cm88\") pod \"ceilometer-0\" (UID: \"7eee7d70-200a-41e8-a3c9-b261a7fec4da\") " pod="openstack/ceilometer-0" Oct 09 08:06:10 crc kubenswrapper[4715]: I1009 08:06:10.048155 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7eee7d70-200a-41e8-a3c9-b261a7fec4da-log-httpd\") pod 
\"ceilometer-0\" (UID: \"7eee7d70-200a-41e8-a3c9-b261a7fec4da\") " pod="openstack/ceilometer-0" Oct 09 08:06:10 crc kubenswrapper[4715]: I1009 08:06:10.048197 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7eee7d70-200a-41e8-a3c9-b261a7fec4da-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7eee7d70-200a-41e8-a3c9-b261a7fec4da\") " pod="openstack/ceilometer-0" Oct 09 08:06:10 crc kubenswrapper[4715]: I1009 08:06:10.150960 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eee7d70-200a-41e8-a3c9-b261a7fec4da-config-data\") pod \"ceilometer-0\" (UID: \"7eee7d70-200a-41e8-a3c9-b261a7fec4da\") " pod="openstack/ceilometer-0" Oct 09 08:06:10 crc kubenswrapper[4715]: I1009 08:06:10.151128 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cm88\" (UniqueName: \"kubernetes.io/projected/7eee7d70-200a-41e8-a3c9-b261a7fec4da-kube-api-access-5cm88\") pod \"ceilometer-0\" (UID: \"7eee7d70-200a-41e8-a3c9-b261a7fec4da\") " pod="openstack/ceilometer-0" Oct 09 08:06:10 crc kubenswrapper[4715]: I1009 08:06:10.151222 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7eee7d70-200a-41e8-a3c9-b261a7fec4da-log-httpd\") pod \"ceilometer-0\" (UID: \"7eee7d70-200a-41e8-a3c9-b261a7fec4da\") " pod="openstack/ceilometer-0" Oct 09 08:06:10 crc kubenswrapper[4715]: I1009 08:06:10.151281 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7eee7d70-200a-41e8-a3c9-b261a7fec4da-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7eee7d70-200a-41e8-a3c9-b261a7fec4da\") " pod="openstack/ceilometer-0" Oct 09 08:06:10 crc kubenswrapper[4715]: I1009 08:06:10.151414 4715 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eee7d70-200a-41e8-a3c9-b261a7fec4da-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7eee7d70-200a-41e8-a3c9-b261a7fec4da\") " pod="openstack/ceilometer-0" Oct 09 08:06:10 crc kubenswrapper[4715]: I1009 08:06:10.151552 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7eee7d70-200a-41e8-a3c9-b261a7fec4da-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7eee7d70-200a-41e8-a3c9-b261a7fec4da\") " pod="openstack/ceilometer-0" Oct 09 08:06:10 crc kubenswrapper[4715]: I1009 08:06:10.151638 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7eee7d70-200a-41e8-a3c9-b261a7fec4da-run-httpd\") pod \"ceilometer-0\" (UID: \"7eee7d70-200a-41e8-a3c9-b261a7fec4da\") " pod="openstack/ceilometer-0" Oct 09 08:06:10 crc kubenswrapper[4715]: I1009 08:06:10.151691 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7eee7d70-200a-41e8-a3c9-b261a7fec4da-scripts\") pod \"ceilometer-0\" (UID: \"7eee7d70-200a-41e8-a3c9-b261a7fec4da\") " pod="openstack/ceilometer-0" Oct 09 08:06:10 crc kubenswrapper[4715]: I1009 08:06:10.151921 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7eee7d70-200a-41e8-a3c9-b261a7fec4da-log-httpd\") pod \"ceilometer-0\" (UID: \"7eee7d70-200a-41e8-a3c9-b261a7fec4da\") " pod="openstack/ceilometer-0" Oct 09 08:06:10 crc kubenswrapper[4715]: I1009 08:06:10.152195 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7eee7d70-200a-41e8-a3c9-b261a7fec4da-run-httpd\") pod \"ceilometer-0\" (UID: \"7eee7d70-200a-41e8-a3c9-b261a7fec4da\") " pod="openstack/ceilometer-0" 
Oct 09 08:06:10 crc kubenswrapper[4715]: I1009 08:06:10.153118 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1" path="/var/lib/kubelet/pods/7aa70e3b-f722-42fd-94b1-cc36e2d9b6d1/volumes" Oct 09 08:06:10 crc kubenswrapper[4715]: I1009 08:06:10.153469 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 09 08:06:10 crc kubenswrapper[4715]: I1009 08:06:10.154958 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 09 08:06:10 crc kubenswrapper[4715]: I1009 08:06:10.155015 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 09 08:06:10 crc kubenswrapper[4715]: I1009 08:06:10.164590 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7eee7d70-200a-41e8-a3c9-b261a7fec4da-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7eee7d70-200a-41e8-a3c9-b261a7fec4da\") " pod="openstack/ceilometer-0" Oct 09 08:06:10 crc kubenswrapper[4715]: I1009 08:06:10.164752 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eee7d70-200a-41e8-a3c9-b261a7fec4da-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7eee7d70-200a-41e8-a3c9-b261a7fec4da\") " pod="openstack/ceilometer-0" Oct 09 08:06:10 crc kubenswrapper[4715]: I1009 08:06:10.166901 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eee7d70-200a-41e8-a3c9-b261a7fec4da-config-data\") pod \"ceilometer-0\" (UID: \"7eee7d70-200a-41e8-a3c9-b261a7fec4da\") " pod="openstack/ceilometer-0" Oct 09 08:06:10 crc kubenswrapper[4715]: I1009 08:06:10.168416 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7eee7d70-200a-41e8-a3c9-b261a7fec4da-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7eee7d70-200a-41e8-a3c9-b261a7fec4da\") " pod="openstack/ceilometer-0" Oct 09 08:06:10 crc kubenswrapper[4715]: I1009 08:06:10.169012 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7eee7d70-200a-41e8-a3c9-b261a7fec4da-scripts\") pod \"ceilometer-0\" (UID: \"7eee7d70-200a-41e8-a3c9-b261a7fec4da\") " pod="openstack/ceilometer-0" Oct 09 08:06:10 crc kubenswrapper[4715]: I1009 08:06:10.175162 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cm88\" (UniqueName: \"kubernetes.io/projected/7eee7d70-200a-41e8-a3c9-b261a7fec4da-kube-api-access-5cm88\") pod \"ceilometer-0\" (UID: \"7eee7d70-200a-41e8-a3c9-b261a7fec4da\") " pod="openstack/ceilometer-0" Oct 09 08:06:10 crc kubenswrapper[4715]: I1009 08:06:10.311990 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 08:06:10 crc kubenswrapper[4715]: I1009 08:06:10.409374 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:06:10 crc kubenswrapper[4715]: I1009 08:06:10.793035 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:06:11 crc kubenswrapper[4715]: I1009 08:06:11.624700 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7eee7d70-200a-41e8-a3c9-b261a7fec4da","Type":"ContainerStarted","Data":"8561e97f249167c76e343c82cce1ec9eb2265975a7d8ae5e2ef3470cb463f350"} Oct 09 08:06:11 crc kubenswrapper[4715]: I1009 08:06:11.625236 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7eee7d70-200a-41e8-a3c9-b261a7fec4da","Type":"ContainerStarted","Data":"713d02fa77dc7116287d2ec90cb528c98041b6ba94bc76514de91ffa6be2cc1e"} Oct 09 08:06:12 crc kubenswrapper[4715]: W1009 
08:06:12.080761 4715 helpers.go:245] readString: Failed to read "/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05e9f66e_c78b_4506_b4d1_93c82402efdc.slice/crio-0431e6f11bae145045463e3a59df81f14b8994e473b3d88d68b9ca691242cc6c/pids.max": read /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05e9f66e_c78b_4506_b4d1_93c82402efdc.slice/crio-0431e6f11bae145045463e3a59df81f14b8994e473b3d88d68b9ca691242cc6c/pids.max: no such device Oct 09 08:06:12 crc kubenswrapper[4715]: I1009 08:06:12.447999 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 09 08:06:12 crc kubenswrapper[4715]: I1009 08:06:12.599178 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05e9f66e-c78b-4506-b4d1-93c82402efdc-combined-ca-bundle\") pod \"05e9f66e-c78b-4506-b4d1-93c82402efdc\" (UID: \"05e9f66e-c78b-4506-b4d1-93c82402efdc\") " Oct 09 08:06:12 crc kubenswrapper[4715]: I1009 08:06:12.599679 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mksdf\" (UniqueName: \"kubernetes.io/projected/05e9f66e-c78b-4506-b4d1-93c82402efdc-kube-api-access-mksdf\") pod \"05e9f66e-c78b-4506-b4d1-93c82402efdc\" (UID: \"05e9f66e-c78b-4506-b4d1-93c82402efdc\") " Oct 09 08:06:12 crc kubenswrapper[4715]: I1009 08:06:12.599728 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05e9f66e-c78b-4506-b4d1-93c82402efdc-logs\") pod \"05e9f66e-c78b-4506-b4d1-93c82402efdc\" (UID: \"05e9f66e-c78b-4506-b4d1-93c82402efdc\") " Oct 09 08:06:12 crc kubenswrapper[4715]: I1009 08:06:12.599745 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05e9f66e-c78b-4506-b4d1-93c82402efdc-config-data\") pod 
\"05e9f66e-c78b-4506-b4d1-93c82402efdc\" (UID: \"05e9f66e-c78b-4506-b4d1-93c82402efdc\") " Oct 09 08:06:12 crc kubenswrapper[4715]: I1009 08:06:12.601092 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05e9f66e-c78b-4506-b4d1-93c82402efdc-logs" (OuterVolumeSpecName: "logs") pod "05e9f66e-c78b-4506-b4d1-93c82402efdc" (UID: "05e9f66e-c78b-4506-b4d1-93c82402efdc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:06:12 crc kubenswrapper[4715]: I1009 08:06:12.605114 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05e9f66e-c78b-4506-b4d1-93c82402efdc-kube-api-access-mksdf" (OuterVolumeSpecName: "kube-api-access-mksdf") pod "05e9f66e-c78b-4506-b4d1-93c82402efdc" (UID: "05e9f66e-c78b-4506-b4d1-93c82402efdc"). InnerVolumeSpecName "kube-api-access-mksdf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:06:12 crc kubenswrapper[4715]: E1009 08:06:12.634750 4715 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05e9f66e-c78b-4506-b4d1-93c82402efdc-combined-ca-bundle podName:05e9f66e-c78b-4506-b4d1-93c82402efdc nodeName:}" failed. No retries permitted until 2025-10-09 08:06:13.134677537 +0000 UTC m=+1203.827481565 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/05e9f66e-c78b-4506-b4d1-93c82402efdc-combined-ca-bundle") pod "05e9f66e-c78b-4506-b4d1-93c82402efdc" (UID: "05e9f66e-c78b-4506-b4d1-93c82402efdc") : error deleting /var/lib/kubelet/pods/05e9f66e-c78b-4506-b4d1-93c82402efdc/volume-subpaths: remove /var/lib/kubelet/pods/05e9f66e-c78b-4506-b4d1-93c82402efdc/volume-subpaths: no such file or directory Oct 09 08:06:12 crc kubenswrapper[4715]: I1009 08:06:12.637548 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05e9f66e-c78b-4506-b4d1-93c82402efdc-config-data" (OuterVolumeSpecName: "config-data") pod "05e9f66e-c78b-4506-b4d1-93c82402efdc" (UID: "05e9f66e-c78b-4506-b4d1-93c82402efdc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:06:12 crc kubenswrapper[4715]: I1009 08:06:12.645199 4715 generic.go:334] "Generic (PLEG): container finished" podID="05e9f66e-c78b-4506-b4d1-93c82402efdc" containerID="7557437e362bec4f79a115a069822c356291ca9963ddbab97f253d8440e168dc" exitCode=0 Oct 09 08:06:12 crc kubenswrapper[4715]: I1009 08:06:12.645353 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 09 08:06:12 crc kubenswrapper[4715]: I1009 08:06:12.646107 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"05e9f66e-c78b-4506-b4d1-93c82402efdc","Type":"ContainerDied","Data":"7557437e362bec4f79a115a069822c356291ca9963ddbab97f253d8440e168dc"} Oct 09 08:06:12 crc kubenswrapper[4715]: I1009 08:06:12.646156 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"05e9f66e-c78b-4506-b4d1-93c82402efdc","Type":"ContainerDied","Data":"0431e6f11bae145045463e3a59df81f14b8994e473b3d88d68b9ca691242cc6c"} Oct 09 08:06:12 crc kubenswrapper[4715]: I1009 08:06:12.646177 4715 scope.go:117] "RemoveContainer" containerID="7557437e362bec4f79a115a069822c356291ca9963ddbab97f253d8440e168dc" Oct 09 08:06:12 crc kubenswrapper[4715]: I1009 08:06:12.650332 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7eee7d70-200a-41e8-a3c9-b261a7fec4da","Type":"ContainerStarted","Data":"68f9a7d4bc7d688521a23f5d3772036924f8ee5df7172771ce083d9d6124ae4d"} Oct 09 08:06:12 crc kubenswrapper[4715]: I1009 08:06:12.669441 4715 scope.go:117] "RemoveContainer" containerID="f119d75b141b5624fec2caa5dc8783e4b9ada11d759df8c696737c6a5e27fce0" Oct 09 08:06:12 crc kubenswrapper[4715]: I1009 08:06:12.692334 4715 scope.go:117] "RemoveContainer" containerID="7557437e362bec4f79a115a069822c356291ca9963ddbab97f253d8440e168dc" Oct 09 08:06:12 crc kubenswrapper[4715]: E1009 08:06:12.692842 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7557437e362bec4f79a115a069822c356291ca9963ddbab97f253d8440e168dc\": container with ID starting with 7557437e362bec4f79a115a069822c356291ca9963ddbab97f253d8440e168dc not found: ID does not exist" containerID="7557437e362bec4f79a115a069822c356291ca9963ddbab97f253d8440e168dc" Oct 09 08:06:12 crc kubenswrapper[4715]: I1009 
08:06:12.692925 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7557437e362bec4f79a115a069822c356291ca9963ddbab97f253d8440e168dc"} err="failed to get container status \"7557437e362bec4f79a115a069822c356291ca9963ddbab97f253d8440e168dc\": rpc error: code = NotFound desc = could not find container \"7557437e362bec4f79a115a069822c356291ca9963ddbab97f253d8440e168dc\": container with ID starting with 7557437e362bec4f79a115a069822c356291ca9963ddbab97f253d8440e168dc not found: ID does not exist" Oct 09 08:06:12 crc kubenswrapper[4715]: I1009 08:06:12.692951 4715 scope.go:117] "RemoveContainer" containerID="f119d75b141b5624fec2caa5dc8783e4b9ada11d759df8c696737c6a5e27fce0" Oct 09 08:06:12 crc kubenswrapper[4715]: E1009 08:06:12.693393 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f119d75b141b5624fec2caa5dc8783e4b9ada11d759df8c696737c6a5e27fce0\": container with ID starting with f119d75b141b5624fec2caa5dc8783e4b9ada11d759df8c696737c6a5e27fce0 not found: ID does not exist" containerID="f119d75b141b5624fec2caa5dc8783e4b9ada11d759df8c696737c6a5e27fce0" Oct 09 08:06:12 crc kubenswrapper[4715]: I1009 08:06:12.693432 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f119d75b141b5624fec2caa5dc8783e4b9ada11d759df8c696737c6a5e27fce0"} err="failed to get container status \"f119d75b141b5624fec2caa5dc8783e4b9ada11d759df8c696737c6a5e27fce0\": rpc error: code = NotFound desc = could not find container \"f119d75b141b5624fec2caa5dc8783e4b9ada11d759df8c696737c6a5e27fce0\": container with ID starting with f119d75b141b5624fec2caa5dc8783e4b9ada11d759df8c696737c6a5e27fce0 not found: ID does not exist" Oct 09 08:06:12 crc kubenswrapper[4715]: I1009 08:06:12.702833 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mksdf\" (UniqueName: 
\"kubernetes.io/projected/05e9f66e-c78b-4506-b4d1-93c82402efdc-kube-api-access-mksdf\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:12 crc kubenswrapper[4715]: I1009 08:06:12.703189 4715 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05e9f66e-c78b-4506-b4d1-93c82402efdc-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:12 crc kubenswrapper[4715]: I1009 08:06:12.703202 4715 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05e9f66e-c78b-4506-b4d1-93c82402efdc-logs\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:13 crc kubenswrapper[4715]: I1009 08:06:13.214477 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05e9f66e-c78b-4506-b4d1-93c82402efdc-combined-ca-bundle\") pod \"05e9f66e-c78b-4506-b4d1-93c82402efdc\" (UID: \"05e9f66e-c78b-4506-b4d1-93c82402efdc\") " Oct 09 08:06:13 crc kubenswrapper[4715]: I1009 08:06:13.223048 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05e9f66e-c78b-4506-b4d1-93c82402efdc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05e9f66e-c78b-4506-b4d1-93c82402efdc" (UID: "05e9f66e-c78b-4506-b4d1-93c82402efdc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:06:13 crc kubenswrapper[4715]: I1009 08:06:13.311345 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 09 08:06:13 crc kubenswrapper[4715]: I1009 08:06:13.317603 4715 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05e9f66e-c78b-4506-b4d1-93c82402efdc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:13 crc kubenswrapper[4715]: I1009 08:06:13.319952 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 09 08:06:13 crc kubenswrapper[4715]: I1009 08:06:13.334246 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 09 08:06:13 crc kubenswrapper[4715]: E1009 08:06:13.334851 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05e9f66e-c78b-4506-b4d1-93c82402efdc" containerName="nova-api-api" Oct 09 08:06:13 crc kubenswrapper[4715]: I1009 08:06:13.334874 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="05e9f66e-c78b-4506-b4d1-93c82402efdc" containerName="nova-api-api" Oct 09 08:06:13 crc kubenswrapper[4715]: E1009 08:06:13.334915 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05e9f66e-c78b-4506-b4d1-93c82402efdc" containerName="nova-api-log" Oct 09 08:06:13 crc kubenswrapper[4715]: I1009 08:06:13.334924 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="05e9f66e-c78b-4506-b4d1-93c82402efdc" containerName="nova-api-log" Oct 09 08:06:13 crc kubenswrapper[4715]: I1009 08:06:13.335159 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="05e9f66e-c78b-4506-b4d1-93c82402efdc" containerName="nova-api-log" Oct 09 08:06:13 crc kubenswrapper[4715]: I1009 08:06:13.335201 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="05e9f66e-c78b-4506-b4d1-93c82402efdc" containerName="nova-api-api" Oct 09 08:06:13 crc kubenswrapper[4715]: I1009 
08:06:13.336441 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 09 08:06:13 crc kubenswrapper[4715]: I1009 08:06:13.342304 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 09 08:06:13 crc kubenswrapper[4715]: I1009 08:06:13.342487 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 09 08:06:13 crc kubenswrapper[4715]: I1009 08:06:13.342606 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 09 08:06:13 crc kubenswrapper[4715]: I1009 08:06:13.376701 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 09 08:06:13 crc kubenswrapper[4715]: I1009 08:06:13.420706 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65efa78c-5b6e-4ced-8aa6-c08a1e530c0f-config-data\") pod \"nova-api-0\" (UID: \"65efa78c-5b6e-4ced-8aa6-c08a1e530c0f\") " pod="openstack/nova-api-0" Oct 09 08:06:13 crc kubenswrapper[4715]: I1009 08:06:13.420793 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65efa78c-5b6e-4ced-8aa6-c08a1e530c0f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"65efa78c-5b6e-4ced-8aa6-c08a1e530c0f\") " pod="openstack/nova-api-0" Oct 09 08:06:13 crc kubenswrapper[4715]: I1009 08:06:13.420835 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/65efa78c-5b6e-4ced-8aa6-c08a1e530c0f-public-tls-certs\") pod \"nova-api-0\" (UID: \"65efa78c-5b6e-4ced-8aa6-c08a1e530c0f\") " pod="openstack/nova-api-0" Oct 09 08:06:13 crc kubenswrapper[4715]: I1009 08:06:13.420904 4715 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65efa78c-5b6e-4ced-8aa6-c08a1e530c0f-logs\") pod \"nova-api-0\" (UID: \"65efa78c-5b6e-4ced-8aa6-c08a1e530c0f\") " pod="openstack/nova-api-0" Oct 09 08:06:13 crc kubenswrapper[4715]: I1009 08:06:13.420927 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fs4f\" (UniqueName: \"kubernetes.io/projected/65efa78c-5b6e-4ced-8aa6-c08a1e530c0f-kube-api-access-8fs4f\") pod \"nova-api-0\" (UID: \"65efa78c-5b6e-4ced-8aa6-c08a1e530c0f\") " pod="openstack/nova-api-0" Oct 09 08:06:13 crc kubenswrapper[4715]: I1009 08:06:13.421003 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65efa78c-5b6e-4ced-8aa6-c08a1e530c0f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"65efa78c-5b6e-4ced-8aa6-c08a1e530c0f\") " pod="openstack/nova-api-0" Oct 09 08:06:13 crc kubenswrapper[4715]: I1009 08:06:13.523220 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65efa78c-5b6e-4ced-8aa6-c08a1e530c0f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"65efa78c-5b6e-4ced-8aa6-c08a1e530c0f\") " pod="openstack/nova-api-0" Oct 09 08:06:13 crc kubenswrapper[4715]: I1009 08:06:13.523307 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65efa78c-5b6e-4ced-8aa6-c08a1e530c0f-config-data\") pod \"nova-api-0\" (UID: \"65efa78c-5b6e-4ced-8aa6-c08a1e530c0f\") " pod="openstack/nova-api-0" Oct 09 08:06:13 crc kubenswrapper[4715]: I1009 08:06:13.523360 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65efa78c-5b6e-4ced-8aa6-c08a1e530c0f-combined-ca-bundle\") pod 
\"nova-api-0\" (UID: \"65efa78c-5b6e-4ced-8aa6-c08a1e530c0f\") " pod="openstack/nova-api-0" Oct 09 08:06:13 crc kubenswrapper[4715]: I1009 08:06:13.523388 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/65efa78c-5b6e-4ced-8aa6-c08a1e530c0f-public-tls-certs\") pod \"nova-api-0\" (UID: \"65efa78c-5b6e-4ced-8aa6-c08a1e530c0f\") " pod="openstack/nova-api-0" Oct 09 08:06:13 crc kubenswrapper[4715]: I1009 08:06:13.523451 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65efa78c-5b6e-4ced-8aa6-c08a1e530c0f-logs\") pod \"nova-api-0\" (UID: \"65efa78c-5b6e-4ced-8aa6-c08a1e530c0f\") " pod="openstack/nova-api-0" Oct 09 08:06:13 crc kubenswrapper[4715]: I1009 08:06:13.523470 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fs4f\" (UniqueName: \"kubernetes.io/projected/65efa78c-5b6e-4ced-8aa6-c08a1e530c0f-kube-api-access-8fs4f\") pod \"nova-api-0\" (UID: \"65efa78c-5b6e-4ced-8aa6-c08a1e530c0f\") " pod="openstack/nova-api-0" Oct 09 08:06:13 crc kubenswrapper[4715]: I1009 08:06:13.524110 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65efa78c-5b6e-4ced-8aa6-c08a1e530c0f-logs\") pod \"nova-api-0\" (UID: \"65efa78c-5b6e-4ced-8aa6-c08a1e530c0f\") " pod="openstack/nova-api-0" Oct 09 08:06:13 crc kubenswrapper[4715]: I1009 08:06:13.527841 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/65efa78c-5b6e-4ced-8aa6-c08a1e530c0f-public-tls-certs\") pod \"nova-api-0\" (UID: \"65efa78c-5b6e-4ced-8aa6-c08a1e530c0f\") " pod="openstack/nova-api-0" Oct 09 08:06:13 crc kubenswrapper[4715]: I1009 08:06:13.527893 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/65efa78c-5b6e-4ced-8aa6-c08a1e530c0f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"65efa78c-5b6e-4ced-8aa6-c08a1e530c0f\") " pod="openstack/nova-api-0" Oct 09 08:06:13 crc kubenswrapper[4715]: I1009 08:06:13.528386 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65efa78c-5b6e-4ced-8aa6-c08a1e530c0f-config-data\") pod \"nova-api-0\" (UID: \"65efa78c-5b6e-4ced-8aa6-c08a1e530c0f\") " pod="openstack/nova-api-0" Oct 09 08:06:13 crc kubenswrapper[4715]: I1009 08:06:13.535948 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65efa78c-5b6e-4ced-8aa6-c08a1e530c0f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"65efa78c-5b6e-4ced-8aa6-c08a1e530c0f\") " pod="openstack/nova-api-0" Oct 09 08:06:13 crc kubenswrapper[4715]: I1009 08:06:13.541473 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fs4f\" (UniqueName: \"kubernetes.io/projected/65efa78c-5b6e-4ced-8aa6-c08a1e530c0f-kube-api-access-8fs4f\") pod \"nova-api-0\" (UID: \"65efa78c-5b6e-4ced-8aa6-c08a1e530c0f\") " pod="openstack/nova-api-0" Oct 09 08:06:13 crc kubenswrapper[4715]: I1009 08:06:13.661214 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7eee7d70-200a-41e8-a3c9-b261a7fec4da","Type":"ContainerStarted","Data":"c035dc6ccfc492914aaec215d7d41274ead7e19b9fec8d46ad5f5e64e376b24e"} Oct 09 08:06:13 crc kubenswrapper[4715]: I1009 08:06:13.673061 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 09 08:06:14 crc kubenswrapper[4715]: I1009 08:06:14.114729 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 09 08:06:14 crc kubenswrapper[4715]: I1009 08:06:14.149559 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05e9f66e-c78b-4506-b4d1-93c82402efdc" path="/var/lib/kubelet/pods/05e9f66e-c78b-4506-b4d1-93c82402efdc/volumes" Oct 09 08:06:14 crc kubenswrapper[4715]: I1009 08:06:14.671822 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"65efa78c-5b6e-4ced-8aa6-c08a1e530c0f","Type":"ContainerStarted","Data":"c2a6655d2fd14b05de2bad1cf2db6708b428396b86c79294ee59fb323b02f283"} Oct 09 08:06:14 crc kubenswrapper[4715]: I1009 08:06:14.672134 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"65efa78c-5b6e-4ced-8aa6-c08a1e530c0f","Type":"ContainerStarted","Data":"12c6e7bdada916177c36c373b5c385058c1101d70d8cdbf2598ba77114044cf0"} Oct 09 08:06:14 crc kubenswrapper[4715]: I1009 08:06:14.672151 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"65efa78c-5b6e-4ced-8aa6-c08a1e530c0f","Type":"ContainerStarted","Data":"28d4e0c4cae72829e61e2a88517987c4372782edbd0dd2d57cba7c39fadf7cde"} Oct 09 08:06:14 crc kubenswrapper[4715]: I1009 08:06:14.706013 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.7059957529999998 podStartE2EDuration="1.705995753s" podCreationTimestamp="2025-10-09 08:06:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:06:14.698715213 +0000 UTC m=+1205.391519221" watchObservedRunningTime="2025-10-09 08:06:14.705995753 +0000 UTC m=+1205.398799761" Oct 09 08:06:14 crc kubenswrapper[4715]: I1009 08:06:14.894784 4715 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 09 08:06:14 crc kubenswrapper[4715]: I1009 08:06:14.921223 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 09 08:06:15 crc kubenswrapper[4715]: I1009 08:06:15.685967 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7eee7d70-200a-41e8-a3c9-b261a7fec4da","Type":"ContainerStarted","Data":"11856ad085495bc48c917c82ea0a9a0faf5f1171890507104f2e0dd708e1f457"} Oct 09 08:06:15 crc kubenswrapper[4715]: I1009 08:06:15.686398 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7eee7d70-200a-41e8-a3c9-b261a7fec4da" containerName="ceilometer-central-agent" containerID="cri-o://8561e97f249167c76e343c82cce1ec9eb2265975a7d8ae5e2ef3470cb463f350" gracePeriod=30 Oct 09 08:06:15 crc kubenswrapper[4715]: I1009 08:06:15.686720 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7eee7d70-200a-41e8-a3c9-b261a7fec4da" containerName="proxy-httpd" containerID="cri-o://11856ad085495bc48c917c82ea0a9a0faf5f1171890507104f2e0dd708e1f457" gracePeriod=30 Oct 09 08:06:15 crc kubenswrapper[4715]: I1009 08:06:15.686736 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7eee7d70-200a-41e8-a3c9-b261a7fec4da" containerName="sg-core" containerID="cri-o://c035dc6ccfc492914aaec215d7d41274ead7e19b9fec8d46ad5f5e64e376b24e" gracePeriod=30 Oct 09 08:06:15 crc kubenswrapper[4715]: I1009 08:06:15.686753 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7eee7d70-200a-41e8-a3c9-b261a7fec4da" containerName="ceilometer-notification-agent" containerID="cri-o://68f9a7d4bc7d688521a23f5d3772036924f8ee5df7172771ce083d9d6124ae4d" gracePeriod=30 Oct 09 08:06:15 crc kubenswrapper[4715]: 
I1009 08:06:15.717438 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 09 08:06:15 crc kubenswrapper[4715]: I1009 08:06:15.734175 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.4515883880000002 podStartE2EDuration="6.734150655s" podCreationTimestamp="2025-10-09 08:06:09 +0000 UTC" firstStartedPulling="2025-10-09 08:06:10.800953024 +0000 UTC m=+1201.493757032" lastFinishedPulling="2025-10-09 08:06:15.083515291 +0000 UTC m=+1205.776319299" observedRunningTime="2025-10-09 08:06:15.723021264 +0000 UTC m=+1206.415825272" watchObservedRunningTime="2025-10-09 08:06:15.734150655 +0000 UTC m=+1206.426954673" Oct 09 08:06:15 crc kubenswrapper[4715]: I1009 08:06:15.905771 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-lwr98"] Oct 09 08:06:15 crc kubenswrapper[4715]: I1009 08:06:15.907760 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-lwr98" Oct 09 08:06:15 crc kubenswrapper[4715]: I1009 08:06:15.909908 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 09 08:06:15 crc kubenswrapper[4715]: I1009 08:06:15.909909 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 09 08:06:15 crc kubenswrapper[4715]: I1009 08:06:15.913021 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-lwr98"] Oct 09 08:06:15 crc kubenswrapper[4715]: I1009 08:06:15.969898 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c64d743-936e-460c-87d8-d0aea119fc3c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-lwr98\" (UID: \"8c64d743-936e-460c-87d8-d0aea119fc3c\") " pod="openstack/nova-cell1-cell-mapping-lwr98" Oct 09 08:06:15 crc kubenswrapper[4715]: I1009 08:06:15.970144 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c64d743-936e-460c-87d8-d0aea119fc3c-config-data\") pod \"nova-cell1-cell-mapping-lwr98\" (UID: \"8c64d743-936e-460c-87d8-d0aea119fc3c\") " pod="openstack/nova-cell1-cell-mapping-lwr98" Oct 09 08:06:15 crc kubenswrapper[4715]: I1009 08:06:15.970308 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c64d743-936e-460c-87d8-d0aea119fc3c-scripts\") pod \"nova-cell1-cell-mapping-lwr98\" (UID: \"8c64d743-936e-460c-87d8-d0aea119fc3c\") " pod="openstack/nova-cell1-cell-mapping-lwr98" Oct 09 08:06:15 crc kubenswrapper[4715]: I1009 08:06:15.970457 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmk8w\" (UniqueName: 
\"kubernetes.io/projected/8c64d743-936e-460c-87d8-d0aea119fc3c-kube-api-access-wmk8w\") pod \"nova-cell1-cell-mapping-lwr98\" (UID: \"8c64d743-936e-460c-87d8-d0aea119fc3c\") " pod="openstack/nova-cell1-cell-mapping-lwr98" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.068590 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59cf4bdb65-gf67l" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.071932 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c64d743-936e-460c-87d8-d0aea119fc3c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-lwr98\" (UID: \"8c64d743-936e-460c-87d8-d0aea119fc3c\") " pod="openstack/nova-cell1-cell-mapping-lwr98" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.071978 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c64d743-936e-460c-87d8-d0aea119fc3c-config-data\") pod \"nova-cell1-cell-mapping-lwr98\" (UID: \"8c64d743-936e-460c-87d8-d0aea119fc3c\") " pod="openstack/nova-cell1-cell-mapping-lwr98" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.072047 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c64d743-936e-460c-87d8-d0aea119fc3c-scripts\") pod \"nova-cell1-cell-mapping-lwr98\" (UID: \"8c64d743-936e-460c-87d8-d0aea119fc3c\") " pod="openstack/nova-cell1-cell-mapping-lwr98" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.072109 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmk8w\" (UniqueName: \"kubernetes.io/projected/8c64d743-936e-460c-87d8-d0aea119fc3c-kube-api-access-wmk8w\") pod \"nova-cell1-cell-mapping-lwr98\" (UID: \"8c64d743-936e-460c-87d8-d0aea119fc3c\") " pod="openstack/nova-cell1-cell-mapping-lwr98" Oct 09 08:06:16 crc 
kubenswrapper[4715]: I1009 08:06:16.078149 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c64d743-936e-460c-87d8-d0aea119fc3c-config-data\") pod \"nova-cell1-cell-mapping-lwr98\" (UID: \"8c64d743-936e-460c-87d8-d0aea119fc3c\") " pod="openstack/nova-cell1-cell-mapping-lwr98" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.078268 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c64d743-936e-460c-87d8-d0aea119fc3c-scripts\") pod \"nova-cell1-cell-mapping-lwr98\" (UID: \"8c64d743-936e-460c-87d8-d0aea119fc3c\") " pod="openstack/nova-cell1-cell-mapping-lwr98" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.078313 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c64d743-936e-460c-87d8-d0aea119fc3c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-lwr98\" (UID: \"8c64d743-936e-460c-87d8-d0aea119fc3c\") " pod="openstack/nova-cell1-cell-mapping-lwr98" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.092253 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmk8w\" (UniqueName: \"kubernetes.io/projected/8c64d743-936e-460c-87d8-d0aea119fc3c-kube-api-access-wmk8w\") pod \"nova-cell1-cell-mapping-lwr98\" (UID: \"8c64d743-936e-460c-87d8-d0aea119fc3c\") " pod="openstack/nova-cell1-cell-mapping-lwr98" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.131371 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-fp7xt"] Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.131603 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-845d6d6f59-fp7xt" podUID="70aefddd-4fff-4560-a534-52b0e9ea0f8f" containerName="dnsmasq-dns" 
containerID="cri-o://1bf49bee2a619d42d47eb12fd200f320e269f3a75455f02b36faa685504a1013" gracePeriod=10 Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.245036 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-lwr98" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.572589 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.636162 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-fp7xt" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.683041 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eee7d70-200a-41e8-a3c9-b261a7fec4da-config-data\") pod \"7eee7d70-200a-41e8-a3c9-b261a7fec4da\" (UID: \"7eee7d70-200a-41e8-a3c9-b261a7fec4da\") " Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.683084 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7eee7d70-200a-41e8-a3c9-b261a7fec4da-scripts\") pod \"7eee7d70-200a-41e8-a3c9-b261a7fec4da\" (UID: \"7eee7d70-200a-41e8-a3c9-b261a7fec4da\") " Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.683124 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/70aefddd-4fff-4560-a534-52b0e9ea0f8f-ovsdbserver-nb\") pod \"70aefddd-4fff-4560-a534-52b0e9ea0f8f\" (UID: \"70aefddd-4fff-4560-a534-52b0e9ea0f8f\") " Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.683142 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70aefddd-4fff-4560-a534-52b0e9ea0f8f-dns-svc\") pod \"70aefddd-4fff-4560-a534-52b0e9ea0f8f\" (UID: 
\"70aefddd-4fff-4560-a534-52b0e9ea0f8f\") " Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.683163 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70aefddd-4fff-4560-a534-52b0e9ea0f8f-config\") pod \"70aefddd-4fff-4560-a534-52b0e9ea0f8f\" (UID: \"70aefddd-4fff-4560-a534-52b0e9ea0f8f\") " Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.683194 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/70aefddd-4fff-4560-a534-52b0e9ea0f8f-ovsdbserver-sb\") pod \"70aefddd-4fff-4560-a534-52b0e9ea0f8f\" (UID: \"70aefddd-4fff-4560-a534-52b0e9ea0f8f\") " Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.683215 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/70aefddd-4fff-4560-a534-52b0e9ea0f8f-dns-swift-storage-0\") pod \"70aefddd-4fff-4560-a534-52b0e9ea0f8f\" (UID: \"70aefddd-4fff-4560-a534-52b0e9ea0f8f\") " Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.683264 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7eee7d70-200a-41e8-a3c9-b261a7fec4da-log-httpd\") pod \"7eee7d70-200a-41e8-a3c9-b261a7fec4da\" (UID: \"7eee7d70-200a-41e8-a3c9-b261a7fec4da\") " Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.683336 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eee7d70-200a-41e8-a3c9-b261a7fec4da-combined-ca-bundle\") pod \"7eee7d70-200a-41e8-a3c9-b261a7fec4da\" (UID: \"7eee7d70-200a-41e8-a3c9-b261a7fec4da\") " Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.683394 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s28lr\" (UniqueName: 
\"kubernetes.io/projected/70aefddd-4fff-4560-a534-52b0e9ea0f8f-kube-api-access-s28lr\") pod \"70aefddd-4fff-4560-a534-52b0e9ea0f8f\" (UID: \"70aefddd-4fff-4560-a534-52b0e9ea0f8f\") " Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.683434 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7eee7d70-200a-41e8-a3c9-b261a7fec4da-sg-core-conf-yaml\") pod \"7eee7d70-200a-41e8-a3c9-b261a7fec4da\" (UID: \"7eee7d70-200a-41e8-a3c9-b261a7fec4da\") " Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.683467 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7eee7d70-200a-41e8-a3c9-b261a7fec4da-run-httpd\") pod \"7eee7d70-200a-41e8-a3c9-b261a7fec4da\" (UID: \"7eee7d70-200a-41e8-a3c9-b261a7fec4da\") " Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.683540 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7eee7d70-200a-41e8-a3c9-b261a7fec4da-ceilometer-tls-certs\") pod \"7eee7d70-200a-41e8-a3c9-b261a7fec4da\" (UID: \"7eee7d70-200a-41e8-a3c9-b261a7fec4da\") " Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.683566 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cm88\" (UniqueName: \"kubernetes.io/projected/7eee7d70-200a-41e8-a3c9-b261a7fec4da-kube-api-access-5cm88\") pod \"7eee7d70-200a-41e8-a3c9-b261a7fec4da\" (UID: \"7eee7d70-200a-41e8-a3c9-b261a7fec4da\") " Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.686152 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7eee7d70-200a-41e8-a3c9-b261a7fec4da-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7eee7d70-200a-41e8-a3c9-b261a7fec4da" (UID: "7eee7d70-200a-41e8-a3c9-b261a7fec4da"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.686880 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7eee7d70-200a-41e8-a3c9-b261a7fec4da-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7eee7d70-200a-41e8-a3c9-b261a7fec4da" (UID: "7eee7d70-200a-41e8-a3c9-b261a7fec4da"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.695323 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70aefddd-4fff-4560-a534-52b0e9ea0f8f-kube-api-access-s28lr" (OuterVolumeSpecName: "kube-api-access-s28lr") pod "70aefddd-4fff-4560-a534-52b0e9ea0f8f" (UID: "70aefddd-4fff-4560-a534-52b0e9ea0f8f"). InnerVolumeSpecName "kube-api-access-s28lr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.697648 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7eee7d70-200a-41e8-a3c9-b261a7fec4da-kube-api-access-5cm88" (OuterVolumeSpecName: "kube-api-access-5cm88") pod "7eee7d70-200a-41e8-a3c9-b261a7fec4da" (UID: "7eee7d70-200a-41e8-a3c9-b261a7fec4da"). InnerVolumeSpecName "kube-api-access-5cm88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.700468 4715 generic.go:334] "Generic (PLEG): container finished" podID="7eee7d70-200a-41e8-a3c9-b261a7fec4da" containerID="11856ad085495bc48c917c82ea0a9a0faf5f1171890507104f2e0dd708e1f457" exitCode=0 Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.700529 4715 generic.go:334] "Generic (PLEG): container finished" podID="7eee7d70-200a-41e8-a3c9-b261a7fec4da" containerID="c035dc6ccfc492914aaec215d7d41274ead7e19b9fec8d46ad5f5e64e376b24e" exitCode=2 Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.700540 4715 generic.go:334] "Generic (PLEG): container finished" podID="7eee7d70-200a-41e8-a3c9-b261a7fec4da" containerID="68f9a7d4bc7d688521a23f5d3772036924f8ee5df7172771ce083d9d6124ae4d" exitCode=0 Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.700549 4715 generic.go:334] "Generic (PLEG): container finished" podID="7eee7d70-200a-41e8-a3c9-b261a7fec4da" containerID="8561e97f249167c76e343c82cce1ec9eb2265975a7d8ae5e2ef3470cb463f350" exitCode=0 Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.700571 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.700650 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7eee7d70-200a-41e8-a3c9-b261a7fec4da","Type":"ContainerDied","Data":"11856ad085495bc48c917c82ea0a9a0faf5f1171890507104f2e0dd708e1f457"} Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.700714 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7eee7d70-200a-41e8-a3c9-b261a7fec4da","Type":"ContainerDied","Data":"c035dc6ccfc492914aaec215d7d41274ead7e19b9fec8d46ad5f5e64e376b24e"} Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.700729 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7eee7d70-200a-41e8-a3c9-b261a7fec4da","Type":"ContainerDied","Data":"68f9a7d4bc7d688521a23f5d3772036924f8ee5df7172771ce083d9d6124ae4d"} Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.700742 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7eee7d70-200a-41e8-a3c9-b261a7fec4da","Type":"ContainerDied","Data":"8561e97f249167c76e343c82cce1ec9eb2265975a7d8ae5e2ef3470cb463f350"} Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.700794 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7eee7d70-200a-41e8-a3c9-b261a7fec4da","Type":"ContainerDied","Data":"713d02fa77dc7116287d2ec90cb528c98041b6ba94bc76514de91ffa6be2cc1e"} Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.700812 4715 scope.go:117] "RemoveContainer" containerID="11856ad085495bc48c917c82ea0a9a0faf5f1171890507104f2e0dd708e1f457" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.704000 4715 generic.go:334] "Generic (PLEG): container finished" podID="70aefddd-4fff-4560-a534-52b0e9ea0f8f" containerID="1bf49bee2a619d42d47eb12fd200f320e269f3a75455f02b36faa685504a1013" exitCode=0 Oct 09 08:06:16 crc 
kubenswrapper[4715]: I1009 08:06:16.704056 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-fp7xt" event={"ID":"70aefddd-4fff-4560-a534-52b0e9ea0f8f","Type":"ContainerDied","Data":"1bf49bee2a619d42d47eb12fd200f320e269f3a75455f02b36faa685504a1013"} Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.704183 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-fp7xt" event={"ID":"70aefddd-4fff-4560-a534-52b0e9ea0f8f","Type":"ContainerDied","Data":"0c66fe91164a08c5dd0e3e59dcb6d6a6dae99e25af7658e90f8b5c0a19eb4661"} Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.704068 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-fp7xt" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.712265 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eee7d70-200a-41e8-a3c9-b261a7fec4da-scripts" (OuterVolumeSpecName: "scripts") pod "7eee7d70-200a-41e8-a3c9-b261a7fec4da" (UID: "7eee7d70-200a-41e8-a3c9-b261a7fec4da"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.730031 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eee7d70-200a-41e8-a3c9-b261a7fec4da-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7eee7d70-200a-41e8-a3c9-b261a7fec4da" (UID: "7eee7d70-200a-41e8-a3c9-b261a7fec4da"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.740329 4715 scope.go:117] "RemoveContainer" containerID="c035dc6ccfc492914aaec215d7d41274ead7e19b9fec8d46ad5f5e64e376b24e" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.753129 4715 patch_prober.go:28] interesting pod/machine-config-daemon-k7vwx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.753183 4715 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.766287 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70aefddd-4fff-4560-a534-52b0e9ea0f8f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "70aefddd-4fff-4560-a534-52b0e9ea0f8f" (UID: "70aefddd-4fff-4560-a534-52b0e9ea0f8f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.767656 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70aefddd-4fff-4560-a534-52b0e9ea0f8f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "70aefddd-4fff-4560-a534-52b0e9ea0f8f" (UID: "70aefddd-4fff-4560-a534-52b0e9ea0f8f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.774539 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70aefddd-4fff-4560-a534-52b0e9ea0f8f-config" (OuterVolumeSpecName: "config") pod "70aefddd-4fff-4560-a534-52b0e9ea0f8f" (UID: "70aefddd-4fff-4560-a534-52b0e9ea0f8f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.779140 4715 scope.go:117] "RemoveContainer" containerID="68f9a7d4bc7d688521a23f5d3772036924f8ee5df7172771ce083d9d6124ae4d" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.785587 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eee7d70-200a-41e8-a3c9-b261a7fec4da-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "7eee7d70-200a-41e8-a3c9-b261a7fec4da" (UID: "7eee7d70-200a-41e8-a3c9-b261a7fec4da"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.786763 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7eee7d70-200a-41e8-a3c9-b261a7fec4da-ceilometer-tls-certs\") pod \"7eee7d70-200a-41e8-a3c9-b261a7fec4da\" (UID: \"7eee7d70-200a-41e8-a3c9-b261a7fec4da\") " Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.786809 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70aefddd-4fff-4560-a534-52b0e9ea0f8f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "70aefddd-4fff-4560-a534-52b0e9ea0f8f" (UID: "70aefddd-4fff-4560-a534-52b0e9ea0f8f"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:06:16 crc kubenswrapper[4715]: W1009 08:06:16.787010 4715 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/7eee7d70-200a-41e8-a3c9-b261a7fec4da/volumes/kubernetes.io~secret/ceilometer-tls-certs Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.787044 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eee7d70-200a-41e8-a3c9-b261a7fec4da-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "7eee7d70-200a-41e8-a3c9-b261a7fec4da" (UID: "7eee7d70-200a-41e8-a3c9-b261a7fec4da"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.789310 4715 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/70aefddd-4fff-4560-a534-52b0e9ea0f8f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.789340 4715 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70aefddd-4fff-4560-a534-52b0e9ea0f8f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.789721 4715 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70aefddd-4fff-4560-a534-52b0e9ea0f8f-config\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.789739 4715 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/70aefddd-4fff-4560-a534-52b0e9ea0f8f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.789753 4715 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/7eee7d70-200a-41e8-a3c9-b261a7fec4da-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.789765 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s28lr\" (UniqueName: \"kubernetes.io/projected/70aefddd-4fff-4560-a534-52b0e9ea0f8f-kube-api-access-s28lr\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.789778 4715 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7eee7d70-200a-41e8-a3c9-b261a7fec4da-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.789787 4715 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7eee7d70-200a-41e8-a3c9-b261a7fec4da-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.789794 4715 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7eee7d70-200a-41e8-a3c9-b261a7fec4da-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.789803 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cm88\" (UniqueName: \"kubernetes.io/projected/7eee7d70-200a-41e8-a3c9-b261a7fec4da-kube-api-access-5cm88\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.789810 4715 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7eee7d70-200a-41e8-a3c9-b261a7fec4da-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.791775 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70aefddd-4fff-4560-a534-52b0e9ea0f8f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod 
"70aefddd-4fff-4560-a534-52b0e9ea0f8f" (UID: "70aefddd-4fff-4560-a534-52b0e9ea0f8f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.801576 4715 scope.go:117] "RemoveContainer" containerID="8561e97f249167c76e343c82cce1ec9eb2265975a7d8ae5e2ef3470cb463f350" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.803174 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eee7d70-200a-41e8-a3c9-b261a7fec4da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7eee7d70-200a-41e8-a3c9-b261a7fec4da" (UID: "7eee7d70-200a-41e8-a3c9-b261a7fec4da"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:06:16 crc kubenswrapper[4715]: W1009 08:06:16.816329 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c64d743_936e_460c_87d8_d0aea119fc3c.slice/crio-a91b7b055347242c9214753ca70dfcf191e9ccce9b6e2dd48b5dfa5b791aed0d WatchSource:0}: Error finding container a91b7b055347242c9214753ca70dfcf191e9ccce9b6e2dd48b5dfa5b791aed0d: Status 404 returned error can't find the container with id a91b7b055347242c9214753ca70dfcf191e9ccce9b6e2dd48b5dfa5b791aed0d Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.818302 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-lwr98"] Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.827802 4715 scope.go:117] "RemoveContainer" containerID="11856ad085495bc48c917c82ea0a9a0faf5f1171890507104f2e0dd708e1f457" Oct 09 08:06:16 crc kubenswrapper[4715]: E1009 08:06:16.828284 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11856ad085495bc48c917c82ea0a9a0faf5f1171890507104f2e0dd708e1f457\": container with ID starting with 
11856ad085495bc48c917c82ea0a9a0faf5f1171890507104f2e0dd708e1f457 not found: ID does not exist" containerID="11856ad085495bc48c917c82ea0a9a0faf5f1171890507104f2e0dd708e1f457" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.828339 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11856ad085495bc48c917c82ea0a9a0faf5f1171890507104f2e0dd708e1f457"} err="failed to get container status \"11856ad085495bc48c917c82ea0a9a0faf5f1171890507104f2e0dd708e1f457\": rpc error: code = NotFound desc = could not find container \"11856ad085495bc48c917c82ea0a9a0faf5f1171890507104f2e0dd708e1f457\": container with ID starting with 11856ad085495bc48c917c82ea0a9a0faf5f1171890507104f2e0dd708e1f457 not found: ID does not exist" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.828368 4715 scope.go:117] "RemoveContainer" containerID="c035dc6ccfc492914aaec215d7d41274ead7e19b9fec8d46ad5f5e64e376b24e" Oct 09 08:06:16 crc kubenswrapper[4715]: E1009 08:06:16.828684 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c035dc6ccfc492914aaec215d7d41274ead7e19b9fec8d46ad5f5e64e376b24e\": container with ID starting with c035dc6ccfc492914aaec215d7d41274ead7e19b9fec8d46ad5f5e64e376b24e not found: ID does not exist" containerID="c035dc6ccfc492914aaec215d7d41274ead7e19b9fec8d46ad5f5e64e376b24e" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.828718 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c035dc6ccfc492914aaec215d7d41274ead7e19b9fec8d46ad5f5e64e376b24e"} err="failed to get container status \"c035dc6ccfc492914aaec215d7d41274ead7e19b9fec8d46ad5f5e64e376b24e\": rpc error: code = NotFound desc = could not find container \"c035dc6ccfc492914aaec215d7d41274ead7e19b9fec8d46ad5f5e64e376b24e\": container with ID starting with c035dc6ccfc492914aaec215d7d41274ead7e19b9fec8d46ad5f5e64e376b24e not found: ID does not 
exist" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.828733 4715 scope.go:117] "RemoveContainer" containerID="68f9a7d4bc7d688521a23f5d3772036924f8ee5df7172771ce083d9d6124ae4d" Oct 09 08:06:16 crc kubenswrapper[4715]: E1009 08:06:16.829211 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68f9a7d4bc7d688521a23f5d3772036924f8ee5df7172771ce083d9d6124ae4d\": container with ID starting with 68f9a7d4bc7d688521a23f5d3772036924f8ee5df7172771ce083d9d6124ae4d not found: ID does not exist" containerID="68f9a7d4bc7d688521a23f5d3772036924f8ee5df7172771ce083d9d6124ae4d" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.829260 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68f9a7d4bc7d688521a23f5d3772036924f8ee5df7172771ce083d9d6124ae4d"} err="failed to get container status \"68f9a7d4bc7d688521a23f5d3772036924f8ee5df7172771ce083d9d6124ae4d\": rpc error: code = NotFound desc = could not find container \"68f9a7d4bc7d688521a23f5d3772036924f8ee5df7172771ce083d9d6124ae4d\": container with ID starting with 68f9a7d4bc7d688521a23f5d3772036924f8ee5df7172771ce083d9d6124ae4d not found: ID does not exist" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.829274 4715 scope.go:117] "RemoveContainer" containerID="8561e97f249167c76e343c82cce1ec9eb2265975a7d8ae5e2ef3470cb463f350" Oct 09 08:06:16 crc kubenswrapper[4715]: E1009 08:06:16.829582 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8561e97f249167c76e343c82cce1ec9eb2265975a7d8ae5e2ef3470cb463f350\": container with ID starting with 8561e97f249167c76e343c82cce1ec9eb2265975a7d8ae5e2ef3470cb463f350 not found: ID does not exist" containerID="8561e97f249167c76e343c82cce1ec9eb2265975a7d8ae5e2ef3470cb463f350" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.829623 4715 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8561e97f249167c76e343c82cce1ec9eb2265975a7d8ae5e2ef3470cb463f350"} err="failed to get container status \"8561e97f249167c76e343c82cce1ec9eb2265975a7d8ae5e2ef3470cb463f350\": rpc error: code = NotFound desc = could not find container \"8561e97f249167c76e343c82cce1ec9eb2265975a7d8ae5e2ef3470cb463f350\": container with ID starting with 8561e97f249167c76e343c82cce1ec9eb2265975a7d8ae5e2ef3470cb463f350 not found: ID does not exist" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.829636 4715 scope.go:117] "RemoveContainer" containerID="11856ad085495bc48c917c82ea0a9a0faf5f1171890507104f2e0dd708e1f457" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.829816 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11856ad085495bc48c917c82ea0a9a0faf5f1171890507104f2e0dd708e1f457"} err="failed to get container status \"11856ad085495bc48c917c82ea0a9a0faf5f1171890507104f2e0dd708e1f457\": rpc error: code = NotFound desc = could not find container \"11856ad085495bc48c917c82ea0a9a0faf5f1171890507104f2e0dd708e1f457\": container with ID starting with 11856ad085495bc48c917c82ea0a9a0faf5f1171890507104f2e0dd708e1f457 not found: ID does not exist" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.829834 4715 scope.go:117] "RemoveContainer" containerID="c035dc6ccfc492914aaec215d7d41274ead7e19b9fec8d46ad5f5e64e376b24e" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.829985 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c035dc6ccfc492914aaec215d7d41274ead7e19b9fec8d46ad5f5e64e376b24e"} err="failed to get container status \"c035dc6ccfc492914aaec215d7d41274ead7e19b9fec8d46ad5f5e64e376b24e\": rpc error: code = NotFound desc = could not find container \"c035dc6ccfc492914aaec215d7d41274ead7e19b9fec8d46ad5f5e64e376b24e\": container with ID starting with 
c035dc6ccfc492914aaec215d7d41274ead7e19b9fec8d46ad5f5e64e376b24e not found: ID does not exist" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.830002 4715 scope.go:117] "RemoveContainer" containerID="68f9a7d4bc7d688521a23f5d3772036924f8ee5df7172771ce083d9d6124ae4d" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.830210 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68f9a7d4bc7d688521a23f5d3772036924f8ee5df7172771ce083d9d6124ae4d"} err="failed to get container status \"68f9a7d4bc7d688521a23f5d3772036924f8ee5df7172771ce083d9d6124ae4d\": rpc error: code = NotFound desc = could not find container \"68f9a7d4bc7d688521a23f5d3772036924f8ee5df7172771ce083d9d6124ae4d\": container with ID starting with 68f9a7d4bc7d688521a23f5d3772036924f8ee5df7172771ce083d9d6124ae4d not found: ID does not exist" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.830227 4715 scope.go:117] "RemoveContainer" containerID="8561e97f249167c76e343c82cce1ec9eb2265975a7d8ae5e2ef3470cb463f350" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.830407 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8561e97f249167c76e343c82cce1ec9eb2265975a7d8ae5e2ef3470cb463f350"} err="failed to get container status \"8561e97f249167c76e343c82cce1ec9eb2265975a7d8ae5e2ef3470cb463f350\": rpc error: code = NotFound desc = could not find container \"8561e97f249167c76e343c82cce1ec9eb2265975a7d8ae5e2ef3470cb463f350\": container with ID starting with 8561e97f249167c76e343c82cce1ec9eb2265975a7d8ae5e2ef3470cb463f350 not found: ID does not exist" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.830435 4715 scope.go:117] "RemoveContainer" containerID="11856ad085495bc48c917c82ea0a9a0faf5f1171890507104f2e0dd708e1f457" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.830616 4715 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"11856ad085495bc48c917c82ea0a9a0faf5f1171890507104f2e0dd708e1f457"} err="failed to get container status \"11856ad085495bc48c917c82ea0a9a0faf5f1171890507104f2e0dd708e1f457\": rpc error: code = NotFound desc = could not find container \"11856ad085495bc48c917c82ea0a9a0faf5f1171890507104f2e0dd708e1f457\": container with ID starting with 11856ad085495bc48c917c82ea0a9a0faf5f1171890507104f2e0dd708e1f457 not found: ID does not exist" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.830631 4715 scope.go:117] "RemoveContainer" containerID="c035dc6ccfc492914aaec215d7d41274ead7e19b9fec8d46ad5f5e64e376b24e" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.830786 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c035dc6ccfc492914aaec215d7d41274ead7e19b9fec8d46ad5f5e64e376b24e"} err="failed to get container status \"c035dc6ccfc492914aaec215d7d41274ead7e19b9fec8d46ad5f5e64e376b24e\": rpc error: code = NotFound desc = could not find container \"c035dc6ccfc492914aaec215d7d41274ead7e19b9fec8d46ad5f5e64e376b24e\": container with ID starting with c035dc6ccfc492914aaec215d7d41274ead7e19b9fec8d46ad5f5e64e376b24e not found: ID does not exist" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.830804 4715 scope.go:117] "RemoveContainer" containerID="68f9a7d4bc7d688521a23f5d3772036924f8ee5df7172771ce083d9d6124ae4d" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.830949 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68f9a7d4bc7d688521a23f5d3772036924f8ee5df7172771ce083d9d6124ae4d"} err="failed to get container status \"68f9a7d4bc7d688521a23f5d3772036924f8ee5df7172771ce083d9d6124ae4d\": rpc error: code = NotFound desc = could not find container \"68f9a7d4bc7d688521a23f5d3772036924f8ee5df7172771ce083d9d6124ae4d\": container with ID starting with 68f9a7d4bc7d688521a23f5d3772036924f8ee5df7172771ce083d9d6124ae4d not found: ID does not 
exist" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.830970 4715 scope.go:117] "RemoveContainer" containerID="8561e97f249167c76e343c82cce1ec9eb2265975a7d8ae5e2ef3470cb463f350" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.831009 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eee7d70-200a-41e8-a3c9-b261a7fec4da-config-data" (OuterVolumeSpecName: "config-data") pod "7eee7d70-200a-41e8-a3c9-b261a7fec4da" (UID: "7eee7d70-200a-41e8-a3c9-b261a7fec4da"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.831114 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8561e97f249167c76e343c82cce1ec9eb2265975a7d8ae5e2ef3470cb463f350"} err="failed to get container status \"8561e97f249167c76e343c82cce1ec9eb2265975a7d8ae5e2ef3470cb463f350\": rpc error: code = NotFound desc = could not find container \"8561e97f249167c76e343c82cce1ec9eb2265975a7d8ae5e2ef3470cb463f350\": container with ID starting with 8561e97f249167c76e343c82cce1ec9eb2265975a7d8ae5e2ef3470cb463f350 not found: ID does not exist" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.831130 4715 scope.go:117] "RemoveContainer" containerID="11856ad085495bc48c917c82ea0a9a0faf5f1171890507104f2e0dd708e1f457" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.831325 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11856ad085495bc48c917c82ea0a9a0faf5f1171890507104f2e0dd708e1f457"} err="failed to get container status \"11856ad085495bc48c917c82ea0a9a0faf5f1171890507104f2e0dd708e1f457\": rpc error: code = NotFound desc = could not find container \"11856ad085495bc48c917c82ea0a9a0faf5f1171890507104f2e0dd708e1f457\": container with ID starting with 11856ad085495bc48c917c82ea0a9a0faf5f1171890507104f2e0dd708e1f457 not found: ID does not exist" Oct 09 08:06:16 crc 
kubenswrapper[4715]: I1009 08:06:16.831359 4715 scope.go:117] "RemoveContainer" containerID="c035dc6ccfc492914aaec215d7d41274ead7e19b9fec8d46ad5f5e64e376b24e" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.831531 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c035dc6ccfc492914aaec215d7d41274ead7e19b9fec8d46ad5f5e64e376b24e"} err="failed to get container status \"c035dc6ccfc492914aaec215d7d41274ead7e19b9fec8d46ad5f5e64e376b24e\": rpc error: code = NotFound desc = could not find container \"c035dc6ccfc492914aaec215d7d41274ead7e19b9fec8d46ad5f5e64e376b24e\": container with ID starting with c035dc6ccfc492914aaec215d7d41274ead7e19b9fec8d46ad5f5e64e376b24e not found: ID does not exist" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.831545 4715 scope.go:117] "RemoveContainer" containerID="68f9a7d4bc7d688521a23f5d3772036924f8ee5df7172771ce083d9d6124ae4d" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.831695 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68f9a7d4bc7d688521a23f5d3772036924f8ee5df7172771ce083d9d6124ae4d"} err="failed to get container status \"68f9a7d4bc7d688521a23f5d3772036924f8ee5df7172771ce083d9d6124ae4d\": rpc error: code = NotFound desc = could not find container \"68f9a7d4bc7d688521a23f5d3772036924f8ee5df7172771ce083d9d6124ae4d\": container with ID starting with 68f9a7d4bc7d688521a23f5d3772036924f8ee5df7172771ce083d9d6124ae4d not found: ID does not exist" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.831711 4715 scope.go:117] "RemoveContainer" containerID="8561e97f249167c76e343c82cce1ec9eb2265975a7d8ae5e2ef3470cb463f350" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.831861 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8561e97f249167c76e343c82cce1ec9eb2265975a7d8ae5e2ef3470cb463f350"} err="failed to get container status 
\"8561e97f249167c76e343c82cce1ec9eb2265975a7d8ae5e2ef3470cb463f350\": rpc error: code = NotFound desc = could not find container \"8561e97f249167c76e343c82cce1ec9eb2265975a7d8ae5e2ef3470cb463f350\": container with ID starting with 8561e97f249167c76e343c82cce1ec9eb2265975a7d8ae5e2ef3470cb463f350 not found: ID does not exist" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.831876 4715 scope.go:117] "RemoveContainer" containerID="1bf49bee2a619d42d47eb12fd200f320e269f3a75455f02b36faa685504a1013" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.869310 4715 scope.go:117] "RemoveContainer" containerID="e51fe84727058b5333e0abf45df1fd9a1f9abdd61d81d89a25db14d36fb3e622" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.892076 4715 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eee7d70-200a-41e8-a3c9-b261a7fec4da-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.892199 4715 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/70aefddd-4fff-4560-a534-52b0e9ea0f8f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.892215 4715 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eee7d70-200a-41e8-a3c9-b261a7fec4da-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.899955 4715 scope.go:117] "RemoveContainer" containerID="1bf49bee2a619d42d47eb12fd200f320e269f3a75455f02b36faa685504a1013" Oct 09 08:06:16 crc kubenswrapper[4715]: E1009 08:06:16.900445 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bf49bee2a619d42d47eb12fd200f320e269f3a75455f02b36faa685504a1013\": container with ID starting with 
1bf49bee2a619d42d47eb12fd200f320e269f3a75455f02b36faa685504a1013 not found: ID does not exist" containerID="1bf49bee2a619d42d47eb12fd200f320e269f3a75455f02b36faa685504a1013" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.900474 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bf49bee2a619d42d47eb12fd200f320e269f3a75455f02b36faa685504a1013"} err="failed to get container status \"1bf49bee2a619d42d47eb12fd200f320e269f3a75455f02b36faa685504a1013\": rpc error: code = NotFound desc = could not find container \"1bf49bee2a619d42d47eb12fd200f320e269f3a75455f02b36faa685504a1013\": container with ID starting with 1bf49bee2a619d42d47eb12fd200f320e269f3a75455f02b36faa685504a1013 not found: ID does not exist" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.900495 4715 scope.go:117] "RemoveContainer" containerID="e51fe84727058b5333e0abf45df1fd9a1f9abdd61d81d89a25db14d36fb3e622" Oct 09 08:06:16 crc kubenswrapper[4715]: E1009 08:06:16.901155 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e51fe84727058b5333e0abf45df1fd9a1f9abdd61d81d89a25db14d36fb3e622\": container with ID starting with e51fe84727058b5333e0abf45df1fd9a1f9abdd61d81d89a25db14d36fb3e622 not found: ID does not exist" containerID="e51fe84727058b5333e0abf45df1fd9a1f9abdd61d81d89a25db14d36fb3e622" Oct 09 08:06:16 crc kubenswrapper[4715]: I1009 08:06:16.901180 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e51fe84727058b5333e0abf45df1fd9a1f9abdd61d81d89a25db14d36fb3e622"} err="failed to get container status \"e51fe84727058b5333e0abf45df1fd9a1f9abdd61d81d89a25db14d36fb3e622\": rpc error: code = NotFound desc = could not find container \"e51fe84727058b5333e0abf45df1fd9a1f9abdd61d81d89a25db14d36fb3e622\": container with ID starting with e51fe84727058b5333e0abf45df1fd9a1f9abdd61d81d89a25db14d36fb3e622 not found: ID does not 
exist" Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.037123 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.048934 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.059140 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-fp7xt"] Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.067052 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-fp7xt"] Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.076204 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:06:17 crc kubenswrapper[4715]: E1009 08:06:17.076690 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eee7d70-200a-41e8-a3c9-b261a7fec4da" containerName="proxy-httpd" Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.076714 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eee7d70-200a-41e8-a3c9-b261a7fec4da" containerName="proxy-httpd" Oct 09 08:06:17 crc kubenswrapper[4715]: E1009 08:06:17.076730 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eee7d70-200a-41e8-a3c9-b261a7fec4da" containerName="sg-core" Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.076738 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eee7d70-200a-41e8-a3c9-b261a7fec4da" containerName="sg-core" Oct 09 08:06:17 crc kubenswrapper[4715]: E1009 08:06:17.076764 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eee7d70-200a-41e8-a3c9-b261a7fec4da" containerName="ceilometer-central-agent" Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.076773 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eee7d70-200a-41e8-a3c9-b261a7fec4da" containerName="ceilometer-central-agent" Oct 09 08:06:17 crc 
kubenswrapper[4715]: E1009 08:06:17.076795 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70aefddd-4fff-4560-a534-52b0e9ea0f8f" containerName="init" Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.076803 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="70aefddd-4fff-4560-a534-52b0e9ea0f8f" containerName="init" Oct 09 08:06:17 crc kubenswrapper[4715]: E1009 08:06:17.076839 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70aefddd-4fff-4560-a534-52b0e9ea0f8f" containerName="dnsmasq-dns" Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.076848 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="70aefddd-4fff-4560-a534-52b0e9ea0f8f" containerName="dnsmasq-dns" Oct 09 08:06:17 crc kubenswrapper[4715]: E1009 08:06:17.076862 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eee7d70-200a-41e8-a3c9-b261a7fec4da" containerName="ceilometer-notification-agent" Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.076869 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eee7d70-200a-41e8-a3c9-b261a7fec4da" containerName="ceilometer-notification-agent" Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.077058 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="7eee7d70-200a-41e8-a3c9-b261a7fec4da" containerName="sg-core" Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.077075 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="7eee7d70-200a-41e8-a3c9-b261a7fec4da" containerName="proxy-httpd" Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.077084 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="7eee7d70-200a-41e8-a3c9-b261a7fec4da" containerName="ceilometer-central-agent" Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.077093 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="7eee7d70-200a-41e8-a3c9-b261a7fec4da" containerName="ceilometer-notification-agent" Oct 09 
08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.077113 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="70aefddd-4fff-4560-a534-52b0e9ea0f8f" containerName="dnsmasq-dns" Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.079160 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.081807 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.081983 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.082289 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.086987 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.196801 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7574fa02-4d40-4d5f-8d52-3118db1c2e05-run-httpd\") pod \"ceilometer-0\" (UID: \"7574fa02-4d40-4d5f-8d52-3118db1c2e05\") " pod="openstack/ceilometer-0" Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.197188 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7574fa02-4d40-4d5f-8d52-3118db1c2e05-config-data\") pod \"ceilometer-0\" (UID: \"7574fa02-4d40-4d5f-8d52-3118db1c2e05\") " pod="openstack/ceilometer-0" Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.197284 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7574fa02-4d40-4d5f-8d52-3118db1c2e05-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7574fa02-4d40-4d5f-8d52-3118db1c2e05\") " pod="openstack/ceilometer-0" Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.197339 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7574fa02-4d40-4d5f-8d52-3118db1c2e05-log-httpd\") pod \"ceilometer-0\" (UID: \"7574fa02-4d40-4d5f-8d52-3118db1c2e05\") " pod="openstack/ceilometer-0" Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.197433 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw6p5\" (UniqueName: \"kubernetes.io/projected/7574fa02-4d40-4d5f-8d52-3118db1c2e05-kube-api-access-mw6p5\") pod \"ceilometer-0\" (UID: \"7574fa02-4d40-4d5f-8d52-3118db1c2e05\") " pod="openstack/ceilometer-0" Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.197470 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7574fa02-4d40-4d5f-8d52-3118db1c2e05-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7574fa02-4d40-4d5f-8d52-3118db1c2e05\") " pod="openstack/ceilometer-0" Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.197498 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7574fa02-4d40-4d5f-8d52-3118db1c2e05-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7574fa02-4d40-4d5f-8d52-3118db1c2e05\") " pod="openstack/ceilometer-0" Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.197528 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7574fa02-4d40-4d5f-8d52-3118db1c2e05-scripts\") pod \"ceilometer-0\" (UID: 
\"7574fa02-4d40-4d5f-8d52-3118db1c2e05\") " pod="openstack/ceilometer-0" Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.299173 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw6p5\" (UniqueName: \"kubernetes.io/projected/7574fa02-4d40-4d5f-8d52-3118db1c2e05-kube-api-access-mw6p5\") pod \"ceilometer-0\" (UID: \"7574fa02-4d40-4d5f-8d52-3118db1c2e05\") " pod="openstack/ceilometer-0" Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.299261 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7574fa02-4d40-4d5f-8d52-3118db1c2e05-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7574fa02-4d40-4d5f-8d52-3118db1c2e05\") " pod="openstack/ceilometer-0" Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.299303 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7574fa02-4d40-4d5f-8d52-3118db1c2e05-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7574fa02-4d40-4d5f-8d52-3118db1c2e05\") " pod="openstack/ceilometer-0" Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.299352 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7574fa02-4d40-4d5f-8d52-3118db1c2e05-scripts\") pod \"ceilometer-0\" (UID: \"7574fa02-4d40-4d5f-8d52-3118db1c2e05\") " pod="openstack/ceilometer-0" Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.299397 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7574fa02-4d40-4d5f-8d52-3118db1c2e05-run-httpd\") pod \"ceilometer-0\" (UID: \"7574fa02-4d40-4d5f-8d52-3118db1c2e05\") " pod="openstack/ceilometer-0" Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.299442 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/7574fa02-4d40-4d5f-8d52-3118db1c2e05-config-data\") pod \"ceilometer-0\" (UID: \"7574fa02-4d40-4d5f-8d52-3118db1c2e05\") " pod="openstack/ceilometer-0" Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.299524 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7574fa02-4d40-4d5f-8d52-3118db1c2e05-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7574fa02-4d40-4d5f-8d52-3118db1c2e05\") " pod="openstack/ceilometer-0" Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.299579 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7574fa02-4d40-4d5f-8d52-3118db1c2e05-log-httpd\") pod \"ceilometer-0\" (UID: \"7574fa02-4d40-4d5f-8d52-3118db1c2e05\") " pod="openstack/ceilometer-0" Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.300075 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7574fa02-4d40-4d5f-8d52-3118db1c2e05-log-httpd\") pod \"ceilometer-0\" (UID: \"7574fa02-4d40-4d5f-8d52-3118db1c2e05\") " pod="openstack/ceilometer-0" Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.300329 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7574fa02-4d40-4d5f-8d52-3118db1c2e05-run-httpd\") pod \"ceilometer-0\" (UID: \"7574fa02-4d40-4d5f-8d52-3118db1c2e05\") " pod="openstack/ceilometer-0" Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.305106 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7574fa02-4d40-4d5f-8d52-3118db1c2e05-scripts\") pod \"ceilometer-0\" (UID: \"7574fa02-4d40-4d5f-8d52-3118db1c2e05\") " pod="openstack/ceilometer-0" Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.306473 4715 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7574fa02-4d40-4d5f-8d52-3118db1c2e05-config-data\") pod \"ceilometer-0\" (UID: \"7574fa02-4d40-4d5f-8d52-3118db1c2e05\") " pod="openstack/ceilometer-0" Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.313201 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7574fa02-4d40-4d5f-8d52-3118db1c2e05-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7574fa02-4d40-4d5f-8d52-3118db1c2e05\") " pod="openstack/ceilometer-0" Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.313940 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7574fa02-4d40-4d5f-8d52-3118db1c2e05-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7574fa02-4d40-4d5f-8d52-3118db1c2e05\") " pod="openstack/ceilometer-0" Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.314516 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7574fa02-4d40-4d5f-8d52-3118db1c2e05-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7574fa02-4d40-4d5f-8d52-3118db1c2e05\") " pod="openstack/ceilometer-0" Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.319179 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw6p5\" (UniqueName: \"kubernetes.io/projected/7574fa02-4d40-4d5f-8d52-3118db1c2e05-kube-api-access-mw6p5\") pod \"ceilometer-0\" (UID: \"7574fa02-4d40-4d5f-8d52-3118db1c2e05\") " pod="openstack/ceilometer-0" Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.397682 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.717112 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-lwr98" event={"ID":"8c64d743-936e-460c-87d8-d0aea119fc3c","Type":"ContainerStarted","Data":"67d55c17eb96658a6efaaf2f7731c75bd85723bfee29bd352cdfe0b8a72ce9ff"} Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.717453 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-lwr98" event={"ID":"8c64d743-936e-460c-87d8-d0aea119fc3c","Type":"ContainerStarted","Data":"a91b7b055347242c9214753ca70dfcf191e9ccce9b6e2dd48b5dfa5b791aed0d"} Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.744116 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-lwr98" podStartSLOduration=2.74409304 podStartE2EDuration="2.74409304s" podCreationTimestamp="2025-10-09 08:06:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:06:17.732411263 +0000 UTC m=+1208.425215281" watchObservedRunningTime="2025-10-09 08:06:17.74409304 +0000 UTC m=+1208.436897048" Oct 09 08:06:17 crc kubenswrapper[4715]: I1009 08:06:17.872804 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 09 08:06:18 crc kubenswrapper[4715]: I1009 08:06:18.148734 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70aefddd-4fff-4560-a534-52b0e9ea0f8f" path="/var/lib/kubelet/pods/70aefddd-4fff-4560-a534-52b0e9ea0f8f/volumes" Oct 09 08:06:18 crc kubenswrapper[4715]: I1009 08:06:18.149468 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7eee7d70-200a-41e8-a3c9-b261a7fec4da" path="/var/lib/kubelet/pods/7eee7d70-200a-41e8-a3c9-b261a7fec4da/volumes" Oct 09 08:06:18 crc kubenswrapper[4715]: I1009 08:06:18.736438 4715 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7574fa02-4d40-4d5f-8d52-3118db1c2e05","Type":"ContainerStarted","Data":"d3fe320904441406586bbee73a988e3069ce77203db7822cadc12ac21b8a45a2"} Oct 09 08:06:19 crc kubenswrapper[4715]: I1009 08:06:19.748132 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7574fa02-4d40-4d5f-8d52-3118db1c2e05","Type":"ContainerStarted","Data":"4473be2ceebbbe5410c04e192dea655c466faf65988fb40e9f175174438d2564"} Oct 09 08:06:19 crc kubenswrapper[4715]: I1009 08:06:19.748458 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7574fa02-4d40-4d5f-8d52-3118db1c2e05","Type":"ContainerStarted","Data":"60547a2a8b749025a07ef848ab68783bd9dbb06e8815e0984820ac67a286df08"} Oct 09 08:06:20 crc kubenswrapper[4715]: I1009 08:06:20.760341 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7574fa02-4d40-4d5f-8d52-3118db1c2e05","Type":"ContainerStarted","Data":"cc04270b76aff9cdcdd8f69b3db21f29d9a38a105216dc3d5037918e9ae987fc"} Oct 09 08:06:21 crc kubenswrapper[4715]: I1009 08:06:21.544386 4715 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-845d6d6f59-fp7xt" podUID="70aefddd-4fff-4560-a534-52b0e9ea0f8f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.190:5353: i/o timeout" Oct 09 08:06:21 crc kubenswrapper[4715]: I1009 08:06:21.775556 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7574fa02-4d40-4d5f-8d52-3118db1c2e05","Type":"ContainerStarted","Data":"9812ffad29f93eb9f3cc1f2b6df108e7b86306608a9d7aaa3ffbec301d61ce78"} Oct 09 08:06:21 crc kubenswrapper[4715]: I1009 08:06:21.776171 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 09 08:06:21 crc kubenswrapper[4715]: I1009 08:06:21.779419 4715 generic.go:334] "Generic (PLEG): container finished" 
podID="8c64d743-936e-460c-87d8-d0aea119fc3c" containerID="67d55c17eb96658a6efaaf2f7731c75bd85723bfee29bd352cdfe0b8a72ce9ff" exitCode=0 Oct 09 08:06:21 crc kubenswrapper[4715]: I1009 08:06:21.779482 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-lwr98" event={"ID":"8c64d743-936e-460c-87d8-d0aea119fc3c","Type":"ContainerDied","Data":"67d55c17eb96658a6efaaf2f7731c75bd85723bfee29bd352cdfe0b8a72ce9ff"} Oct 09 08:06:21 crc kubenswrapper[4715]: I1009 08:06:21.801959 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.483953977 podStartE2EDuration="4.801921335s" podCreationTimestamp="2025-10-09 08:06:17 +0000 UTC" firstStartedPulling="2025-10-09 08:06:17.888971908 +0000 UTC m=+1208.581775916" lastFinishedPulling="2025-10-09 08:06:21.206939266 +0000 UTC m=+1211.899743274" observedRunningTime="2025-10-09 08:06:21.792224835 +0000 UTC m=+1212.485028843" watchObservedRunningTime="2025-10-09 08:06:21.801921335 +0000 UTC m=+1212.494725353" Oct 09 08:06:23 crc kubenswrapper[4715]: I1009 08:06:23.201912 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-lwr98" Oct 09 08:06:23 crc kubenswrapper[4715]: I1009 08:06:23.331392 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c64d743-936e-460c-87d8-d0aea119fc3c-combined-ca-bundle\") pod \"8c64d743-936e-460c-87d8-d0aea119fc3c\" (UID: \"8c64d743-936e-460c-87d8-d0aea119fc3c\") " Oct 09 08:06:23 crc kubenswrapper[4715]: I1009 08:06:23.331462 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmk8w\" (UniqueName: \"kubernetes.io/projected/8c64d743-936e-460c-87d8-d0aea119fc3c-kube-api-access-wmk8w\") pod \"8c64d743-936e-460c-87d8-d0aea119fc3c\" (UID: \"8c64d743-936e-460c-87d8-d0aea119fc3c\") " Oct 09 08:06:23 crc kubenswrapper[4715]: I1009 08:06:23.331657 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c64d743-936e-460c-87d8-d0aea119fc3c-config-data\") pod \"8c64d743-936e-460c-87d8-d0aea119fc3c\" (UID: \"8c64d743-936e-460c-87d8-d0aea119fc3c\") " Oct 09 08:06:23 crc kubenswrapper[4715]: I1009 08:06:23.331685 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c64d743-936e-460c-87d8-d0aea119fc3c-scripts\") pod \"8c64d743-936e-460c-87d8-d0aea119fc3c\" (UID: \"8c64d743-936e-460c-87d8-d0aea119fc3c\") " Oct 09 08:06:23 crc kubenswrapper[4715]: I1009 08:06:23.337463 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c64d743-936e-460c-87d8-d0aea119fc3c-scripts" (OuterVolumeSpecName: "scripts") pod "8c64d743-936e-460c-87d8-d0aea119fc3c" (UID: "8c64d743-936e-460c-87d8-d0aea119fc3c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:06:23 crc kubenswrapper[4715]: I1009 08:06:23.338112 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c64d743-936e-460c-87d8-d0aea119fc3c-kube-api-access-wmk8w" (OuterVolumeSpecName: "kube-api-access-wmk8w") pod "8c64d743-936e-460c-87d8-d0aea119fc3c" (UID: "8c64d743-936e-460c-87d8-d0aea119fc3c"). InnerVolumeSpecName "kube-api-access-wmk8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:06:23 crc kubenswrapper[4715]: I1009 08:06:23.358043 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c64d743-936e-460c-87d8-d0aea119fc3c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c64d743-936e-460c-87d8-d0aea119fc3c" (UID: "8c64d743-936e-460c-87d8-d0aea119fc3c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:06:23 crc kubenswrapper[4715]: I1009 08:06:23.368851 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c64d743-936e-460c-87d8-d0aea119fc3c-config-data" (OuterVolumeSpecName: "config-data") pod "8c64d743-936e-460c-87d8-d0aea119fc3c" (UID: "8c64d743-936e-460c-87d8-d0aea119fc3c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:06:23 crc kubenswrapper[4715]: I1009 08:06:23.434345 4715 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c64d743-936e-460c-87d8-d0aea119fc3c-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:23 crc kubenswrapper[4715]: I1009 08:06:23.434383 4715 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c64d743-936e-460c-87d8-d0aea119fc3c-scripts\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:23 crc kubenswrapper[4715]: I1009 08:06:23.434399 4715 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c64d743-936e-460c-87d8-d0aea119fc3c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:23 crc kubenswrapper[4715]: I1009 08:06:23.434416 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmk8w\" (UniqueName: \"kubernetes.io/projected/8c64d743-936e-460c-87d8-d0aea119fc3c-kube-api-access-wmk8w\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:23 crc kubenswrapper[4715]: I1009 08:06:23.674190 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 09 08:06:23 crc kubenswrapper[4715]: I1009 08:06:23.674241 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 09 08:06:23 crc kubenswrapper[4715]: I1009 08:06:23.802032 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-lwr98" Oct 09 08:06:23 crc kubenswrapper[4715]: I1009 08:06:23.803354 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-lwr98" event={"ID":"8c64d743-936e-460c-87d8-d0aea119fc3c","Type":"ContainerDied","Data":"a91b7b055347242c9214753ca70dfcf191e9ccce9b6e2dd48b5dfa5b791aed0d"} Oct 09 08:06:23 crc kubenswrapper[4715]: I1009 08:06:23.803397 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a91b7b055347242c9214753ca70dfcf191e9ccce9b6e2dd48b5dfa5b791aed0d" Oct 09 08:06:24 crc kubenswrapper[4715]: I1009 08:06:24.034048 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 09 08:06:24 crc kubenswrapper[4715]: I1009 08:06:24.034501 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="65efa78c-5b6e-4ced-8aa6-c08a1e530c0f" containerName="nova-api-log" containerID="cri-o://12c6e7bdada916177c36c373b5c385058c1101d70d8cdbf2598ba77114044cf0" gracePeriod=30 Oct 09 08:06:24 crc kubenswrapper[4715]: I1009 08:06:24.034622 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="65efa78c-5b6e-4ced-8aa6-c08a1e530c0f" containerName="nova-api-api" containerID="cri-o://c2a6655d2fd14b05de2bad1cf2db6708b428396b86c79294ee59fb323b02f283" gracePeriod=30 Oct 09 08:06:24 crc kubenswrapper[4715]: I1009 08:06:24.055143 4715 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="65efa78c-5b6e-4ced-8aa6-c08a1e530c0f" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.203:8774/\": EOF" Oct 09 08:06:24 crc kubenswrapper[4715]: I1009 08:06:24.055648 4715 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="65efa78c-5b6e-4ced-8aa6-c08a1e530c0f" containerName="nova-api-api" probeResult="failure" output="Get 
\"https://10.217.0.203:8774/\": EOF" Oct 09 08:06:24 crc kubenswrapper[4715]: I1009 08:06:24.063705 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 08:06:24 crc kubenswrapper[4715]: I1009 08:06:24.064063 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="e6f4acd0-b524-4d1c-a245-6683e14aec4c" containerName="nova-scheduler-scheduler" containerID="cri-o://424f0e7059f488402a98735f472450d172266c66091300b7c178e878de33a243" gracePeriod=30 Oct 09 08:06:24 crc kubenswrapper[4715]: I1009 08:06:24.091861 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 08:06:24 crc kubenswrapper[4715]: I1009 08:06:24.092094 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="377c1e43-3538-413a-9144-85708016acca" containerName="nova-metadata-log" containerID="cri-o://325df8c6bd97ac42f43bdca1af25e4217b0820abd4078d6e324420c50bda4fec" gracePeriod=30 Oct 09 08:06:24 crc kubenswrapper[4715]: I1009 08:06:24.092227 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="377c1e43-3538-413a-9144-85708016acca" containerName="nova-metadata-metadata" containerID="cri-o://31a196ecb9a173d9bf5b6e1b74be6a671c02066e244a9ec03203da2a418a7125" gracePeriod=30 Oct 09 08:06:24 crc kubenswrapper[4715]: I1009 08:06:24.825189 4715 generic.go:334] "Generic (PLEG): container finished" podID="377c1e43-3538-413a-9144-85708016acca" containerID="325df8c6bd97ac42f43bdca1af25e4217b0820abd4078d6e324420c50bda4fec" exitCode=143 Oct 09 08:06:24 crc kubenswrapper[4715]: I1009 08:06:24.825275 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"377c1e43-3538-413a-9144-85708016acca","Type":"ContainerDied","Data":"325df8c6bd97ac42f43bdca1af25e4217b0820abd4078d6e324420c50bda4fec"} Oct 09 08:06:24 crc 
kubenswrapper[4715]: I1009 08:06:24.827196 4715 generic.go:334] "Generic (PLEG): container finished" podID="65efa78c-5b6e-4ced-8aa6-c08a1e530c0f" containerID="12c6e7bdada916177c36c373b5c385058c1101d70d8cdbf2598ba77114044cf0" exitCode=143 Oct 09 08:06:24 crc kubenswrapper[4715]: I1009 08:06:24.827225 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"65efa78c-5b6e-4ced-8aa6-c08a1e530c0f","Type":"ContainerDied","Data":"12c6e7bdada916177c36c373b5c385058c1101d70d8cdbf2598ba77114044cf0"} Oct 09 08:06:27 crc kubenswrapper[4715]: I1009 08:06:27.227527 4715 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="377c1e43-3538-413a-9144-85708016acca" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": read tcp 10.217.0.2:51806->10.217.0.195:8775: read: connection reset by peer" Oct 09 08:06:27 crc kubenswrapper[4715]: I1009 08:06:27.228272 4715 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="377c1e43-3538-413a-9144-85708016acca" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": read tcp 10.217.0.2:51804->10.217.0.195:8775: read: connection reset by peer" Oct 09 08:06:27 crc kubenswrapper[4715]: I1009 08:06:27.735917 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 08:06:27 crc kubenswrapper[4715]: I1009 08:06:27.818287 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/377c1e43-3538-413a-9144-85708016acca-logs\") pod \"377c1e43-3538-413a-9144-85708016acca\" (UID: \"377c1e43-3538-413a-9144-85708016acca\") " Oct 09 08:06:27 crc kubenswrapper[4715]: I1009 08:06:27.818410 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/377c1e43-3538-413a-9144-85708016acca-combined-ca-bundle\") pod \"377c1e43-3538-413a-9144-85708016acca\" (UID: \"377c1e43-3538-413a-9144-85708016acca\") " Oct 09 08:06:27 crc kubenswrapper[4715]: I1009 08:06:27.818510 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/377c1e43-3538-413a-9144-85708016acca-config-data\") pod \"377c1e43-3538-413a-9144-85708016acca\" (UID: \"377c1e43-3538-413a-9144-85708016acca\") " Oct 09 08:06:27 crc kubenswrapper[4715]: I1009 08:06:27.818610 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kv5n2\" (UniqueName: \"kubernetes.io/projected/377c1e43-3538-413a-9144-85708016acca-kube-api-access-kv5n2\") pod \"377c1e43-3538-413a-9144-85708016acca\" (UID: \"377c1e43-3538-413a-9144-85708016acca\") " Oct 09 08:06:27 crc kubenswrapper[4715]: I1009 08:06:27.818655 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/377c1e43-3538-413a-9144-85708016acca-nova-metadata-tls-certs\") pod \"377c1e43-3538-413a-9144-85708016acca\" (UID: \"377c1e43-3538-413a-9144-85708016acca\") " Oct 09 08:06:27 crc kubenswrapper[4715]: I1009 08:06:27.820971 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/377c1e43-3538-413a-9144-85708016acca-logs" (OuterVolumeSpecName: "logs") pod "377c1e43-3538-413a-9144-85708016acca" (UID: "377c1e43-3538-413a-9144-85708016acca"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:06:27 crc kubenswrapper[4715]: I1009 08:06:27.840678 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/377c1e43-3538-413a-9144-85708016acca-kube-api-access-kv5n2" (OuterVolumeSpecName: "kube-api-access-kv5n2") pod "377c1e43-3538-413a-9144-85708016acca" (UID: "377c1e43-3538-413a-9144-85708016acca"). InnerVolumeSpecName "kube-api-access-kv5n2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:06:27 crc kubenswrapper[4715]: I1009 08:06:27.855447 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/377c1e43-3538-413a-9144-85708016acca-config-data" (OuterVolumeSpecName: "config-data") pod "377c1e43-3538-413a-9144-85708016acca" (UID: "377c1e43-3538-413a-9144-85708016acca"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:06:27 crc kubenswrapper[4715]: I1009 08:06:27.862322 4715 generic.go:334] "Generic (PLEG): container finished" podID="e6f4acd0-b524-4d1c-a245-6683e14aec4c" containerID="424f0e7059f488402a98735f472450d172266c66091300b7c178e878de33a243" exitCode=0 Oct 09 08:06:27 crc kubenswrapper[4715]: I1009 08:06:27.862385 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e6f4acd0-b524-4d1c-a245-6683e14aec4c","Type":"ContainerDied","Data":"424f0e7059f488402a98735f472450d172266c66091300b7c178e878de33a243"} Oct 09 08:06:27 crc kubenswrapper[4715]: I1009 08:06:27.869787 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/377c1e43-3538-413a-9144-85708016acca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "377c1e43-3538-413a-9144-85708016acca" (UID: "377c1e43-3538-413a-9144-85708016acca"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:06:27 crc kubenswrapper[4715]: I1009 08:06:27.871190 4715 generic.go:334] "Generic (PLEG): container finished" podID="377c1e43-3538-413a-9144-85708016acca" containerID="31a196ecb9a173d9bf5b6e1b74be6a671c02066e244a9ec03203da2a418a7125" exitCode=0 Oct 09 08:06:27 crc kubenswrapper[4715]: I1009 08:06:27.871223 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"377c1e43-3538-413a-9144-85708016acca","Type":"ContainerDied","Data":"31a196ecb9a173d9bf5b6e1b74be6a671c02066e244a9ec03203da2a418a7125"} Oct 09 08:06:27 crc kubenswrapper[4715]: I1009 08:06:27.871250 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"377c1e43-3538-413a-9144-85708016acca","Type":"ContainerDied","Data":"656cfdd1759beac29f25cd577a904da4f2144cbd0a77ae52506738ce3d4bbd7b"} Oct 09 08:06:27 crc kubenswrapper[4715]: I1009 08:06:27.871266 4715 scope.go:117] "RemoveContainer" containerID="31a196ecb9a173d9bf5b6e1b74be6a671c02066e244a9ec03203da2a418a7125" Oct 09 08:06:27 crc kubenswrapper[4715]: I1009 08:06:27.871441 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 08:06:27 crc kubenswrapper[4715]: I1009 08:06:27.903738 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/377c1e43-3538-413a-9144-85708016acca-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "377c1e43-3538-413a-9144-85708016acca" (UID: "377c1e43-3538-413a-9144-85708016acca"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:06:27 crc kubenswrapper[4715]: I1009 08:06:27.922555 4715 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/377c1e43-3538-413a-9144-85708016acca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:27 crc kubenswrapper[4715]: I1009 08:06:27.922609 4715 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/377c1e43-3538-413a-9144-85708016acca-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:27 crc kubenswrapper[4715]: I1009 08:06:27.922624 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kv5n2\" (UniqueName: \"kubernetes.io/projected/377c1e43-3538-413a-9144-85708016acca-kube-api-access-kv5n2\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:27 crc kubenswrapper[4715]: I1009 08:06:27.922638 4715 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/377c1e43-3538-413a-9144-85708016acca-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:27 crc kubenswrapper[4715]: I1009 08:06:27.922653 4715 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/377c1e43-3538-413a-9144-85708016acca-logs\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:27 crc kubenswrapper[4715]: I1009 08:06:27.950580 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 08:06:27 crc kubenswrapper[4715]: I1009 08:06:27.956313 4715 scope.go:117] "RemoveContainer" containerID="325df8c6bd97ac42f43bdca1af25e4217b0820abd4078d6e324420c50bda4fec" Oct 09 08:06:27 crc kubenswrapper[4715]: I1009 08:06:27.981062 4715 scope.go:117] "RemoveContainer" containerID="31a196ecb9a173d9bf5b6e1b74be6a671c02066e244a9ec03203da2a418a7125" Oct 09 08:06:27 crc kubenswrapper[4715]: E1009 08:06:27.981570 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31a196ecb9a173d9bf5b6e1b74be6a671c02066e244a9ec03203da2a418a7125\": container with ID starting with 31a196ecb9a173d9bf5b6e1b74be6a671c02066e244a9ec03203da2a418a7125 not found: ID does not exist" containerID="31a196ecb9a173d9bf5b6e1b74be6a671c02066e244a9ec03203da2a418a7125" Oct 09 08:06:27 crc kubenswrapper[4715]: I1009 08:06:27.981605 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31a196ecb9a173d9bf5b6e1b74be6a671c02066e244a9ec03203da2a418a7125"} err="failed to get container status \"31a196ecb9a173d9bf5b6e1b74be6a671c02066e244a9ec03203da2a418a7125\": rpc error: code = NotFound desc = could not find container \"31a196ecb9a173d9bf5b6e1b74be6a671c02066e244a9ec03203da2a418a7125\": container with ID starting with 31a196ecb9a173d9bf5b6e1b74be6a671c02066e244a9ec03203da2a418a7125 not found: ID does not exist" Oct 09 08:06:27 crc kubenswrapper[4715]: I1009 08:06:27.981628 4715 scope.go:117] "RemoveContainer" containerID="325df8c6bd97ac42f43bdca1af25e4217b0820abd4078d6e324420c50bda4fec" Oct 09 08:06:27 crc kubenswrapper[4715]: E1009 08:06:27.981908 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"325df8c6bd97ac42f43bdca1af25e4217b0820abd4078d6e324420c50bda4fec\": container with ID starting with 
325df8c6bd97ac42f43bdca1af25e4217b0820abd4078d6e324420c50bda4fec not found: ID does not exist" containerID="325df8c6bd97ac42f43bdca1af25e4217b0820abd4078d6e324420c50bda4fec" Oct 09 08:06:27 crc kubenswrapper[4715]: I1009 08:06:27.981930 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"325df8c6bd97ac42f43bdca1af25e4217b0820abd4078d6e324420c50bda4fec"} err="failed to get container status \"325df8c6bd97ac42f43bdca1af25e4217b0820abd4078d6e324420c50bda4fec\": rpc error: code = NotFound desc = could not find container \"325df8c6bd97ac42f43bdca1af25e4217b0820abd4078d6e324420c50bda4fec\": container with ID starting with 325df8c6bd97ac42f43bdca1af25e4217b0820abd4078d6e324420c50bda4fec not found: ID does not exist" Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.024046 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6f4acd0-b524-4d1c-a245-6683e14aec4c-combined-ca-bundle\") pod \"e6f4acd0-b524-4d1c-a245-6683e14aec4c\" (UID: \"e6f4acd0-b524-4d1c-a245-6683e14aec4c\") " Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.024124 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6f4acd0-b524-4d1c-a245-6683e14aec4c-config-data\") pod \"e6f4acd0-b524-4d1c-a245-6683e14aec4c\" (UID: \"e6f4acd0-b524-4d1c-a245-6683e14aec4c\") " Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.024328 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86p4d\" (UniqueName: \"kubernetes.io/projected/e6f4acd0-b524-4d1c-a245-6683e14aec4c-kube-api-access-86p4d\") pod \"e6f4acd0-b524-4d1c-a245-6683e14aec4c\" (UID: \"e6f4acd0-b524-4d1c-a245-6683e14aec4c\") " Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.028853 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/e6f4acd0-b524-4d1c-a245-6683e14aec4c-kube-api-access-86p4d" (OuterVolumeSpecName: "kube-api-access-86p4d") pod "e6f4acd0-b524-4d1c-a245-6683e14aec4c" (UID: "e6f4acd0-b524-4d1c-a245-6683e14aec4c"). InnerVolumeSpecName "kube-api-access-86p4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.055065 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6f4acd0-b524-4d1c-a245-6683e14aec4c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6f4acd0-b524-4d1c-a245-6683e14aec4c" (UID: "e6f4acd0-b524-4d1c-a245-6683e14aec4c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.057314 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6f4acd0-b524-4d1c-a245-6683e14aec4c-config-data" (OuterVolumeSpecName: "config-data") pod "e6f4acd0-b524-4d1c-a245-6683e14aec4c" (UID: "e6f4acd0-b524-4d1c-a245-6683e14aec4c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.126657 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86p4d\" (UniqueName: \"kubernetes.io/projected/e6f4acd0-b524-4d1c-a245-6683e14aec4c-kube-api-access-86p4d\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.126705 4715 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6f4acd0-b524-4d1c-a245-6683e14aec4c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.126717 4715 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6f4acd0-b524-4d1c-a245-6683e14aec4c-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.195585 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.203455 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.227754 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 09 08:06:28 crc kubenswrapper[4715]: E1009 08:06:28.228220 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="377c1e43-3538-413a-9144-85708016acca" containerName="nova-metadata-metadata" Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.228239 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="377c1e43-3538-413a-9144-85708016acca" containerName="nova-metadata-metadata" Oct 09 08:06:28 crc kubenswrapper[4715]: E1009 08:06:28.228258 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6f4acd0-b524-4d1c-a245-6683e14aec4c" containerName="nova-scheduler-scheduler" Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 
08:06:28.228267 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6f4acd0-b524-4d1c-a245-6683e14aec4c" containerName="nova-scheduler-scheduler" Oct 09 08:06:28 crc kubenswrapper[4715]: E1009 08:06:28.228296 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="377c1e43-3538-413a-9144-85708016acca" containerName="nova-metadata-log" Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.228304 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="377c1e43-3538-413a-9144-85708016acca" containerName="nova-metadata-log" Oct 09 08:06:28 crc kubenswrapper[4715]: E1009 08:06:28.228314 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c64d743-936e-460c-87d8-d0aea119fc3c" containerName="nova-manage" Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.228320 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c64d743-936e-460c-87d8-d0aea119fc3c" containerName="nova-manage" Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.228541 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="377c1e43-3538-413a-9144-85708016acca" containerName="nova-metadata-log" Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.228568 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c64d743-936e-460c-87d8-d0aea119fc3c" containerName="nova-manage" Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.228581 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6f4acd0-b524-4d1c-a245-6683e14aec4c" containerName="nova-scheduler-scheduler" Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.228628 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="377c1e43-3538-413a-9144-85708016acca" containerName="nova-metadata-metadata" Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.230854 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.233297 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.233457 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.237081 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.334005 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76afd031-2e1d-412c-a21b-08e597e8eb83-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"76afd031-2e1d-412c-a21b-08e597e8eb83\") " pod="openstack/nova-metadata-0" Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.334141 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76afd031-2e1d-412c-a21b-08e597e8eb83-logs\") pod \"nova-metadata-0\" (UID: \"76afd031-2e1d-412c-a21b-08e597e8eb83\") " pod="openstack/nova-metadata-0" Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.334371 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76afd031-2e1d-412c-a21b-08e597e8eb83-config-data\") pod \"nova-metadata-0\" (UID: \"76afd031-2e1d-412c-a21b-08e597e8eb83\") " pod="openstack/nova-metadata-0" Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.334538 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtj9r\" (UniqueName: \"kubernetes.io/projected/76afd031-2e1d-412c-a21b-08e597e8eb83-kube-api-access-vtj9r\") pod \"nova-metadata-0\" (UID: 
\"76afd031-2e1d-412c-a21b-08e597e8eb83\") " pod="openstack/nova-metadata-0" Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.334666 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/76afd031-2e1d-412c-a21b-08e597e8eb83-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"76afd031-2e1d-412c-a21b-08e597e8eb83\") " pod="openstack/nova-metadata-0" Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.436230 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76afd031-2e1d-412c-a21b-08e597e8eb83-logs\") pod \"nova-metadata-0\" (UID: \"76afd031-2e1d-412c-a21b-08e597e8eb83\") " pod="openstack/nova-metadata-0" Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.436379 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76afd031-2e1d-412c-a21b-08e597e8eb83-config-data\") pod \"nova-metadata-0\" (UID: \"76afd031-2e1d-412c-a21b-08e597e8eb83\") " pod="openstack/nova-metadata-0" Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.436444 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtj9r\" (UniqueName: \"kubernetes.io/projected/76afd031-2e1d-412c-a21b-08e597e8eb83-kube-api-access-vtj9r\") pod \"nova-metadata-0\" (UID: \"76afd031-2e1d-412c-a21b-08e597e8eb83\") " pod="openstack/nova-metadata-0" Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.436507 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/76afd031-2e1d-412c-a21b-08e597e8eb83-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"76afd031-2e1d-412c-a21b-08e597e8eb83\") " pod="openstack/nova-metadata-0" Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.436547 4715 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76afd031-2e1d-412c-a21b-08e597e8eb83-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"76afd031-2e1d-412c-a21b-08e597e8eb83\") " pod="openstack/nova-metadata-0" Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.436722 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76afd031-2e1d-412c-a21b-08e597e8eb83-logs\") pod \"nova-metadata-0\" (UID: \"76afd031-2e1d-412c-a21b-08e597e8eb83\") " pod="openstack/nova-metadata-0" Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.444511 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76afd031-2e1d-412c-a21b-08e597e8eb83-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"76afd031-2e1d-412c-a21b-08e597e8eb83\") " pod="openstack/nova-metadata-0" Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.444559 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76afd031-2e1d-412c-a21b-08e597e8eb83-config-data\") pod \"nova-metadata-0\" (UID: \"76afd031-2e1d-412c-a21b-08e597e8eb83\") " pod="openstack/nova-metadata-0" Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.444969 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/76afd031-2e1d-412c-a21b-08e597e8eb83-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"76afd031-2e1d-412c-a21b-08e597e8eb83\") " pod="openstack/nova-metadata-0" Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.458193 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtj9r\" (UniqueName: \"kubernetes.io/projected/76afd031-2e1d-412c-a21b-08e597e8eb83-kube-api-access-vtj9r\") pod \"nova-metadata-0\" 
(UID: \"76afd031-2e1d-412c-a21b-08e597e8eb83\") " pod="openstack/nova-metadata-0" Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.548920 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.888264 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e6f4acd0-b524-4d1c-a245-6683e14aec4c","Type":"ContainerDied","Data":"c6ae8ceb02c321c068fbee9480e434a44e606b4cc616d2116af9092cc444a0fd"} Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.888313 4715 scope.go:117] "RemoveContainer" containerID="424f0e7059f488402a98735f472450d172266c66091300b7c178e878de33a243" Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.888403 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.925797 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.937680 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.952511 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.963330 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.967377 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 09 08:06:28 crc kubenswrapper[4715]: I1009 08:06:28.982071 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 08:06:29 crc kubenswrapper[4715]: I1009 08:06:29.050186 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9fdd59d-3bb3-4f3f-85af-316ddc7de166-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e9fdd59d-3bb3-4f3f-85af-316ddc7de166\") " pod="openstack/nova-scheduler-0" Oct 09 08:06:29 crc kubenswrapper[4715]: I1009 08:06:29.054683 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shs7m\" (UniqueName: \"kubernetes.io/projected/e9fdd59d-3bb3-4f3f-85af-316ddc7de166-kube-api-access-shs7m\") pod \"nova-scheduler-0\" (UID: \"e9fdd59d-3bb3-4f3f-85af-316ddc7de166\") " pod="openstack/nova-scheduler-0" Oct 09 08:06:29 crc kubenswrapper[4715]: I1009 08:06:29.054957 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9fdd59d-3bb3-4f3f-85af-316ddc7de166-config-data\") pod \"nova-scheduler-0\" (UID: \"e9fdd59d-3bb3-4f3f-85af-316ddc7de166\") " pod="openstack/nova-scheduler-0" Oct 09 08:06:29 crc kubenswrapper[4715]: I1009 08:06:29.067590 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 09 08:06:29 crc kubenswrapper[4715]: I1009 08:06:29.156326 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9fdd59d-3bb3-4f3f-85af-316ddc7de166-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"e9fdd59d-3bb3-4f3f-85af-316ddc7de166\") " pod="openstack/nova-scheduler-0" Oct 09 08:06:29 crc kubenswrapper[4715]: I1009 08:06:29.156397 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shs7m\" (UniqueName: \"kubernetes.io/projected/e9fdd59d-3bb3-4f3f-85af-316ddc7de166-kube-api-access-shs7m\") pod \"nova-scheduler-0\" (UID: \"e9fdd59d-3bb3-4f3f-85af-316ddc7de166\") " pod="openstack/nova-scheduler-0" Oct 09 08:06:29 crc kubenswrapper[4715]: I1009 08:06:29.156468 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9fdd59d-3bb3-4f3f-85af-316ddc7de166-config-data\") pod \"nova-scheduler-0\" (UID: \"e9fdd59d-3bb3-4f3f-85af-316ddc7de166\") " pod="openstack/nova-scheduler-0" Oct 09 08:06:29 crc kubenswrapper[4715]: I1009 08:06:29.161153 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9fdd59d-3bb3-4f3f-85af-316ddc7de166-config-data\") pod \"nova-scheduler-0\" (UID: \"e9fdd59d-3bb3-4f3f-85af-316ddc7de166\") " pod="openstack/nova-scheduler-0" Oct 09 08:06:29 crc kubenswrapper[4715]: I1009 08:06:29.161268 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9fdd59d-3bb3-4f3f-85af-316ddc7de166-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e9fdd59d-3bb3-4f3f-85af-316ddc7de166\") " pod="openstack/nova-scheduler-0" Oct 09 08:06:29 crc kubenswrapper[4715]: I1009 08:06:29.174952 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shs7m\" (UniqueName: \"kubernetes.io/projected/e9fdd59d-3bb3-4f3f-85af-316ddc7de166-kube-api-access-shs7m\") pod \"nova-scheduler-0\" (UID: \"e9fdd59d-3bb3-4f3f-85af-316ddc7de166\") " pod="openstack/nova-scheduler-0" Oct 09 08:06:29 crc kubenswrapper[4715]: I1009 08:06:29.285456 4715 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 09 08:06:29 crc kubenswrapper[4715]: I1009 08:06:29.746045 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 09 08:06:29 crc kubenswrapper[4715]: I1009 08:06:29.870105 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 09 08:06:29 crc kubenswrapper[4715]: I1009 08:06:29.905681 4715 generic.go:334] "Generic (PLEG): container finished" podID="65efa78c-5b6e-4ced-8aa6-c08a1e530c0f" containerID="c2a6655d2fd14b05de2bad1cf2db6708b428396b86c79294ee59fb323b02f283" exitCode=0 Oct 09 08:06:29 crc kubenswrapper[4715]: I1009 08:06:29.905748 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"65efa78c-5b6e-4ced-8aa6-c08a1e530c0f","Type":"ContainerDied","Data":"c2a6655d2fd14b05de2bad1cf2db6708b428396b86c79294ee59fb323b02f283"} Oct 09 08:06:29 crc kubenswrapper[4715]: I1009 08:06:29.905780 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"65efa78c-5b6e-4ced-8aa6-c08a1e530c0f","Type":"ContainerDied","Data":"28d4e0c4cae72829e61e2a88517987c4372782edbd0dd2d57cba7c39fadf7cde"} Oct 09 08:06:29 crc kubenswrapper[4715]: I1009 08:06:29.905817 4715 scope.go:117] "RemoveContainer" containerID="c2a6655d2fd14b05de2bad1cf2db6708b428396b86c79294ee59fb323b02f283" Oct 09 08:06:29 crc kubenswrapper[4715]: I1009 08:06:29.905948 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 09 08:06:29 crc kubenswrapper[4715]: I1009 08:06:29.911240 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"76afd031-2e1d-412c-a21b-08e597e8eb83","Type":"ContainerStarted","Data":"19720d37dbbf62477d67b01acfda6bd1a7def3fa9a241558b74b19cb942aca8b"} Oct 09 08:06:29 crc kubenswrapper[4715]: I1009 08:06:29.911267 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"76afd031-2e1d-412c-a21b-08e597e8eb83","Type":"ContainerStarted","Data":"b69887723837775ff59e805b40a3d58fc349768e6df576d8e70d0f192eb19a22"} Oct 09 08:06:29 crc kubenswrapper[4715]: I1009 08:06:29.911278 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"76afd031-2e1d-412c-a21b-08e597e8eb83","Type":"ContainerStarted","Data":"976c9afb0e6c77726adf622e11113b29cf2da7fe31a65257bb66c28155a1fcb2"} Oct 09 08:06:29 crc kubenswrapper[4715]: I1009 08:06:29.914764 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e9fdd59d-3bb3-4f3f-85af-316ddc7de166","Type":"ContainerStarted","Data":"8411ee7120364210dbee029790e96a93394de76f2d87ab97b137f669ef093e24"} Oct 09 08:06:29 crc kubenswrapper[4715]: I1009 08:06:29.936835 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.936804059 podStartE2EDuration="1.936804059s" podCreationTimestamp="2025-10-09 08:06:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:06:29.932553037 +0000 UTC m=+1220.625357045" watchObservedRunningTime="2025-10-09 08:06:29.936804059 +0000 UTC m=+1220.629608067" Oct 09 08:06:29 crc kubenswrapper[4715]: I1009 08:06:29.939662 4715 scope.go:117] "RemoveContainer" containerID="12c6e7bdada916177c36c373b5c385058c1101d70d8cdbf2598ba77114044cf0" 
Oct 09 08:06:29 crc kubenswrapper[4715]: I1009 08:06:29.969199 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65efa78c-5b6e-4ced-8aa6-c08a1e530c0f-internal-tls-certs\") pod \"65efa78c-5b6e-4ced-8aa6-c08a1e530c0f\" (UID: \"65efa78c-5b6e-4ced-8aa6-c08a1e530c0f\") " Oct 09 08:06:29 crc kubenswrapper[4715]: I1009 08:06:29.969256 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65efa78c-5b6e-4ced-8aa6-c08a1e530c0f-logs\") pod \"65efa78c-5b6e-4ced-8aa6-c08a1e530c0f\" (UID: \"65efa78c-5b6e-4ced-8aa6-c08a1e530c0f\") " Oct 09 08:06:29 crc kubenswrapper[4715]: I1009 08:06:29.969594 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65efa78c-5b6e-4ced-8aa6-c08a1e530c0f-config-data\") pod \"65efa78c-5b6e-4ced-8aa6-c08a1e530c0f\" (UID: \"65efa78c-5b6e-4ced-8aa6-c08a1e530c0f\") " Oct 09 08:06:29 crc kubenswrapper[4715]: I1009 08:06:29.969646 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65efa78c-5b6e-4ced-8aa6-c08a1e530c0f-combined-ca-bundle\") pod \"65efa78c-5b6e-4ced-8aa6-c08a1e530c0f\" (UID: \"65efa78c-5b6e-4ced-8aa6-c08a1e530c0f\") " Oct 09 08:06:29 crc kubenswrapper[4715]: I1009 08:06:29.969746 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fs4f\" (UniqueName: \"kubernetes.io/projected/65efa78c-5b6e-4ced-8aa6-c08a1e530c0f-kube-api-access-8fs4f\") pod \"65efa78c-5b6e-4ced-8aa6-c08a1e530c0f\" (UID: \"65efa78c-5b6e-4ced-8aa6-c08a1e530c0f\") " Oct 09 08:06:29 crc kubenswrapper[4715]: I1009 08:06:29.969843 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/65efa78c-5b6e-4ced-8aa6-c08a1e530c0f-public-tls-certs\") pod \"65efa78c-5b6e-4ced-8aa6-c08a1e530c0f\" (UID: \"65efa78c-5b6e-4ced-8aa6-c08a1e530c0f\") " Oct 09 08:06:29 crc kubenswrapper[4715]: I1009 08:06:29.974985 4715 scope.go:117] "RemoveContainer" containerID="c2a6655d2fd14b05de2bad1cf2db6708b428396b86c79294ee59fb323b02f283" Oct 09 08:06:29 crc kubenswrapper[4715]: I1009 08:06:29.977052 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65efa78c-5b6e-4ced-8aa6-c08a1e530c0f-kube-api-access-8fs4f" (OuterVolumeSpecName: "kube-api-access-8fs4f") pod "65efa78c-5b6e-4ced-8aa6-c08a1e530c0f" (UID: "65efa78c-5b6e-4ced-8aa6-c08a1e530c0f"). InnerVolumeSpecName "kube-api-access-8fs4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:06:29 crc kubenswrapper[4715]: E1009 08:06:29.977066 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2a6655d2fd14b05de2bad1cf2db6708b428396b86c79294ee59fb323b02f283\": container with ID starting with c2a6655d2fd14b05de2bad1cf2db6708b428396b86c79294ee59fb323b02f283 not found: ID does not exist" containerID="c2a6655d2fd14b05de2bad1cf2db6708b428396b86c79294ee59fb323b02f283" Oct 09 08:06:29 crc kubenswrapper[4715]: I1009 08:06:29.977156 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2a6655d2fd14b05de2bad1cf2db6708b428396b86c79294ee59fb323b02f283"} err="failed to get container status \"c2a6655d2fd14b05de2bad1cf2db6708b428396b86c79294ee59fb323b02f283\": rpc error: code = NotFound desc = could not find container \"c2a6655d2fd14b05de2bad1cf2db6708b428396b86c79294ee59fb323b02f283\": container with ID starting with c2a6655d2fd14b05de2bad1cf2db6708b428396b86c79294ee59fb323b02f283 not found: ID does not exist" Oct 09 08:06:29 crc kubenswrapper[4715]: I1009 08:06:29.977193 4715 scope.go:117] "RemoveContainer" 
containerID="12c6e7bdada916177c36c373b5c385058c1101d70d8cdbf2598ba77114044cf0" Oct 09 08:06:29 crc kubenswrapper[4715]: E1009 08:06:29.977591 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12c6e7bdada916177c36c373b5c385058c1101d70d8cdbf2598ba77114044cf0\": container with ID starting with 12c6e7bdada916177c36c373b5c385058c1101d70d8cdbf2598ba77114044cf0 not found: ID does not exist" containerID="12c6e7bdada916177c36c373b5c385058c1101d70d8cdbf2598ba77114044cf0" Oct 09 08:06:29 crc kubenswrapper[4715]: I1009 08:06:29.977627 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12c6e7bdada916177c36c373b5c385058c1101d70d8cdbf2598ba77114044cf0"} err="failed to get container status \"12c6e7bdada916177c36c373b5c385058c1101d70d8cdbf2598ba77114044cf0\": rpc error: code = NotFound desc = could not find container \"12c6e7bdada916177c36c373b5c385058c1101d70d8cdbf2598ba77114044cf0\": container with ID starting with 12c6e7bdada916177c36c373b5c385058c1101d70d8cdbf2598ba77114044cf0 not found: ID does not exist" Oct 09 08:06:29 crc kubenswrapper[4715]: I1009 08:06:29.982046 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65efa78c-5b6e-4ced-8aa6-c08a1e530c0f-logs" (OuterVolumeSpecName: "logs") pod "65efa78c-5b6e-4ced-8aa6-c08a1e530c0f" (UID: "65efa78c-5b6e-4ced-8aa6-c08a1e530c0f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.000456 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65efa78c-5b6e-4ced-8aa6-c08a1e530c0f-config-data" (OuterVolumeSpecName: "config-data") pod "65efa78c-5b6e-4ced-8aa6-c08a1e530c0f" (UID: "65efa78c-5b6e-4ced-8aa6-c08a1e530c0f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.005265 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65efa78c-5b6e-4ced-8aa6-c08a1e530c0f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65efa78c-5b6e-4ced-8aa6-c08a1e530c0f" (UID: "65efa78c-5b6e-4ced-8aa6-c08a1e530c0f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.024041 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65efa78c-5b6e-4ced-8aa6-c08a1e530c0f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "65efa78c-5b6e-4ced-8aa6-c08a1e530c0f" (UID: "65efa78c-5b6e-4ced-8aa6-c08a1e530c0f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.029885 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65efa78c-5b6e-4ced-8aa6-c08a1e530c0f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "65efa78c-5b6e-4ced-8aa6-c08a1e530c0f" (UID: "65efa78c-5b6e-4ced-8aa6-c08a1e530c0f"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.072595 4715 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65efa78c-5b6e-4ced-8aa6-c08a1e530c0f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.072630 4715 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65efa78c-5b6e-4ced-8aa6-c08a1e530c0f-logs\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.072638 4715 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65efa78c-5b6e-4ced-8aa6-c08a1e530c0f-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.072646 4715 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65efa78c-5b6e-4ced-8aa6-c08a1e530c0f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.072654 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fs4f\" (UniqueName: \"kubernetes.io/projected/65efa78c-5b6e-4ced-8aa6-c08a1e530c0f-kube-api-access-8fs4f\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.072663 4715 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/65efa78c-5b6e-4ced-8aa6-c08a1e530c0f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.146772 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="377c1e43-3538-413a-9144-85708016acca" path="/var/lib/kubelet/pods/377c1e43-3538-413a-9144-85708016acca/volumes" Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.147995 4715 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="e6f4acd0-b524-4d1c-a245-6683e14aec4c" path="/var/lib/kubelet/pods/e6f4acd0-b524-4d1c-a245-6683e14aec4c/volumes" Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.231573 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.250563 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.272955 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 09 08:06:30 crc kubenswrapper[4715]: E1009 08:06:30.273535 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65efa78c-5b6e-4ced-8aa6-c08a1e530c0f" containerName="nova-api-log" Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.273559 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="65efa78c-5b6e-4ced-8aa6-c08a1e530c0f" containerName="nova-api-log" Oct 09 08:06:30 crc kubenswrapper[4715]: E1009 08:06:30.273596 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65efa78c-5b6e-4ced-8aa6-c08a1e530c0f" containerName="nova-api-api" Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.273606 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="65efa78c-5b6e-4ced-8aa6-c08a1e530c0f" containerName="nova-api-api" Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.273816 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="65efa78c-5b6e-4ced-8aa6-c08a1e530c0f" containerName="nova-api-api" Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.273854 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="65efa78c-5b6e-4ced-8aa6-c08a1e530c0f" containerName="nova-api-log" Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.279110 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.286760 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.286985 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.287203 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.287356 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 09 08:06:30 crc kubenswrapper[4715]: E1009 08:06:30.351557 4715 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65efa78c_5b6e_4ced_8aa6_c08a1e530c0f.slice\": RecentStats: unable to find data in memory cache]" Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.378172 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c807bba-12d5-4e33-894d-7f1ae6faa077-logs\") pod \"nova-api-0\" (UID: \"0c807bba-12d5-4e33-894d-7f1ae6faa077\") " pod="openstack/nova-api-0" Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.378225 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c807bba-12d5-4e33-894d-7f1ae6faa077-config-data\") pod \"nova-api-0\" (UID: \"0c807bba-12d5-4e33-894d-7f1ae6faa077\") " pod="openstack/nova-api-0" Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.378250 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0c807bba-12d5-4e33-894d-7f1ae6faa077-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0c807bba-12d5-4e33-894d-7f1ae6faa077\") " pod="openstack/nova-api-0" Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.378393 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xb2r\" (UniqueName: \"kubernetes.io/projected/0c807bba-12d5-4e33-894d-7f1ae6faa077-kube-api-access-4xb2r\") pod \"nova-api-0\" (UID: \"0c807bba-12d5-4e33-894d-7f1ae6faa077\") " pod="openstack/nova-api-0" Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.378459 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c807bba-12d5-4e33-894d-7f1ae6faa077-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0c807bba-12d5-4e33-894d-7f1ae6faa077\") " pod="openstack/nova-api-0" Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.378497 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c807bba-12d5-4e33-894d-7f1ae6faa077-public-tls-certs\") pod \"nova-api-0\" (UID: \"0c807bba-12d5-4e33-894d-7f1ae6faa077\") " pod="openstack/nova-api-0" Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.480654 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xb2r\" (UniqueName: \"kubernetes.io/projected/0c807bba-12d5-4e33-894d-7f1ae6faa077-kube-api-access-4xb2r\") pod \"nova-api-0\" (UID: \"0c807bba-12d5-4e33-894d-7f1ae6faa077\") " pod="openstack/nova-api-0" Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.480723 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c807bba-12d5-4e33-894d-7f1ae6faa077-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"0c807bba-12d5-4e33-894d-7f1ae6faa077\") " pod="openstack/nova-api-0" Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.480778 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c807bba-12d5-4e33-894d-7f1ae6faa077-public-tls-certs\") pod \"nova-api-0\" (UID: \"0c807bba-12d5-4e33-894d-7f1ae6faa077\") " pod="openstack/nova-api-0" Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.480820 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c807bba-12d5-4e33-894d-7f1ae6faa077-logs\") pod \"nova-api-0\" (UID: \"0c807bba-12d5-4e33-894d-7f1ae6faa077\") " pod="openstack/nova-api-0" Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.480857 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c807bba-12d5-4e33-894d-7f1ae6faa077-config-data\") pod \"nova-api-0\" (UID: \"0c807bba-12d5-4e33-894d-7f1ae6faa077\") " pod="openstack/nova-api-0" Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.480893 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c807bba-12d5-4e33-894d-7f1ae6faa077-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0c807bba-12d5-4e33-894d-7f1ae6faa077\") " pod="openstack/nova-api-0" Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.481809 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c807bba-12d5-4e33-894d-7f1ae6faa077-logs\") pod \"nova-api-0\" (UID: \"0c807bba-12d5-4e33-894d-7f1ae6faa077\") " pod="openstack/nova-api-0" Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.485515 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0c807bba-12d5-4e33-894d-7f1ae6faa077-config-data\") pod \"nova-api-0\" (UID: \"0c807bba-12d5-4e33-894d-7f1ae6faa077\") " pod="openstack/nova-api-0" Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.485956 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c807bba-12d5-4e33-894d-7f1ae6faa077-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0c807bba-12d5-4e33-894d-7f1ae6faa077\") " pod="openstack/nova-api-0" Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.486194 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c807bba-12d5-4e33-894d-7f1ae6faa077-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0c807bba-12d5-4e33-894d-7f1ae6faa077\") " pod="openstack/nova-api-0" Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.487627 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c807bba-12d5-4e33-894d-7f1ae6faa077-public-tls-certs\") pod \"nova-api-0\" (UID: \"0c807bba-12d5-4e33-894d-7f1ae6faa077\") " pod="openstack/nova-api-0" Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.498210 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xb2r\" (UniqueName: \"kubernetes.io/projected/0c807bba-12d5-4e33-894d-7f1ae6faa077-kube-api-access-4xb2r\") pod \"nova-api-0\" (UID: \"0c807bba-12d5-4e33-894d-7f1ae6faa077\") " pod="openstack/nova-api-0" Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.661502 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.952830 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e9fdd59d-3bb3-4f3f-85af-316ddc7de166","Type":"ContainerStarted","Data":"be6dafc646acc184800cbcf9e6299d9d3016b5fe16f13517ef41a5b034d72036"} Oct 09 08:06:30 crc kubenswrapper[4715]: I1009 08:06:30.976705 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.976687868 podStartE2EDuration="2.976687868s" podCreationTimestamp="2025-10-09 08:06:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:06:30.972860438 +0000 UTC m=+1221.665664466" watchObservedRunningTime="2025-10-09 08:06:30.976687868 +0000 UTC m=+1221.669491866" Oct 09 08:06:31 crc kubenswrapper[4715]: I1009 08:06:31.160482 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 09 08:06:31 crc kubenswrapper[4715]: W1009 08:06:31.162744 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c807bba_12d5_4e33_894d_7f1ae6faa077.slice/crio-9b324ef5ad6aad2f5e0d651a3cf6207b4c76dad0d87b3611ac5866eaf1c0883e WatchSource:0}: Error finding container 9b324ef5ad6aad2f5e0d651a3cf6207b4c76dad0d87b3611ac5866eaf1c0883e: Status 404 returned error can't find the container with id 9b324ef5ad6aad2f5e0d651a3cf6207b4c76dad0d87b3611ac5866eaf1c0883e Oct 09 08:06:31 crc kubenswrapper[4715]: I1009 08:06:31.970375 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0c807bba-12d5-4e33-894d-7f1ae6faa077","Type":"ContainerStarted","Data":"6bf659c873c156558a6c679de574e1290af15c6660e3e572d814168b53c00042"} Oct 09 08:06:31 crc kubenswrapper[4715]: I1009 08:06:31.970734 4715 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-api-0" event={"ID":"0c807bba-12d5-4e33-894d-7f1ae6faa077","Type":"ContainerStarted","Data":"dd9bcc8284522472875f219b8c4cc877a41f4e4710f029cc7ca655fbd1479afd"} Oct 09 08:06:31 crc kubenswrapper[4715]: I1009 08:06:31.970747 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0c807bba-12d5-4e33-894d-7f1ae6faa077","Type":"ContainerStarted","Data":"9b324ef5ad6aad2f5e0d651a3cf6207b4c76dad0d87b3611ac5866eaf1c0883e"} Oct 09 08:06:32 crc kubenswrapper[4715]: I1009 08:06:32.155142 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65efa78c-5b6e-4ced-8aa6-c08a1e530c0f" path="/var/lib/kubelet/pods/65efa78c-5b6e-4ced-8aa6-c08a1e530c0f/volumes" Oct 09 08:06:33 crc kubenswrapper[4715]: I1009 08:06:33.549619 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 09 08:06:33 crc kubenswrapper[4715]: I1009 08:06:33.549727 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 09 08:06:34 crc kubenswrapper[4715]: I1009 08:06:34.286245 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 09 08:06:38 crc kubenswrapper[4715]: I1009 08:06:38.549609 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 09 08:06:38 crc kubenswrapper[4715]: I1009 08:06:38.550137 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 09 08:06:39 crc kubenswrapper[4715]: I1009 08:06:39.285732 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 09 08:06:39 crc kubenswrapper[4715]: I1009 08:06:39.316094 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 09 08:06:39 crc kubenswrapper[4715]: I1009 08:06:39.339166 4715 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=9.339148846 podStartE2EDuration="9.339148846s" podCreationTimestamp="2025-10-09 08:06:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:06:31.998377013 +0000 UTC m=+1222.691181021" watchObservedRunningTime="2025-10-09 08:06:39.339148846 +0000 UTC m=+1230.031952854" Oct 09 08:06:39 crc kubenswrapper[4715]: I1009 08:06:39.565575 4715 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="76afd031-2e1d-412c-a21b-08e597e8eb83" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 09 08:06:39 crc kubenswrapper[4715]: I1009 08:06:39.565717 4715 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="76afd031-2e1d-412c-a21b-08e597e8eb83" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 09 08:06:40 crc kubenswrapper[4715]: I1009 08:06:40.083079 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 09 08:06:40 crc kubenswrapper[4715]: I1009 08:06:40.661759 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 09 08:06:40 crc kubenswrapper[4715]: I1009 08:06:40.662142 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 09 08:06:41 crc kubenswrapper[4715]: I1009 08:06:41.678626 4715 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0c807bba-12d5-4e33-894d-7f1ae6faa077" containerName="nova-api-log" probeResult="failure" output="Get 
\"https://10.217.0.208:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 09 08:06:41 crc kubenswrapper[4715]: I1009 08:06:41.679201 4715 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0c807bba-12d5-4e33-894d-7f1ae6faa077" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 09 08:06:46 crc kubenswrapper[4715]: I1009 08:06:46.754123 4715 patch_prober.go:28] interesting pod/machine-config-daemon-k7vwx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 08:06:46 crc kubenswrapper[4715]: I1009 08:06:46.754739 4715 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 08:06:46 crc kubenswrapper[4715]: I1009 08:06:46.754787 4715 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" Oct 09 08:06:46 crc kubenswrapper[4715]: I1009 08:06:46.755727 4715 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d50fe5031fb9148bf6f3fa221f298d39e3a132b33b247b66e3d4fb59f3f0c771"} pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 08:06:46 crc kubenswrapper[4715]: I1009 08:06:46.755795 4715 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" containerID="cri-o://d50fe5031fb9148bf6f3fa221f298d39e3a132b33b247b66e3d4fb59f3f0c771" gracePeriod=600 Oct 09 08:06:47 crc kubenswrapper[4715]: I1009 08:06:47.147490 4715 generic.go:334] "Generic (PLEG): container finished" podID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerID="d50fe5031fb9148bf6f3fa221f298d39e3a132b33b247b66e3d4fb59f3f0c771" exitCode=0 Oct 09 08:06:47 crc kubenswrapper[4715]: I1009 08:06:47.147563 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" event={"ID":"acafd807-8875-4b4f-aba9-4f807ca336e7","Type":"ContainerDied","Data":"d50fe5031fb9148bf6f3fa221f298d39e3a132b33b247b66e3d4fb59f3f0c771"} Oct 09 08:06:47 crc kubenswrapper[4715]: I1009 08:06:47.147815 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" event={"ID":"acafd807-8875-4b4f-aba9-4f807ca336e7","Type":"ContainerStarted","Data":"c420b2a8eda30fb3123058bb99d74df66c4ad029fca95601a27c380e6d5834c9"} Oct 09 08:06:47 crc kubenswrapper[4715]: I1009 08:06:47.147833 4715 scope.go:117] "RemoveContainer" containerID="d09013bd08005ad32fff769feb782bd4aaed730f81b53c0815c16cfe4fba1a84" Oct 09 08:06:47 crc kubenswrapper[4715]: I1009 08:06:47.406034 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 09 08:06:48 crc kubenswrapper[4715]: I1009 08:06:48.555555 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 09 08:06:48 crc kubenswrapper[4715]: I1009 08:06:48.555876 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 09 08:06:48 crc kubenswrapper[4715]: I1009 08:06:48.566655 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/nova-metadata-0" Oct 09 08:06:48 crc kubenswrapper[4715]: I1009 08:06:48.567240 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 09 08:06:50 crc kubenswrapper[4715]: I1009 08:06:50.669980 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 09 08:06:50 crc kubenswrapper[4715]: I1009 08:06:50.670967 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 09 08:06:50 crc kubenswrapper[4715]: I1009 08:06:50.673673 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 09 08:06:50 crc kubenswrapper[4715]: I1009 08:06:50.677992 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 09 08:06:51 crc kubenswrapper[4715]: I1009 08:06:51.201495 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 09 08:06:51 crc kubenswrapper[4715]: I1009 08:06:51.210344 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 09 08:06:59 crc kubenswrapper[4715]: I1009 08:06:59.625588 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 09 08:07:00 crc kubenswrapper[4715]: I1009 08:07:00.893289 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 08:07:03 crc kubenswrapper[4715]: I1009 08:07:03.678345 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="1673772c-a772-4ad8-85c3-f68268965d4b" containerName="rabbitmq" containerID="cri-o://96111493a6fbd5d69aad7e7bb948021003031322bc366f71e9e97e9d2a638bcc" gracePeriod=604796 Oct 09 08:07:04 crc kubenswrapper[4715]: I1009 08:07:04.625345 4715 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/rabbitmq-cell1-server-0" podUID="a4714af0-14ef-4513-ac5e-dbf4aa99079b" containerName="rabbitmq" containerID="cri-o://a63af1c17453b501a884b7fa445b704d3f526af04c984c5025876fa24104160f" gracePeriod=604797 Oct 09 08:07:06 crc kubenswrapper[4715]: I1009 08:07:06.686336 4715 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="1673772c-a772-4ad8-85c3-f68268965d4b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Oct 09 08:07:07 crc kubenswrapper[4715]: I1009 08:07:07.008065 4715 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="a4714af0-14ef-4513-ac5e-dbf4aa99079b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Oct 09 08:07:10 crc kubenswrapper[4715]: I1009 08:07:10.420130 4715 generic.go:334] "Generic (PLEG): container finished" podID="1673772c-a772-4ad8-85c3-f68268965d4b" containerID="96111493a6fbd5d69aad7e7bb948021003031322bc366f71e9e97e9d2a638bcc" exitCode=0 Oct 09 08:07:10 crc kubenswrapper[4715]: I1009 08:07:10.420207 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1673772c-a772-4ad8-85c3-f68268965d4b","Type":"ContainerDied","Data":"96111493a6fbd5d69aad7e7bb948021003031322bc366f71e9e97e9d2a638bcc"} Oct 09 08:07:10 crc kubenswrapper[4715]: I1009 08:07:10.420495 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1673772c-a772-4ad8-85c3-f68268965d4b","Type":"ContainerDied","Data":"addb4771dd9794d781c5dd544b89a7b8ca72472906966daf2fce113e5102679f"} Oct 09 08:07:10 crc kubenswrapper[4715]: I1009 08:07:10.420510 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="addb4771dd9794d781c5dd544b89a7b8ca72472906966daf2fce113e5102679f" Oct 09 08:07:10 crc kubenswrapper[4715]: I1009 08:07:10.457188 4715 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 09 08:07:10 crc kubenswrapper[4715]: I1009 08:07:10.567097 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1673772c-a772-4ad8-85c3-f68268965d4b-rabbitmq-erlang-cookie\") pod \"1673772c-a772-4ad8-85c3-f68268965d4b\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") " Oct 09 08:07:10 crc kubenswrapper[4715]: I1009 08:07:10.567161 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1673772c-a772-4ad8-85c3-f68268965d4b-plugins-conf\") pod \"1673772c-a772-4ad8-85c3-f68268965d4b\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") " Oct 09 08:07:10 crc kubenswrapper[4715]: I1009 08:07:10.567189 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"1673772c-a772-4ad8-85c3-f68268965d4b\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") " Oct 09 08:07:10 crc kubenswrapper[4715]: I1009 08:07:10.567219 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1673772c-a772-4ad8-85c3-f68268965d4b-rabbitmq-tls\") pod \"1673772c-a772-4ad8-85c3-f68268965d4b\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") " Oct 09 08:07:10 crc kubenswrapper[4715]: I1009 08:07:10.567318 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1673772c-a772-4ad8-85c3-f68268965d4b-config-data\") pod \"1673772c-a772-4ad8-85c3-f68268965d4b\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") " Oct 09 08:07:10 crc kubenswrapper[4715]: I1009 08:07:10.567345 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1673772c-a772-4ad8-85c3-f68268965d4b-rabbitmq-plugins\") pod \"1673772c-a772-4ad8-85c3-f68268965d4b\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") " Oct 09 08:07:10 crc kubenswrapper[4715]: I1009 08:07:10.567368 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9dzr\" (UniqueName: \"kubernetes.io/projected/1673772c-a772-4ad8-85c3-f68268965d4b-kube-api-access-w9dzr\") pod \"1673772c-a772-4ad8-85c3-f68268965d4b\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") " Oct 09 08:07:10 crc kubenswrapper[4715]: I1009 08:07:10.567393 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1673772c-a772-4ad8-85c3-f68268965d4b-server-conf\") pod \"1673772c-a772-4ad8-85c3-f68268965d4b\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") " Oct 09 08:07:10 crc kubenswrapper[4715]: I1009 08:07:10.567466 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1673772c-a772-4ad8-85c3-f68268965d4b-erlang-cookie-secret\") pod \"1673772c-a772-4ad8-85c3-f68268965d4b\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") " Oct 09 08:07:10 crc kubenswrapper[4715]: I1009 08:07:10.567508 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1673772c-a772-4ad8-85c3-f68268965d4b-rabbitmq-confd\") pod \"1673772c-a772-4ad8-85c3-f68268965d4b\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") " Oct 09 08:07:10 crc kubenswrapper[4715]: I1009 08:07:10.567584 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1673772c-a772-4ad8-85c3-f68268965d4b-pod-info\") pod \"1673772c-a772-4ad8-85c3-f68268965d4b\" (UID: \"1673772c-a772-4ad8-85c3-f68268965d4b\") " Oct 09 
08:07:10 crc kubenswrapper[4715]: I1009 08:07:10.568065 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1673772c-a772-4ad8-85c3-f68268965d4b-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "1673772c-a772-4ad8-85c3-f68268965d4b" (UID: "1673772c-a772-4ad8-85c3-f68268965d4b"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:07:10 crc kubenswrapper[4715]: I1009 08:07:10.569554 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1673772c-a772-4ad8-85c3-f68268965d4b-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "1673772c-a772-4ad8-85c3-f68268965d4b" (UID: "1673772c-a772-4ad8-85c3-f68268965d4b"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:07:10 crc kubenswrapper[4715]: I1009 08:07:10.570111 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1673772c-a772-4ad8-85c3-f68268965d4b-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "1673772c-a772-4ad8-85c3-f68268965d4b" (UID: "1673772c-a772-4ad8-85c3-f68268965d4b"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:07:10 crc kubenswrapper[4715]: I1009 08:07:10.578636 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1673772c-a772-4ad8-85c3-f68268965d4b-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "1673772c-a772-4ad8-85c3-f68268965d4b" (UID: "1673772c-a772-4ad8-85c3-f68268965d4b"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:07:10 crc kubenswrapper[4715]: I1009 08:07:10.581585 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1673772c-a772-4ad8-85c3-f68268965d4b-kube-api-access-w9dzr" (OuterVolumeSpecName: "kube-api-access-w9dzr") pod "1673772c-a772-4ad8-85c3-f68268965d4b" (UID: "1673772c-a772-4ad8-85c3-f68268965d4b"). InnerVolumeSpecName "kube-api-access-w9dzr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:07:10 crc kubenswrapper[4715]: I1009 08:07:10.593213 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "persistence") pod "1673772c-a772-4ad8-85c3-f68268965d4b" (UID: "1673772c-a772-4ad8-85c3-f68268965d4b"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 09 08:07:10 crc kubenswrapper[4715]: I1009 08:07:10.594742 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1673772c-a772-4ad8-85c3-f68268965d4b-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "1673772c-a772-4ad8-85c3-f68268965d4b" (UID: "1673772c-a772-4ad8-85c3-f68268965d4b"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:07:10 crc kubenswrapper[4715]: I1009 08:07:10.597506 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/1673772c-a772-4ad8-85c3-f68268965d4b-pod-info" (OuterVolumeSpecName: "pod-info") pod "1673772c-a772-4ad8-85c3-f68268965d4b" (UID: "1673772c-a772-4ad8-85c3-f68268965d4b"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 09 08:07:10 crc kubenswrapper[4715]: I1009 08:07:10.600930 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1673772c-a772-4ad8-85c3-f68268965d4b-config-data" (OuterVolumeSpecName: "config-data") pod "1673772c-a772-4ad8-85c3-f68268965d4b" (UID: "1673772c-a772-4ad8-85c3-f68268965d4b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:07:10 crc kubenswrapper[4715]: I1009 08:07:10.637221 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1673772c-a772-4ad8-85c3-f68268965d4b-server-conf" (OuterVolumeSpecName: "server-conf") pod "1673772c-a772-4ad8-85c3-f68268965d4b" (UID: "1673772c-a772-4ad8-85c3-f68268965d4b"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:07:10 crc kubenswrapper[4715]: I1009 08:07:10.672472 4715 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1673772c-a772-4ad8-85c3-f68268965d4b-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 09 08:07:10 crc kubenswrapper[4715]: I1009 08:07:10.672612 4715 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1673772c-a772-4ad8-85c3-f68268965d4b-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 09 08:07:10 crc kubenswrapper[4715]: I1009 08:07:10.672652 4715 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Oct 09 08:07:10 crc kubenswrapper[4715]: I1009 08:07:10.672724 4715 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1673772c-a772-4ad8-85c3-f68268965d4b-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 09 
08:07:10 crc kubenswrapper[4715]: I1009 08:07:10.672739 4715 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1673772c-a772-4ad8-85c3-f68268965d4b-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 08:07:10 crc kubenswrapper[4715]: I1009 08:07:10.672750 4715 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1673772c-a772-4ad8-85c3-f68268965d4b-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 09 08:07:10 crc kubenswrapper[4715]: I1009 08:07:10.672761 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9dzr\" (UniqueName: \"kubernetes.io/projected/1673772c-a772-4ad8-85c3-f68268965d4b-kube-api-access-w9dzr\") on node \"crc\" DevicePath \"\"" Oct 09 08:07:10 crc kubenswrapper[4715]: I1009 08:07:10.672785 4715 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1673772c-a772-4ad8-85c3-f68268965d4b-server-conf\") on node \"crc\" DevicePath \"\"" Oct 09 08:07:10 crc kubenswrapper[4715]: I1009 08:07:10.672795 4715 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1673772c-a772-4ad8-85c3-f68268965d4b-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 09 08:07:10 crc kubenswrapper[4715]: I1009 08:07:10.672806 4715 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1673772c-a772-4ad8-85c3-f68268965d4b-pod-info\") on node \"crc\" DevicePath \"\"" Oct 09 08:07:10 crc kubenswrapper[4715]: I1009 08:07:10.701138 4715 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Oct 09 08:07:10 crc kubenswrapper[4715]: I1009 08:07:10.721993 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/1673772c-a772-4ad8-85c3-f68268965d4b-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "1673772c-a772-4ad8-85c3-f68268965d4b" (UID: "1673772c-a772-4ad8-85c3-f68268965d4b"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:07:10 crc kubenswrapper[4715]: I1009 08:07:10.773985 4715 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1673772c-a772-4ad8-85c3-f68268965d4b-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 09 08:07:10 crc kubenswrapper[4715]: I1009 08:07:10.774012 4715 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.223788 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.282107 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcmtl\" (UniqueName: \"kubernetes.io/projected/a4714af0-14ef-4513-ac5e-dbf4aa99079b-kube-api-access-dcmtl\") pod \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\" (UID: \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.282162 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a4714af0-14ef-4513-ac5e-dbf4aa99079b-pod-info\") pod \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\" (UID: \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.282226 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a4714af0-14ef-4513-ac5e-dbf4aa99079b-erlang-cookie-secret\") pod 
\"a4714af0-14ef-4513-ac5e-dbf4aa99079b\" (UID: \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.282263 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a4714af0-14ef-4513-ac5e-dbf4aa99079b-rabbitmq-erlang-cookie\") pod \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\" (UID: \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.282289 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a4714af0-14ef-4513-ac5e-dbf4aa99079b-plugins-conf\") pod \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\" (UID: \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.282325 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a4714af0-14ef-4513-ac5e-dbf4aa99079b-rabbitmq-plugins\") pod \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\" (UID: \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.282372 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4714af0-14ef-4513-ac5e-dbf4aa99079b-config-data\") pod \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\" (UID: \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.282412 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a4714af0-14ef-4513-ac5e-dbf4aa99079b-rabbitmq-confd\") pod \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\" (UID: \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.282532 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\" (UID: \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.282578 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a4714af0-14ef-4513-ac5e-dbf4aa99079b-server-conf\") pod \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\" (UID: \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.282599 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a4714af0-14ef-4513-ac5e-dbf4aa99079b-rabbitmq-tls\") pod \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\" (UID: \"a4714af0-14ef-4513-ac5e-dbf4aa99079b\") " Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.286050 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4714af0-14ef-4513-ac5e-dbf4aa99079b-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "a4714af0-14ef-4513-ac5e-dbf4aa99079b" (UID: "a4714af0-14ef-4513-ac5e-dbf4aa99079b"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.289750 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4714af0-14ef-4513-ac5e-dbf4aa99079b-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "a4714af0-14ef-4513-ac5e-dbf4aa99079b" (UID: "a4714af0-14ef-4513-ac5e-dbf4aa99079b"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.290911 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4714af0-14ef-4513-ac5e-dbf4aa99079b-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "a4714af0-14ef-4513-ac5e-dbf4aa99079b" (UID: "a4714af0-14ef-4513-ac5e-dbf4aa99079b"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.290948 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4714af0-14ef-4513-ac5e-dbf4aa99079b-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "a4714af0-14ef-4513-ac5e-dbf4aa99079b" (UID: "a4714af0-14ef-4513-ac5e-dbf4aa99079b"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.313665 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "a4714af0-14ef-4513-ac5e-dbf4aa99079b" (UID: "a4714af0-14ef-4513-ac5e-dbf4aa99079b"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.313713 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4714af0-14ef-4513-ac5e-dbf4aa99079b-kube-api-access-dcmtl" (OuterVolumeSpecName: "kube-api-access-dcmtl") pod "a4714af0-14ef-4513-ac5e-dbf4aa99079b" (UID: "a4714af0-14ef-4513-ac5e-dbf4aa99079b"). InnerVolumeSpecName "kube-api-access-dcmtl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.314661 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4714af0-14ef-4513-ac5e-dbf4aa99079b-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "a4714af0-14ef-4513-ac5e-dbf4aa99079b" (UID: "a4714af0-14ef-4513-ac5e-dbf4aa99079b"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.319923 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/a4714af0-14ef-4513-ac5e-dbf4aa99079b-pod-info" (OuterVolumeSpecName: "pod-info") pod "a4714af0-14ef-4513-ac5e-dbf4aa99079b" (UID: "a4714af0-14ef-4513-ac5e-dbf4aa99079b"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.349532 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4714af0-14ef-4513-ac5e-dbf4aa99079b-config-data" (OuterVolumeSpecName: "config-data") pod "a4714af0-14ef-4513-ac5e-dbf4aa99079b" (UID: "a4714af0-14ef-4513-ac5e-dbf4aa99079b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.369451 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4714af0-14ef-4513-ac5e-dbf4aa99079b-server-conf" (OuterVolumeSpecName: "server-conf") pod "a4714af0-14ef-4513-ac5e-dbf4aa99079b" (UID: "a4714af0-14ef-4513-ac5e-dbf4aa99079b"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.385470 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcmtl\" (UniqueName: \"kubernetes.io/projected/a4714af0-14ef-4513-ac5e-dbf4aa99079b-kube-api-access-dcmtl\") on node \"crc\" DevicePath \"\"" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.385520 4715 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a4714af0-14ef-4513-ac5e-dbf4aa99079b-pod-info\") on node \"crc\" DevicePath \"\"" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.385536 4715 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a4714af0-14ef-4513-ac5e-dbf4aa99079b-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.385550 4715 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a4714af0-14ef-4513-ac5e-dbf4aa99079b-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.385560 4715 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a4714af0-14ef-4513-ac5e-dbf4aa99079b-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.385568 4715 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a4714af0-14ef-4513-ac5e-dbf4aa99079b-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.385578 4715 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4714af0-14ef-4513-ac5e-dbf4aa99079b-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 08:07:11 crc kubenswrapper[4715]: 
I1009 08:07:11.385621 4715 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.385632 4715 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a4714af0-14ef-4513-ac5e-dbf4aa99079b-server-conf\") on node \"crc\" DevicePath \"\"" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.385662 4715 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a4714af0-14ef-4513-ac5e-dbf4aa99079b-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.415405 4715 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.460179 4715 generic.go:334] "Generic (PLEG): container finished" podID="a4714af0-14ef-4513-ac5e-dbf4aa99079b" containerID="a63af1c17453b501a884b7fa445b704d3f526af04c984c5025876fa24104160f" exitCode=0 Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.460284 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.468716 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a4714af0-14ef-4513-ac5e-dbf4aa99079b","Type":"ContainerDied","Data":"a63af1c17453b501a884b7fa445b704d3f526af04c984c5025876fa24104160f"} Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.468754 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.468851 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a4714af0-14ef-4513-ac5e-dbf4aa99079b","Type":"ContainerDied","Data":"7acd953faef5f022130324c5e0cc1ba1395483711566f6744e2ae956d91a44c7"} Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.468874 4715 scope.go:117] "RemoveContainer" containerID="a63af1c17453b501a884b7fa445b704d3f526af04c984c5025876fa24104160f" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.489960 4715 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.523115 4715 scope.go:117] "RemoveContainer" containerID="1bd128cd2c654c89d405e27fdb17e74880b73ef2fcd7b473522b9790ccdfeb29" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.534246 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4714af0-14ef-4513-ac5e-dbf4aa99079b-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "a4714af0-14ef-4513-ac5e-dbf4aa99079b" (UID: "a4714af0-14ef-4513-ac5e-dbf4aa99079b"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.539131 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.553753 4715 scope.go:117] "RemoveContainer" containerID="a63af1c17453b501a884b7fa445b704d3f526af04c984c5025876fa24104160f" Oct 09 08:07:11 crc kubenswrapper[4715]: E1009 08:07:11.554274 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a63af1c17453b501a884b7fa445b704d3f526af04c984c5025876fa24104160f\": container with ID starting with a63af1c17453b501a884b7fa445b704d3f526af04c984c5025876fa24104160f not found: ID does not exist" containerID="a63af1c17453b501a884b7fa445b704d3f526af04c984c5025876fa24104160f" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.554316 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a63af1c17453b501a884b7fa445b704d3f526af04c984c5025876fa24104160f"} err="failed to get container status \"a63af1c17453b501a884b7fa445b704d3f526af04c984c5025876fa24104160f\": rpc error: code = NotFound desc = could not find container \"a63af1c17453b501a884b7fa445b704d3f526af04c984c5025876fa24104160f\": container with ID starting with a63af1c17453b501a884b7fa445b704d3f526af04c984c5025876fa24104160f not found: ID does not exist" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.554343 4715 scope.go:117] "RemoveContainer" containerID="1bd128cd2c654c89d405e27fdb17e74880b73ef2fcd7b473522b9790ccdfeb29" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.554476 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 09 08:07:11 crc kubenswrapper[4715]: E1009 08:07:11.557728 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1bd128cd2c654c89d405e27fdb17e74880b73ef2fcd7b473522b9790ccdfeb29\": container with ID starting with 1bd128cd2c654c89d405e27fdb17e74880b73ef2fcd7b473522b9790ccdfeb29 not found: ID does not exist" containerID="1bd128cd2c654c89d405e27fdb17e74880b73ef2fcd7b473522b9790ccdfeb29" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.557771 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bd128cd2c654c89d405e27fdb17e74880b73ef2fcd7b473522b9790ccdfeb29"} err="failed to get container status \"1bd128cd2c654c89d405e27fdb17e74880b73ef2fcd7b473522b9790ccdfeb29\": rpc error: code = NotFound desc = could not find container \"1bd128cd2c654c89d405e27fdb17e74880b73ef2fcd7b473522b9790ccdfeb29\": container with ID starting with 1bd128cd2c654c89d405e27fdb17e74880b73ef2fcd7b473522b9790ccdfeb29 not found: ID does not exist" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.566553 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 09 08:07:11 crc kubenswrapper[4715]: E1009 08:07:11.566974 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1673772c-a772-4ad8-85c3-f68268965d4b" containerName="rabbitmq" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.566988 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="1673772c-a772-4ad8-85c3-f68268965d4b" containerName="rabbitmq" Oct 09 08:07:11 crc kubenswrapper[4715]: E1009 08:07:11.567004 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4714af0-14ef-4513-ac5e-dbf4aa99079b" containerName="rabbitmq" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.567010 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4714af0-14ef-4513-ac5e-dbf4aa99079b" containerName="rabbitmq" Oct 09 08:07:11 crc kubenswrapper[4715]: E1009 08:07:11.567029 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1673772c-a772-4ad8-85c3-f68268965d4b" containerName="setup-container" Oct 09 
08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.567036 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="1673772c-a772-4ad8-85c3-f68268965d4b" containerName="setup-container" Oct 09 08:07:11 crc kubenswrapper[4715]: E1009 08:07:11.567044 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4714af0-14ef-4513-ac5e-dbf4aa99079b" containerName="setup-container" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.567050 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4714af0-14ef-4513-ac5e-dbf4aa99079b" containerName="setup-container" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.567234 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="1673772c-a772-4ad8-85c3-f68268965d4b" containerName="rabbitmq" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.567254 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4714af0-14ef-4513-ac5e-dbf4aa99079b" containerName="rabbitmq" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.568284 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.572068 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.575891 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.575910 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.575913 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.576291 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-pghpr" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.576510 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.576662 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.581017 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.592564 4715 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a4714af0-14ef-4513-ac5e-dbf4aa99079b-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.694397 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c87001f3-a098-449a-b8ec-cccb2a313d5f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: 
\"c87001f3-a098-449a-b8ec-cccb2a313d5f\") " pod="openstack/rabbitmq-server-0" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.694511 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c87001f3-a098-449a-b8ec-cccb2a313d5f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c87001f3-a098-449a-b8ec-cccb2a313d5f\") " pod="openstack/rabbitmq-server-0" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.694578 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c87001f3-a098-449a-b8ec-cccb2a313d5f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c87001f3-a098-449a-b8ec-cccb2a313d5f\") " pod="openstack/rabbitmq-server-0" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.694607 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c87001f3-a098-449a-b8ec-cccb2a313d5f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c87001f3-a098-449a-b8ec-cccb2a313d5f\") " pod="openstack/rabbitmq-server-0" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.694645 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c87001f3-a098-449a-b8ec-cccb2a313d5f-config-data\") pod \"rabbitmq-server-0\" (UID: \"c87001f3-a098-449a-b8ec-cccb2a313d5f\") " pod="openstack/rabbitmq-server-0" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.694678 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c87001f3-a098-449a-b8ec-cccb2a313d5f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c87001f3-a098-449a-b8ec-cccb2a313d5f\") " pod="openstack/rabbitmq-server-0" 
Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.694706 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c87001f3-a098-449a-b8ec-cccb2a313d5f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c87001f3-a098-449a-b8ec-cccb2a313d5f\") " pod="openstack/rabbitmq-server-0" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.694746 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"c87001f3-a098-449a-b8ec-cccb2a313d5f\") " pod="openstack/rabbitmq-server-0" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.694767 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6ldv\" (UniqueName: \"kubernetes.io/projected/c87001f3-a098-449a-b8ec-cccb2a313d5f-kube-api-access-z6ldv\") pod \"rabbitmq-server-0\" (UID: \"c87001f3-a098-449a-b8ec-cccb2a313d5f\") " pod="openstack/rabbitmq-server-0" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.694792 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c87001f3-a098-449a-b8ec-cccb2a313d5f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c87001f3-a098-449a-b8ec-cccb2a313d5f\") " pod="openstack/rabbitmq-server-0" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.694830 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c87001f3-a098-449a-b8ec-cccb2a313d5f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c87001f3-a098-449a-b8ec-cccb2a313d5f\") " pod="openstack/rabbitmq-server-0" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.796312 4715 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c87001f3-a098-449a-b8ec-cccb2a313d5f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c87001f3-a098-449a-b8ec-cccb2a313d5f\") " pod="openstack/rabbitmq-server-0" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.796354 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c87001f3-a098-449a-b8ec-cccb2a313d5f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c87001f3-a098-449a-b8ec-cccb2a313d5f\") " pod="openstack/rabbitmq-server-0" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.796379 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c87001f3-a098-449a-b8ec-cccb2a313d5f-config-data\") pod \"rabbitmq-server-0\" (UID: \"c87001f3-a098-449a-b8ec-cccb2a313d5f\") " pod="openstack/rabbitmq-server-0" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.796408 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c87001f3-a098-449a-b8ec-cccb2a313d5f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c87001f3-a098-449a-b8ec-cccb2a313d5f\") " pod="openstack/rabbitmq-server-0" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.796445 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c87001f3-a098-449a-b8ec-cccb2a313d5f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c87001f3-a098-449a-b8ec-cccb2a313d5f\") " pod="openstack/rabbitmq-server-0" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.796488 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod 
\"rabbitmq-server-0\" (UID: \"c87001f3-a098-449a-b8ec-cccb2a313d5f\") " pod="openstack/rabbitmq-server-0" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.796513 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6ldv\" (UniqueName: \"kubernetes.io/projected/c87001f3-a098-449a-b8ec-cccb2a313d5f-kube-api-access-z6ldv\") pod \"rabbitmq-server-0\" (UID: \"c87001f3-a098-449a-b8ec-cccb2a313d5f\") " pod="openstack/rabbitmq-server-0" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.796531 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c87001f3-a098-449a-b8ec-cccb2a313d5f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c87001f3-a098-449a-b8ec-cccb2a313d5f\") " pod="openstack/rabbitmq-server-0" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.796565 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c87001f3-a098-449a-b8ec-cccb2a313d5f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c87001f3-a098-449a-b8ec-cccb2a313d5f\") " pod="openstack/rabbitmq-server-0" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.796599 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c87001f3-a098-449a-b8ec-cccb2a313d5f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c87001f3-a098-449a-b8ec-cccb2a313d5f\") " pod="openstack/rabbitmq-server-0" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.796627 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c87001f3-a098-449a-b8ec-cccb2a313d5f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c87001f3-a098-449a-b8ec-cccb2a313d5f\") " pod="openstack/rabbitmq-server-0" Oct 09 08:07:11 crc 
kubenswrapper[4715]: I1009 08:07:11.805515 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c87001f3-a098-449a-b8ec-cccb2a313d5f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c87001f3-a098-449a-b8ec-cccb2a313d5f\") " pod="openstack/rabbitmq-server-0" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.811528 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c87001f3-a098-449a-b8ec-cccb2a313d5f-config-data\") pod \"rabbitmq-server-0\" (UID: \"c87001f3-a098-449a-b8ec-cccb2a313d5f\") " pod="openstack/rabbitmq-server-0" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.811752 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c87001f3-a098-449a-b8ec-cccb2a313d5f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c87001f3-a098-449a-b8ec-cccb2a313d5f\") " pod="openstack/rabbitmq-server-0" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.812026 4715 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"c87001f3-a098-449a-b8ec-cccb2a313d5f\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-server-0" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.812439 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c87001f3-a098-449a-b8ec-cccb2a313d5f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c87001f3-a098-449a-b8ec-cccb2a313d5f\") " pod="openstack/rabbitmq-server-0" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.812719 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/c87001f3-a098-449a-b8ec-cccb2a313d5f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c87001f3-a098-449a-b8ec-cccb2a313d5f\") " pod="openstack/rabbitmq-server-0" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.814606 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c87001f3-a098-449a-b8ec-cccb2a313d5f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c87001f3-a098-449a-b8ec-cccb2a313d5f\") " pod="openstack/rabbitmq-server-0" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.815254 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c87001f3-a098-449a-b8ec-cccb2a313d5f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c87001f3-a098-449a-b8ec-cccb2a313d5f\") " pod="openstack/rabbitmq-server-0" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.820014 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c87001f3-a098-449a-b8ec-cccb2a313d5f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c87001f3-a098-449a-b8ec-cccb2a313d5f\") " pod="openstack/rabbitmq-server-0" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.843182 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.844665 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c87001f3-a098-449a-b8ec-cccb2a313d5f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c87001f3-a098-449a-b8ec-cccb2a313d5f\") " pod="openstack/rabbitmq-server-0" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.850999 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6ldv\" (UniqueName: 
\"kubernetes.io/projected/c87001f3-a098-449a-b8ec-cccb2a313d5f-kube-api-access-z6ldv\") pod \"rabbitmq-server-0\" (UID: \"c87001f3-a098-449a-b8ec-cccb2a313d5f\") " pod="openstack/rabbitmq-server-0" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.878877 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"c87001f3-a098-449a-b8ec-cccb2a313d5f\") " pod="openstack/rabbitmq-server-0" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.889164 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.898516 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.900773 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.901184 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.904618 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.905030 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.905192 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.905306 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-9x8lg" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.905440 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.905479 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.914480 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 09 08:07:11 crc kubenswrapper[4715]: I1009 08:07:11.925768 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.000002 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a2bc3ad0-34e4-4ccc-9abd-7e998940780c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2bc3ad0-34e4-4ccc-9abd-7e998940780c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.000114 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a2bc3ad0-34e4-4ccc-9abd-7e998940780c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2bc3ad0-34e4-4ccc-9abd-7e998940780c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.000144 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a2bc3ad0-34e4-4ccc-9abd-7e998940780c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2bc3ad0-34e4-4ccc-9abd-7e998940780c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.000168 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jm2g\" (UniqueName: \"kubernetes.io/projected/a2bc3ad0-34e4-4ccc-9abd-7e998940780c-kube-api-access-2jm2g\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2bc3ad0-34e4-4ccc-9abd-7e998940780c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.000218 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a2bc3ad0-34e4-4ccc-9abd-7e998940780c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2bc3ad0-34e4-4ccc-9abd-7e998940780c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.000242 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a2bc3ad0-34e4-4ccc-9abd-7e998940780c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2bc3ad0-34e4-4ccc-9abd-7e998940780c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.000311 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2bc3ad0-34e4-4ccc-9abd-7e998940780c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.000338 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a2bc3ad0-34e4-4ccc-9abd-7e998940780c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2bc3ad0-34e4-4ccc-9abd-7e998940780c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.000382 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a2bc3ad0-34e4-4ccc-9abd-7e998940780c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2bc3ad0-34e4-4ccc-9abd-7e998940780c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.000458 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a2bc3ad0-34e4-4ccc-9abd-7e998940780c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2bc3ad0-34e4-4ccc-9abd-7e998940780c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.000489 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a2bc3ad0-34e4-4ccc-9abd-7e998940780c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2bc3ad0-34e4-4ccc-9abd-7e998940780c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.102538 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" 
(UniqueName: \"kubernetes.io/secret/a2bc3ad0-34e4-4ccc-9abd-7e998940780c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2bc3ad0-34e4-4ccc-9abd-7e998940780c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.102942 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a2bc3ad0-34e4-4ccc-9abd-7e998940780c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2bc3ad0-34e4-4ccc-9abd-7e998940780c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.102964 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a2bc3ad0-34e4-4ccc-9abd-7e998940780c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2bc3ad0-34e4-4ccc-9abd-7e998940780c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.102987 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jm2g\" (UniqueName: \"kubernetes.io/projected/a2bc3ad0-34e4-4ccc-9abd-7e998940780c-kube-api-access-2jm2g\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2bc3ad0-34e4-4ccc-9abd-7e998940780c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.103035 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a2bc3ad0-34e4-4ccc-9abd-7e998940780c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2bc3ad0-34e4-4ccc-9abd-7e998940780c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.103059 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a2bc3ad0-34e4-4ccc-9abd-7e998940780c-rabbitmq-tls\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"a2bc3ad0-34e4-4ccc-9abd-7e998940780c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.103139 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2bc3ad0-34e4-4ccc-9abd-7e998940780c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.103167 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a2bc3ad0-34e4-4ccc-9abd-7e998940780c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2bc3ad0-34e4-4ccc-9abd-7e998940780c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.103224 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a2bc3ad0-34e4-4ccc-9abd-7e998940780c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2bc3ad0-34e4-4ccc-9abd-7e998940780c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.103289 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a2bc3ad0-34e4-4ccc-9abd-7e998940780c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2bc3ad0-34e4-4ccc-9abd-7e998940780c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.103328 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a2bc3ad0-34e4-4ccc-9abd-7e998940780c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2bc3ad0-34e4-4ccc-9abd-7e998940780c\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.103785 4715 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2bc3ad0-34e4-4ccc-9abd-7e998940780c\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.104107 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a2bc3ad0-34e4-4ccc-9abd-7e998940780c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2bc3ad0-34e4-4ccc-9abd-7e998940780c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.104966 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a2bc3ad0-34e4-4ccc-9abd-7e998940780c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2bc3ad0-34e4-4ccc-9abd-7e998940780c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.105039 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a2bc3ad0-34e4-4ccc-9abd-7e998940780c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2bc3ad0-34e4-4ccc-9abd-7e998940780c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.105277 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a2bc3ad0-34e4-4ccc-9abd-7e998940780c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2bc3ad0-34e4-4ccc-9abd-7e998940780c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.105796 4715 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a2bc3ad0-34e4-4ccc-9abd-7e998940780c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2bc3ad0-34e4-4ccc-9abd-7e998940780c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.107276 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a2bc3ad0-34e4-4ccc-9abd-7e998940780c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2bc3ad0-34e4-4ccc-9abd-7e998940780c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.107844 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a2bc3ad0-34e4-4ccc-9abd-7e998940780c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2bc3ad0-34e4-4ccc-9abd-7e998940780c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.108041 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a2bc3ad0-34e4-4ccc-9abd-7e998940780c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2bc3ad0-34e4-4ccc-9abd-7e998940780c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.121241 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a2bc3ad0-34e4-4ccc-9abd-7e998940780c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2bc3ad0-34e4-4ccc-9abd-7e998940780c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.132187 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jm2g\" (UniqueName: 
\"kubernetes.io/projected/a2bc3ad0-34e4-4ccc-9abd-7e998940780c-kube-api-access-2jm2g\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2bc3ad0-34e4-4ccc-9abd-7e998940780c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.139348 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2bc3ad0-34e4-4ccc-9abd-7e998940780c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.152657 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1673772c-a772-4ad8-85c3-f68268965d4b" path="/var/lib/kubelet/pods/1673772c-a772-4ad8-85c3-f68268965d4b/volumes" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.153399 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4714af0-14ef-4513-ac5e-dbf4aa99079b" path="/var/lib/kubelet/pods/a4714af0-14ef-4513-ac5e-dbf4aa99079b/volumes" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.329290 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.390551 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 09 08:07:12 crc kubenswrapper[4715]: W1009 08:07:12.396224 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc87001f3_a098_449a_b8ec_cccb2a313d5f.slice/crio-995ee7bbf78dc78facebb661d0cdd156315c60e04dc9c9bed780cf8713ad9424 WatchSource:0}: Error finding container 995ee7bbf78dc78facebb661d0cdd156315c60e04dc9c9bed780cf8713ad9424: Status 404 returned error can't find the container with id 995ee7bbf78dc78facebb661d0cdd156315c60e04dc9c9bed780cf8713ad9424 Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.493876 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c87001f3-a098-449a-b8ec-cccb2a313d5f","Type":"ContainerStarted","Data":"995ee7bbf78dc78facebb661d0cdd156315c60e04dc9c9bed780cf8713ad9424"} Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.780631 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-ffddk"] Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.782100 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-ffddk" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.788384 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.798504 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-ffddk"] Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.819177 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/88b06296-0f80-4890-bae8-ff609dbbc6be-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-ffddk\" (UID: \"88b06296-0f80-4890-bae8-ff609dbbc6be\") " pod="openstack/dnsmasq-dns-67b789f86c-ffddk" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.819239 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88b06296-0f80-4890-bae8-ff609dbbc6be-config\") pod \"dnsmasq-dns-67b789f86c-ffddk\" (UID: \"88b06296-0f80-4890-bae8-ff609dbbc6be\") " pod="openstack/dnsmasq-dns-67b789f86c-ffddk" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.819268 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/88b06296-0f80-4890-bae8-ff609dbbc6be-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-ffddk\" (UID: \"88b06296-0f80-4890-bae8-ff609dbbc6be\") " pod="openstack/dnsmasq-dns-67b789f86c-ffddk" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.819453 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/88b06296-0f80-4890-bae8-ff609dbbc6be-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-ffddk\" (UID: \"88b06296-0f80-4890-bae8-ff609dbbc6be\") " 
pod="openstack/dnsmasq-dns-67b789f86c-ffddk" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.819635 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/88b06296-0f80-4890-bae8-ff609dbbc6be-dns-svc\") pod \"dnsmasq-dns-67b789f86c-ffddk\" (UID: \"88b06296-0f80-4890-bae8-ff609dbbc6be\") " pod="openstack/dnsmasq-dns-67b789f86c-ffddk" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.819823 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/88b06296-0f80-4890-bae8-ff609dbbc6be-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-ffddk\" (UID: \"88b06296-0f80-4890-bae8-ff609dbbc6be\") " pod="openstack/dnsmasq-dns-67b789f86c-ffddk" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.820038 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-282b8\" (UniqueName: \"kubernetes.io/projected/88b06296-0f80-4890-bae8-ff609dbbc6be-kube-api-access-282b8\") pod \"dnsmasq-dns-67b789f86c-ffddk\" (UID: \"88b06296-0f80-4890-bae8-ff609dbbc6be\") " pod="openstack/dnsmasq-dns-67b789f86c-ffddk" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.853091 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.921035 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-282b8\" (UniqueName: \"kubernetes.io/projected/88b06296-0f80-4890-bae8-ff609dbbc6be-kube-api-access-282b8\") pod \"dnsmasq-dns-67b789f86c-ffddk\" (UID: \"88b06296-0f80-4890-bae8-ff609dbbc6be\") " pod="openstack/dnsmasq-dns-67b789f86c-ffddk" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.921300 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/88b06296-0f80-4890-bae8-ff609dbbc6be-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-ffddk\" (UID: \"88b06296-0f80-4890-bae8-ff609dbbc6be\") " pod="openstack/dnsmasq-dns-67b789f86c-ffddk" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.921331 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88b06296-0f80-4890-bae8-ff609dbbc6be-config\") pod \"dnsmasq-dns-67b789f86c-ffddk\" (UID: \"88b06296-0f80-4890-bae8-ff609dbbc6be\") " pod="openstack/dnsmasq-dns-67b789f86c-ffddk" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.921353 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/88b06296-0f80-4890-bae8-ff609dbbc6be-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-ffddk\" (UID: \"88b06296-0f80-4890-bae8-ff609dbbc6be\") " pod="openstack/dnsmasq-dns-67b789f86c-ffddk" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.921386 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/88b06296-0f80-4890-bae8-ff609dbbc6be-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-ffddk\" (UID: \"88b06296-0f80-4890-bae8-ff609dbbc6be\") " pod="openstack/dnsmasq-dns-67b789f86c-ffddk" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.921435 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/88b06296-0f80-4890-bae8-ff609dbbc6be-dns-svc\") pod \"dnsmasq-dns-67b789f86c-ffddk\" (UID: \"88b06296-0f80-4890-bae8-ff609dbbc6be\") " pod="openstack/dnsmasq-dns-67b789f86c-ffddk" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.921464 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/88b06296-0f80-4890-bae8-ff609dbbc6be-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-ffddk\" (UID: \"88b06296-0f80-4890-bae8-ff609dbbc6be\") " pod="openstack/dnsmasq-dns-67b789f86c-ffddk" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.922321 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/88b06296-0f80-4890-bae8-ff609dbbc6be-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-ffddk\" (UID: \"88b06296-0f80-4890-bae8-ff609dbbc6be\") " pod="openstack/dnsmasq-dns-67b789f86c-ffddk" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.922456 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/88b06296-0f80-4890-bae8-ff609dbbc6be-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-ffddk\" (UID: \"88b06296-0f80-4890-bae8-ff609dbbc6be\") " pod="openstack/dnsmasq-dns-67b789f86c-ffddk" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.922491 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/88b06296-0f80-4890-bae8-ff609dbbc6be-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-ffddk\" (UID: \"88b06296-0f80-4890-bae8-ff609dbbc6be\") " pod="openstack/dnsmasq-dns-67b789f86c-ffddk" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.922620 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88b06296-0f80-4890-bae8-ff609dbbc6be-config\") pod \"dnsmasq-dns-67b789f86c-ffddk\" (UID: \"88b06296-0f80-4890-bae8-ff609dbbc6be\") " pod="openstack/dnsmasq-dns-67b789f86c-ffddk" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.922638 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/88b06296-0f80-4890-bae8-ff609dbbc6be-ovsdbserver-nb\") pod 
\"dnsmasq-dns-67b789f86c-ffddk\" (UID: \"88b06296-0f80-4890-bae8-ff609dbbc6be\") " pod="openstack/dnsmasq-dns-67b789f86c-ffddk" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.923021 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/88b06296-0f80-4890-bae8-ff609dbbc6be-dns-svc\") pod \"dnsmasq-dns-67b789f86c-ffddk\" (UID: \"88b06296-0f80-4890-bae8-ff609dbbc6be\") " pod="openstack/dnsmasq-dns-67b789f86c-ffddk" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.928333 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-ffddk"] Oct 09 08:07:12 crc kubenswrapper[4715]: E1009 08:07:12.929196 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-282b8], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-67b789f86c-ffddk" podUID="88b06296-0f80-4890-bae8-ff609dbbc6be" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.947393 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-282b8\" (UniqueName: \"kubernetes.io/projected/88b06296-0f80-4890-bae8-ff609dbbc6be-kube-api-access-282b8\") pod \"dnsmasq-dns-67b789f86c-ffddk\" (UID: \"88b06296-0f80-4890-bae8-ff609dbbc6be\") " pod="openstack/dnsmasq-dns-67b789f86c-ffddk" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.960587 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-6clgv"] Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.962169 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cb6ffcf87-6clgv" Oct 09 08:07:12 crc kubenswrapper[4715]: I1009 08:07:12.986460 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-6clgv"] Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.022929 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d37f947-6e34-45b8-96a5-a18465d3f3fd-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-6clgv\" (UID: \"9d37f947-6e34-45b8-96a5-a18465d3f3fd\") " pod="openstack/dnsmasq-dns-cb6ffcf87-6clgv" Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.022976 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d37f947-6e34-45b8-96a5-a18465d3f3fd-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-6clgv\" (UID: \"9d37f947-6e34-45b8-96a5-a18465d3f3fd\") " pod="openstack/dnsmasq-dns-cb6ffcf87-6clgv" Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.023022 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d37f947-6e34-45b8-96a5-a18465d3f3fd-config\") pod \"dnsmasq-dns-cb6ffcf87-6clgv\" (UID: \"9d37f947-6e34-45b8-96a5-a18465d3f3fd\") " pod="openstack/dnsmasq-dns-cb6ffcf87-6clgv" Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.023054 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d37f947-6e34-45b8-96a5-a18465d3f3fd-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-6clgv\" (UID: \"9d37f947-6e34-45b8-96a5-a18465d3f3fd\") " pod="openstack/dnsmasq-dns-cb6ffcf87-6clgv" Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.023115 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9d37f947-6e34-45b8-96a5-a18465d3f3fd-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-6clgv\" (UID: \"9d37f947-6e34-45b8-96a5-a18465d3f3fd\") " pod="openstack/dnsmasq-dns-cb6ffcf87-6clgv" Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.023155 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjhk2\" (UniqueName: \"kubernetes.io/projected/9d37f947-6e34-45b8-96a5-a18465d3f3fd-kube-api-access-zjhk2\") pod \"dnsmasq-dns-cb6ffcf87-6clgv\" (UID: \"9d37f947-6e34-45b8-96a5-a18465d3f3fd\") " pod="openstack/dnsmasq-dns-cb6ffcf87-6clgv" Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.023186 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d37f947-6e34-45b8-96a5-a18465d3f3fd-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-6clgv\" (UID: \"9d37f947-6e34-45b8-96a5-a18465d3f3fd\") " pod="openstack/dnsmasq-dns-cb6ffcf87-6clgv" Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.125127 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d37f947-6e34-45b8-96a5-a18465d3f3fd-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-6clgv\" (UID: \"9d37f947-6e34-45b8-96a5-a18465d3f3fd\") " pod="openstack/dnsmasq-dns-cb6ffcf87-6clgv" Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.125175 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d37f947-6e34-45b8-96a5-a18465d3f3fd-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-6clgv\" (UID: \"9d37f947-6e34-45b8-96a5-a18465d3f3fd\") " pod="openstack/dnsmasq-dns-cb6ffcf87-6clgv" Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.125216 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/9d37f947-6e34-45b8-96a5-a18465d3f3fd-config\") pod \"dnsmasq-dns-cb6ffcf87-6clgv\" (UID: \"9d37f947-6e34-45b8-96a5-a18465d3f3fd\") " pod="openstack/dnsmasq-dns-cb6ffcf87-6clgv" Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.125279 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d37f947-6e34-45b8-96a5-a18465d3f3fd-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-6clgv\" (UID: \"9d37f947-6e34-45b8-96a5-a18465d3f3fd\") " pod="openstack/dnsmasq-dns-cb6ffcf87-6clgv" Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.126247 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d37f947-6e34-45b8-96a5-a18465d3f3fd-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-6clgv\" (UID: \"9d37f947-6e34-45b8-96a5-a18465d3f3fd\") " pod="openstack/dnsmasq-dns-cb6ffcf87-6clgv" Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.126253 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d37f947-6e34-45b8-96a5-a18465d3f3fd-config\") pod \"dnsmasq-dns-cb6ffcf87-6clgv\" (UID: \"9d37f947-6e34-45b8-96a5-a18465d3f3fd\") " pod="openstack/dnsmasq-dns-cb6ffcf87-6clgv" Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.126314 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9d37f947-6e34-45b8-96a5-a18465d3f3fd-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-6clgv\" (UID: \"9d37f947-6e34-45b8-96a5-a18465d3f3fd\") " pod="openstack/dnsmasq-dns-cb6ffcf87-6clgv" Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.126597 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d37f947-6e34-45b8-96a5-a18465d3f3fd-ovsdbserver-sb\") pod 
\"dnsmasq-dns-cb6ffcf87-6clgv\" (UID: \"9d37f947-6e34-45b8-96a5-a18465d3f3fd\") " pod="openstack/dnsmasq-dns-cb6ffcf87-6clgv" Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.126647 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d37f947-6e34-45b8-96a5-a18465d3f3fd-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-6clgv\" (UID: \"9d37f947-6e34-45b8-96a5-a18465d3f3fd\") " pod="openstack/dnsmasq-dns-cb6ffcf87-6clgv" Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.127045 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9d37f947-6e34-45b8-96a5-a18465d3f3fd-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-6clgv\" (UID: \"9d37f947-6e34-45b8-96a5-a18465d3f3fd\") " pod="openstack/dnsmasq-dns-cb6ffcf87-6clgv" Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.127154 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjhk2\" (UniqueName: \"kubernetes.io/projected/9d37f947-6e34-45b8-96a5-a18465d3f3fd-kube-api-access-zjhk2\") pod \"dnsmasq-dns-cb6ffcf87-6clgv\" (UID: \"9d37f947-6e34-45b8-96a5-a18465d3f3fd\") " pod="openstack/dnsmasq-dns-cb6ffcf87-6clgv" Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.127204 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d37f947-6e34-45b8-96a5-a18465d3f3fd-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-6clgv\" (UID: \"9d37f947-6e34-45b8-96a5-a18465d3f3fd\") " pod="openstack/dnsmasq-dns-cb6ffcf87-6clgv" Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.127952 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d37f947-6e34-45b8-96a5-a18465d3f3fd-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-6clgv\" (UID: 
\"9d37f947-6e34-45b8-96a5-a18465d3f3fd\") " pod="openstack/dnsmasq-dns-cb6ffcf87-6clgv" Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.144122 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjhk2\" (UniqueName: \"kubernetes.io/projected/9d37f947-6e34-45b8-96a5-a18465d3f3fd-kube-api-access-zjhk2\") pod \"dnsmasq-dns-cb6ffcf87-6clgv\" (UID: \"9d37f947-6e34-45b8-96a5-a18465d3f3fd\") " pod="openstack/dnsmasq-dns-cb6ffcf87-6clgv" Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.371791 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb6ffcf87-6clgv" Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.508105 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a2bc3ad0-34e4-4ccc-9abd-7e998940780c","Type":"ContainerStarted","Data":"7cad8bf6b9309312455ad120dfb0403bc43c69acf28eafffd59895cb3fe2029c"} Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.508148 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-ffddk" Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.844122 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-ffddk" Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.946328 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-282b8\" (UniqueName: \"kubernetes.io/projected/88b06296-0f80-4890-bae8-ff609dbbc6be-kube-api-access-282b8\") pod \"88b06296-0f80-4890-bae8-ff609dbbc6be\" (UID: \"88b06296-0f80-4890-bae8-ff609dbbc6be\") " Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.946469 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/88b06296-0f80-4890-bae8-ff609dbbc6be-ovsdbserver-sb\") pod \"88b06296-0f80-4890-bae8-ff609dbbc6be\" (UID: \"88b06296-0f80-4890-bae8-ff609dbbc6be\") " Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.946501 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/88b06296-0f80-4890-bae8-ff609dbbc6be-dns-swift-storage-0\") pod \"88b06296-0f80-4890-bae8-ff609dbbc6be\" (UID: \"88b06296-0f80-4890-bae8-ff609dbbc6be\") " Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.946552 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/88b06296-0f80-4890-bae8-ff609dbbc6be-dns-svc\") pod \"88b06296-0f80-4890-bae8-ff609dbbc6be\" (UID: \"88b06296-0f80-4890-bae8-ff609dbbc6be\") " Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.946601 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88b06296-0f80-4890-bae8-ff609dbbc6be-config\") pod \"88b06296-0f80-4890-bae8-ff609dbbc6be\" (UID: \"88b06296-0f80-4890-bae8-ff609dbbc6be\") " Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.947009 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/88b06296-0f80-4890-bae8-ff609dbbc6be-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "88b06296-0f80-4890-bae8-ff609dbbc6be" (UID: "88b06296-0f80-4890-bae8-ff609dbbc6be"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.947028 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88b06296-0f80-4890-bae8-ff609dbbc6be-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "88b06296-0f80-4890-bae8-ff609dbbc6be" (UID: "88b06296-0f80-4890-bae8-ff609dbbc6be"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.947019 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88b06296-0f80-4890-bae8-ff609dbbc6be-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "88b06296-0f80-4890-bae8-ff609dbbc6be" (UID: "88b06296-0f80-4890-bae8-ff609dbbc6be"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.947075 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/88b06296-0f80-4890-bae8-ff609dbbc6be-ovsdbserver-nb\") pod \"88b06296-0f80-4890-bae8-ff609dbbc6be\" (UID: \"88b06296-0f80-4890-bae8-ff609dbbc6be\") " Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.947238 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/88b06296-0f80-4890-bae8-ff609dbbc6be-openstack-edpm-ipam\") pod \"88b06296-0f80-4890-bae8-ff609dbbc6be\" (UID: \"88b06296-0f80-4890-bae8-ff609dbbc6be\") " Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.947590 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88b06296-0f80-4890-bae8-ff609dbbc6be-config" (OuterVolumeSpecName: "config") pod "88b06296-0f80-4890-bae8-ff609dbbc6be" (UID: "88b06296-0f80-4890-bae8-ff609dbbc6be"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.947668 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88b06296-0f80-4890-bae8-ff609dbbc6be-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "88b06296-0f80-4890-bae8-ff609dbbc6be" (UID: "88b06296-0f80-4890-bae8-ff609dbbc6be"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.947758 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88b06296-0f80-4890-bae8-ff609dbbc6be-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "88b06296-0f80-4890-bae8-ff609dbbc6be" (UID: "88b06296-0f80-4890-bae8-ff609dbbc6be"). 
InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.948189 4715 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/88b06296-0f80-4890-bae8-ff609dbbc6be-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.948206 4715 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/88b06296-0f80-4890-bae8-ff609dbbc6be-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.948215 4715 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/88b06296-0f80-4890-bae8-ff609dbbc6be-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.948225 4715 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/88b06296-0f80-4890-bae8-ff609dbbc6be-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.948234 4715 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/88b06296-0f80-4890-bae8-ff609dbbc6be-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.948243 4715 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88b06296-0f80-4890-bae8-ff609dbbc6be-config\") on node \"crc\" DevicePath \"\"" Oct 09 08:07:13 crc kubenswrapper[4715]: I1009 08:07:13.951975 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88b06296-0f80-4890-bae8-ff609dbbc6be-kube-api-access-282b8" (OuterVolumeSpecName: "kube-api-access-282b8") pod 
"88b06296-0f80-4890-bae8-ff609dbbc6be" (UID: "88b06296-0f80-4890-bae8-ff609dbbc6be"). InnerVolumeSpecName "kube-api-access-282b8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:07:14 crc kubenswrapper[4715]: I1009 08:07:14.032266 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-6clgv"] Oct 09 08:07:14 crc kubenswrapper[4715]: W1009 08:07:14.035978 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d37f947_6e34_45b8_96a5_a18465d3f3fd.slice/crio-26df1e2a9a619e26a3b9120152fe71efc3c6870209748eb34d7cc295e8605970 WatchSource:0}: Error finding container 26df1e2a9a619e26a3b9120152fe71efc3c6870209748eb34d7cc295e8605970: Status 404 returned error can't find the container with id 26df1e2a9a619e26a3b9120152fe71efc3c6870209748eb34d7cc295e8605970 Oct 09 08:07:14 crc kubenswrapper[4715]: I1009 08:07:14.049631 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-282b8\" (UniqueName: \"kubernetes.io/projected/88b06296-0f80-4890-bae8-ff609dbbc6be-kube-api-access-282b8\") on node \"crc\" DevicePath \"\"" Oct 09 08:07:14 crc kubenswrapper[4715]: I1009 08:07:14.523118 4715 generic.go:334] "Generic (PLEG): container finished" podID="9d37f947-6e34-45b8-96a5-a18465d3f3fd" containerID="c5ced9e72af0574cbada097b0eec45c10bc48628f1deb0ce0d5e7fba68a7e159" exitCode=0 Oct 09 08:07:14 crc kubenswrapper[4715]: I1009 08:07:14.523298 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-6clgv" event={"ID":"9d37f947-6e34-45b8-96a5-a18465d3f3fd","Type":"ContainerDied","Data":"c5ced9e72af0574cbada097b0eec45c10bc48628f1deb0ce0d5e7fba68a7e159"} Oct 09 08:07:14 crc kubenswrapper[4715]: I1009 08:07:14.523630 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-6clgv" 
event={"ID":"9d37f947-6e34-45b8-96a5-a18465d3f3fd","Type":"ContainerStarted","Data":"26df1e2a9a619e26a3b9120152fe71efc3c6870209748eb34d7cc295e8605970"} Oct 09 08:07:14 crc kubenswrapper[4715]: I1009 08:07:14.525809 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-ffddk" Oct 09 08:07:14 crc kubenswrapper[4715]: I1009 08:07:14.526724 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c87001f3-a098-449a-b8ec-cccb2a313d5f","Type":"ContainerStarted","Data":"afbbd5bab5df112380ceb335943cb58777df2cd387766ddc6422d7144cbfaeef"} Oct 09 08:07:14 crc kubenswrapper[4715]: I1009 08:07:14.644678 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-ffddk"] Oct 09 08:07:14 crc kubenswrapper[4715]: I1009 08:07:14.655284 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-ffddk"] Oct 09 08:07:15 crc kubenswrapper[4715]: I1009 08:07:15.539045 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-6clgv" event={"ID":"9d37f947-6e34-45b8-96a5-a18465d3f3fd","Type":"ContainerStarted","Data":"4d04b1b16e9cf849964238509e97d7466956df7a5c912d69f24e193267d05d02"} Oct 09 08:07:15 crc kubenswrapper[4715]: I1009 08:07:15.539223 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cb6ffcf87-6clgv" Oct 09 08:07:15 crc kubenswrapper[4715]: I1009 08:07:15.542352 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a2bc3ad0-34e4-4ccc-9abd-7e998940780c","Type":"ContainerStarted","Data":"5ad32f351c006f9cf024eb67e2c83c0b79a11d3c7974f856cf621da2c2b592e8"} Oct 09 08:07:15 crc kubenswrapper[4715]: I1009 08:07:15.572072 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cb6ffcf87-6clgv" podStartSLOduration=3.572047299 
podStartE2EDuration="3.572047299s" podCreationTimestamp="2025-10-09 08:07:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:07:15.566125952 +0000 UTC m=+1266.258929990" watchObservedRunningTime="2025-10-09 08:07:15.572047299 +0000 UTC m=+1266.264851337" Oct 09 08:07:16 crc kubenswrapper[4715]: I1009 08:07:16.146994 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88b06296-0f80-4890-bae8-ff609dbbc6be" path="/var/lib/kubelet/pods/88b06296-0f80-4890-bae8-ff609dbbc6be/volumes" Oct 09 08:07:23 crc kubenswrapper[4715]: I1009 08:07:23.373671 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cb6ffcf87-6clgv" Oct 09 08:07:23 crc kubenswrapper[4715]: I1009 08:07:23.456052 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-gf67l"] Oct 09 08:07:23 crc kubenswrapper[4715]: I1009 08:07:23.456281 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59cf4bdb65-gf67l" podUID="07d5a57f-6ce3-4572-849c-baebf00831f1" containerName="dnsmasq-dns" containerID="cri-o://84a8c94e7032c9daa84e7b4f56bed17e716c809e6e95ce1c341e0de3785cfad5" gracePeriod=10 Oct 09 08:07:23 crc kubenswrapper[4715]: I1009 08:07:23.624878 4715 generic.go:334] "Generic (PLEG): container finished" podID="07d5a57f-6ce3-4572-849c-baebf00831f1" containerID="84a8c94e7032c9daa84e7b4f56bed17e716c809e6e95ce1c341e0de3785cfad5" exitCode=0 Oct 09 08:07:23 crc kubenswrapper[4715]: I1009 08:07:23.624918 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-gf67l" event={"ID":"07d5a57f-6ce3-4572-849c-baebf00831f1","Type":"ContainerDied","Data":"84a8c94e7032c9daa84e7b4f56bed17e716c809e6e95ce1c341e0de3785cfad5"} Oct 09 08:07:23 crc kubenswrapper[4715]: I1009 08:07:23.924862 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-gf67l" Oct 09 08:07:24 crc kubenswrapper[4715]: I1009 08:07:24.046010 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07d5a57f-6ce3-4572-849c-baebf00831f1-ovsdbserver-sb\") pod \"07d5a57f-6ce3-4572-849c-baebf00831f1\" (UID: \"07d5a57f-6ce3-4572-849c-baebf00831f1\") " Oct 09 08:07:24 crc kubenswrapper[4715]: I1009 08:07:24.046054 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07d5a57f-6ce3-4572-849c-baebf00831f1-config\") pod \"07d5a57f-6ce3-4572-849c-baebf00831f1\" (UID: \"07d5a57f-6ce3-4572-849c-baebf00831f1\") " Oct 09 08:07:24 crc kubenswrapper[4715]: I1009 08:07:24.046097 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07d5a57f-6ce3-4572-849c-baebf00831f1-dns-svc\") pod \"07d5a57f-6ce3-4572-849c-baebf00831f1\" (UID: \"07d5a57f-6ce3-4572-849c-baebf00831f1\") " Oct 09 08:07:24 crc kubenswrapper[4715]: I1009 08:07:24.046176 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trb5t\" (UniqueName: \"kubernetes.io/projected/07d5a57f-6ce3-4572-849c-baebf00831f1-kube-api-access-trb5t\") pod \"07d5a57f-6ce3-4572-849c-baebf00831f1\" (UID: \"07d5a57f-6ce3-4572-849c-baebf00831f1\") " Oct 09 08:07:24 crc kubenswrapper[4715]: I1009 08:07:24.046203 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/07d5a57f-6ce3-4572-849c-baebf00831f1-dns-swift-storage-0\") pod \"07d5a57f-6ce3-4572-849c-baebf00831f1\" (UID: \"07d5a57f-6ce3-4572-849c-baebf00831f1\") " Oct 09 08:07:24 crc kubenswrapper[4715]: I1009 08:07:24.047016 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/07d5a57f-6ce3-4572-849c-baebf00831f1-ovsdbserver-nb\") pod \"07d5a57f-6ce3-4572-849c-baebf00831f1\" (UID: \"07d5a57f-6ce3-4572-849c-baebf00831f1\") " Oct 09 08:07:24 crc kubenswrapper[4715]: I1009 08:07:24.052660 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07d5a57f-6ce3-4572-849c-baebf00831f1-kube-api-access-trb5t" (OuterVolumeSpecName: "kube-api-access-trb5t") pod "07d5a57f-6ce3-4572-849c-baebf00831f1" (UID: "07d5a57f-6ce3-4572-849c-baebf00831f1"). InnerVolumeSpecName "kube-api-access-trb5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:07:24 crc kubenswrapper[4715]: I1009 08:07:24.097016 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07d5a57f-6ce3-4572-849c-baebf00831f1-config" (OuterVolumeSpecName: "config") pod "07d5a57f-6ce3-4572-849c-baebf00831f1" (UID: "07d5a57f-6ce3-4572-849c-baebf00831f1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:07:24 crc kubenswrapper[4715]: I1009 08:07:24.116478 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07d5a57f-6ce3-4572-849c-baebf00831f1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "07d5a57f-6ce3-4572-849c-baebf00831f1" (UID: "07d5a57f-6ce3-4572-849c-baebf00831f1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:07:24 crc kubenswrapper[4715]: I1009 08:07:24.118589 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07d5a57f-6ce3-4572-849c-baebf00831f1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "07d5a57f-6ce3-4572-849c-baebf00831f1" (UID: "07d5a57f-6ce3-4572-849c-baebf00831f1"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:07:24 crc kubenswrapper[4715]: I1009 08:07:24.122893 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07d5a57f-6ce3-4572-849c-baebf00831f1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "07d5a57f-6ce3-4572-849c-baebf00831f1" (UID: "07d5a57f-6ce3-4572-849c-baebf00831f1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:07:24 crc kubenswrapper[4715]: I1009 08:07:24.124267 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07d5a57f-6ce3-4572-849c-baebf00831f1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "07d5a57f-6ce3-4572-849c-baebf00831f1" (UID: "07d5a57f-6ce3-4572-849c-baebf00831f1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:07:24 crc kubenswrapper[4715]: I1009 08:07:24.148869 4715 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07d5a57f-6ce3-4572-849c-baebf00831f1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 09 08:07:24 crc kubenswrapper[4715]: I1009 08:07:24.148894 4715 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07d5a57f-6ce3-4572-849c-baebf00831f1-config\") on node \"crc\" DevicePath \"\"" Oct 09 08:07:24 crc kubenswrapper[4715]: I1009 08:07:24.148902 4715 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07d5a57f-6ce3-4572-849c-baebf00831f1-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 09 08:07:24 crc kubenswrapper[4715]: I1009 08:07:24.148911 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trb5t\" (UniqueName: \"kubernetes.io/projected/07d5a57f-6ce3-4572-849c-baebf00831f1-kube-api-access-trb5t\") on node \"crc\" DevicePath \"\"" Oct 09 08:07:24 crc 
kubenswrapper[4715]: I1009 08:07:24.148921 4715 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/07d5a57f-6ce3-4572-849c-baebf00831f1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 09 08:07:24 crc kubenswrapper[4715]: I1009 08:07:24.148930 4715 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07d5a57f-6ce3-4572-849c-baebf00831f1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 09 08:07:24 crc kubenswrapper[4715]: I1009 08:07:24.635526 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-gf67l" event={"ID":"07d5a57f-6ce3-4572-849c-baebf00831f1","Type":"ContainerDied","Data":"a11ff4ed1a429d435aafe8d26f57fdf83a212f402bcc7adebbb6f3df5bf53a32"} Oct 09 08:07:24 crc kubenswrapper[4715]: I1009 08:07:24.635831 4715 scope.go:117] "RemoveContainer" containerID="84a8c94e7032c9daa84e7b4f56bed17e716c809e6e95ce1c341e0de3785cfad5" Oct 09 08:07:24 crc kubenswrapper[4715]: I1009 08:07:24.635614 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-gf67l" Oct 09 08:07:24 crc kubenswrapper[4715]: I1009 08:07:24.660083 4715 scope.go:117] "RemoveContainer" containerID="8969294b93cd32dc814e9f8666314f7fad7090821c8773994867ebe225fb5b82" Oct 09 08:07:24 crc kubenswrapper[4715]: I1009 08:07:24.666753 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-gf67l"] Oct 09 08:07:24 crc kubenswrapper[4715]: I1009 08:07:24.675178 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-gf67l"] Oct 09 08:07:26 crc kubenswrapper[4715]: I1009 08:07:26.148129 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07d5a57f-6ce3-4572-849c-baebf00831f1" path="/var/lib/kubelet/pods/07d5a57f-6ce3-4572-849c-baebf00831f1/volumes" Oct 09 08:07:31 crc kubenswrapper[4715]: I1009 08:07:31.935016 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t6ccc"] Oct 09 08:07:31 crc kubenswrapper[4715]: E1009 08:07:31.936072 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07d5a57f-6ce3-4572-849c-baebf00831f1" containerName="init" Oct 09 08:07:31 crc kubenswrapper[4715]: I1009 08:07:31.936090 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="07d5a57f-6ce3-4572-849c-baebf00831f1" containerName="init" Oct 09 08:07:31 crc kubenswrapper[4715]: E1009 08:07:31.936116 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07d5a57f-6ce3-4572-849c-baebf00831f1" containerName="dnsmasq-dns" Oct 09 08:07:31 crc kubenswrapper[4715]: I1009 08:07:31.936124 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="07d5a57f-6ce3-4572-849c-baebf00831f1" containerName="dnsmasq-dns" Oct 09 08:07:31 crc kubenswrapper[4715]: I1009 08:07:31.936356 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="07d5a57f-6ce3-4572-849c-baebf00831f1" containerName="dnsmasq-dns" Oct 09 08:07:31 crc 
kubenswrapper[4715]: I1009 08:07:31.937054 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t6ccc" Oct 09 08:07:31 crc kubenswrapper[4715]: I1009 08:07:31.939900 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fjb" Oct 09 08:07:31 crc kubenswrapper[4715]: I1009 08:07:31.940114 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 08:07:31 crc kubenswrapper[4715]: I1009 08:07:31.940275 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 08:07:31 crc kubenswrapper[4715]: I1009 08:07:31.940687 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 08:07:31 crc kubenswrapper[4715]: I1009 08:07:31.944263 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t6ccc"] Oct 09 08:07:31 crc kubenswrapper[4715]: I1009 08:07:31.993982 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1058ab3e-4f39-48ac-9f7e-81e40f041264-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-t6ccc\" (UID: \"1058ab3e-4f39-48ac-9f7e-81e40f041264\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t6ccc" Oct 09 08:07:31 crc kubenswrapper[4715]: I1009 08:07:31.994034 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncvxr\" (UniqueName: \"kubernetes.io/projected/1058ab3e-4f39-48ac-9f7e-81e40f041264-kube-api-access-ncvxr\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-t6ccc\" (UID: \"1058ab3e-4f39-48ac-9f7e-81e40f041264\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t6ccc" Oct 09 08:07:31 crc kubenswrapper[4715]: I1009 08:07:31.994106 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1058ab3e-4f39-48ac-9f7e-81e40f041264-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-t6ccc\" (UID: \"1058ab3e-4f39-48ac-9f7e-81e40f041264\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t6ccc" Oct 09 08:07:31 crc kubenswrapper[4715]: I1009 08:07:31.994150 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1058ab3e-4f39-48ac-9f7e-81e40f041264-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-t6ccc\" (UID: \"1058ab3e-4f39-48ac-9f7e-81e40f041264\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t6ccc" Oct 09 08:07:32 crc kubenswrapper[4715]: I1009 08:07:32.096632 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1058ab3e-4f39-48ac-9f7e-81e40f041264-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-t6ccc\" (UID: \"1058ab3e-4f39-48ac-9f7e-81e40f041264\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t6ccc" Oct 09 08:07:32 crc kubenswrapper[4715]: I1009 08:07:32.096979 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1058ab3e-4f39-48ac-9f7e-81e40f041264-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-t6ccc\" (UID: \"1058ab3e-4f39-48ac-9f7e-81e40f041264\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t6ccc" Oct 09 08:07:32 crc kubenswrapper[4715]: I1009 08:07:32.097084 4715 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1058ab3e-4f39-48ac-9f7e-81e40f041264-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-t6ccc\" (UID: \"1058ab3e-4f39-48ac-9f7e-81e40f041264\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t6ccc" Oct 09 08:07:32 crc kubenswrapper[4715]: I1009 08:07:32.097112 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncvxr\" (UniqueName: \"kubernetes.io/projected/1058ab3e-4f39-48ac-9f7e-81e40f041264-kube-api-access-ncvxr\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-t6ccc\" (UID: \"1058ab3e-4f39-48ac-9f7e-81e40f041264\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t6ccc" Oct 09 08:07:32 crc kubenswrapper[4715]: I1009 08:07:32.102861 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1058ab3e-4f39-48ac-9f7e-81e40f041264-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-t6ccc\" (UID: \"1058ab3e-4f39-48ac-9f7e-81e40f041264\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t6ccc" Oct 09 08:07:32 crc kubenswrapper[4715]: I1009 08:07:32.106849 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1058ab3e-4f39-48ac-9f7e-81e40f041264-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-t6ccc\" (UID: \"1058ab3e-4f39-48ac-9f7e-81e40f041264\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t6ccc" Oct 09 08:07:32 crc kubenswrapper[4715]: I1009 08:07:32.115727 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1058ab3e-4f39-48ac-9f7e-81e40f041264-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-t6ccc\" (UID: \"1058ab3e-4f39-48ac-9f7e-81e40f041264\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t6ccc" Oct 09 08:07:32 crc kubenswrapper[4715]: I1009 08:07:32.125822 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncvxr\" (UniqueName: \"kubernetes.io/projected/1058ab3e-4f39-48ac-9f7e-81e40f041264-kube-api-access-ncvxr\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-t6ccc\" (UID: \"1058ab3e-4f39-48ac-9f7e-81e40f041264\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t6ccc" Oct 09 08:07:32 crc kubenswrapper[4715]: I1009 08:07:32.266012 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t6ccc" Oct 09 08:07:32 crc kubenswrapper[4715]: I1009 08:07:32.828090 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t6ccc"] Oct 09 08:07:32 crc kubenswrapper[4715]: W1009 08:07:32.829956 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1058ab3e_4f39_48ac_9f7e_81e40f041264.slice/crio-d6bdf50a0b0172e39d103e056f3720f34d6f92c0233add056e2c834b234b3e35 WatchSource:0}: Error finding container d6bdf50a0b0172e39d103e056f3720f34d6f92c0233add056e2c834b234b3e35: Status 404 returned error can't find the container with id d6bdf50a0b0172e39d103e056f3720f34d6f92c0233add056e2c834b234b3e35 Oct 09 08:07:32 crc kubenswrapper[4715]: I1009 08:07:32.832682 4715 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 08:07:33 crc kubenswrapper[4715]: I1009 08:07:33.765759 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t6ccc" event={"ID":"1058ab3e-4f39-48ac-9f7e-81e40f041264","Type":"ContainerStarted","Data":"d6bdf50a0b0172e39d103e056f3720f34d6f92c0233add056e2c834b234b3e35"} Oct 09 08:07:41 crc kubenswrapper[4715]: I1009 
08:07:41.843682 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t6ccc" event={"ID":"1058ab3e-4f39-48ac-9f7e-81e40f041264","Type":"ContainerStarted","Data":"9557f9bc81458950ba9b70ac7223b81079781e6eb4998bcaca22a53ff6909ede"} Oct 09 08:07:41 crc kubenswrapper[4715]: I1009 08:07:41.872250 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t6ccc" podStartSLOduration=2.86681282 podStartE2EDuration="10.872227803s" podCreationTimestamp="2025-10-09 08:07:31 +0000 UTC" firstStartedPulling="2025-10-09 08:07:32.832473621 +0000 UTC m=+1283.525277619" lastFinishedPulling="2025-10-09 08:07:40.837888584 +0000 UTC m=+1291.530692602" observedRunningTime="2025-10-09 08:07:41.871086371 +0000 UTC m=+1292.563890419" watchObservedRunningTime="2025-10-09 08:07:41.872227803 +0000 UTC m=+1292.565031821" Oct 09 08:07:46 crc kubenswrapper[4715]: I1009 08:07:46.894500 4715 generic.go:334] "Generic (PLEG): container finished" podID="c87001f3-a098-449a-b8ec-cccb2a313d5f" containerID="afbbd5bab5df112380ceb335943cb58777df2cd387766ddc6422d7144cbfaeef" exitCode=0 Oct 09 08:07:46 crc kubenswrapper[4715]: I1009 08:07:46.894613 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c87001f3-a098-449a-b8ec-cccb2a313d5f","Type":"ContainerDied","Data":"afbbd5bab5df112380ceb335943cb58777df2cd387766ddc6422d7144cbfaeef"} Oct 09 08:07:47 crc kubenswrapper[4715]: I1009 08:07:47.906568 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c87001f3-a098-449a-b8ec-cccb2a313d5f","Type":"ContainerStarted","Data":"dedc4a4111e58f17afe7e88ed7e41f2f9218e560158333f377f9d537ac8ef06b"} Oct 09 08:07:47 crc kubenswrapper[4715]: I1009 08:07:47.907202 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 09 08:07:47 crc 
kubenswrapper[4715]: I1009 08:07:47.907733 4715 generic.go:334] "Generic (PLEG): container finished" podID="a2bc3ad0-34e4-4ccc-9abd-7e998940780c" containerID="5ad32f351c006f9cf024eb67e2c83c0b79a11d3c7974f856cf621da2c2b592e8" exitCode=0 Oct 09 08:07:47 crc kubenswrapper[4715]: I1009 08:07:47.907776 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a2bc3ad0-34e4-4ccc-9abd-7e998940780c","Type":"ContainerDied","Data":"5ad32f351c006f9cf024eb67e2c83c0b79a11d3c7974f856cf621da2c2b592e8"} Oct 09 08:07:47 crc kubenswrapper[4715]: I1009 08:07:47.938717 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.938696084 podStartE2EDuration="36.938696084s" podCreationTimestamp="2025-10-09 08:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:07:47.933701044 +0000 UTC m=+1298.626505062" watchObservedRunningTime="2025-10-09 08:07:47.938696084 +0000 UTC m=+1298.631500092" Oct 09 08:07:48 crc kubenswrapper[4715]: I1009 08:07:48.917784 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a2bc3ad0-34e4-4ccc-9abd-7e998940780c","Type":"ContainerStarted","Data":"6a23f47f31a8ffab718291f3a1dac20ba6a883255fcb614fc1388e1063fdbfee"} Oct 09 08:07:48 crc kubenswrapper[4715]: I1009 08:07:48.918329 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:07:48 crc kubenswrapper[4715]: I1009 08:07:48.950612 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.950594903 podStartE2EDuration="37.950594903s" podCreationTimestamp="2025-10-09 08:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-09 08:07:48.946079366 +0000 UTC m=+1299.638883404" watchObservedRunningTime="2025-10-09 08:07:48.950594903 +0000 UTC m=+1299.643398911" Oct 09 08:07:51 crc kubenswrapper[4715]: I1009 08:07:51.945136 4715 generic.go:334] "Generic (PLEG): container finished" podID="1058ab3e-4f39-48ac-9f7e-81e40f041264" containerID="9557f9bc81458950ba9b70ac7223b81079781e6eb4998bcaca22a53ff6909ede" exitCode=0 Oct 09 08:07:51 crc kubenswrapper[4715]: I1009 08:07:51.945282 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t6ccc" event={"ID":"1058ab3e-4f39-48ac-9f7e-81e40f041264","Type":"ContainerDied","Data":"9557f9bc81458950ba9b70ac7223b81079781e6eb4998bcaca22a53ff6909ede"} Oct 09 08:07:53 crc kubenswrapper[4715]: I1009 08:07:53.369337 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t6ccc" Oct 09 08:07:53 crc kubenswrapper[4715]: I1009 08:07:53.503084 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncvxr\" (UniqueName: \"kubernetes.io/projected/1058ab3e-4f39-48ac-9f7e-81e40f041264-kube-api-access-ncvxr\") pod \"1058ab3e-4f39-48ac-9f7e-81e40f041264\" (UID: \"1058ab3e-4f39-48ac-9f7e-81e40f041264\") " Oct 09 08:07:53 crc kubenswrapper[4715]: I1009 08:07:53.503155 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1058ab3e-4f39-48ac-9f7e-81e40f041264-repo-setup-combined-ca-bundle\") pod \"1058ab3e-4f39-48ac-9f7e-81e40f041264\" (UID: \"1058ab3e-4f39-48ac-9f7e-81e40f041264\") " Oct 09 08:07:53 crc kubenswrapper[4715]: I1009 08:07:53.503250 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1058ab3e-4f39-48ac-9f7e-81e40f041264-inventory\") pod 
\"1058ab3e-4f39-48ac-9f7e-81e40f041264\" (UID: \"1058ab3e-4f39-48ac-9f7e-81e40f041264\") " Oct 09 08:07:53 crc kubenswrapper[4715]: I1009 08:07:53.503268 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1058ab3e-4f39-48ac-9f7e-81e40f041264-ssh-key\") pod \"1058ab3e-4f39-48ac-9f7e-81e40f041264\" (UID: \"1058ab3e-4f39-48ac-9f7e-81e40f041264\") " Oct 09 08:07:53 crc kubenswrapper[4715]: I1009 08:07:53.509600 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1058ab3e-4f39-48ac-9f7e-81e40f041264-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "1058ab3e-4f39-48ac-9f7e-81e40f041264" (UID: "1058ab3e-4f39-48ac-9f7e-81e40f041264"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:07:53 crc kubenswrapper[4715]: I1009 08:07:53.511646 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1058ab3e-4f39-48ac-9f7e-81e40f041264-kube-api-access-ncvxr" (OuterVolumeSpecName: "kube-api-access-ncvxr") pod "1058ab3e-4f39-48ac-9f7e-81e40f041264" (UID: "1058ab3e-4f39-48ac-9f7e-81e40f041264"). InnerVolumeSpecName "kube-api-access-ncvxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:07:53 crc kubenswrapper[4715]: I1009 08:07:53.530408 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1058ab3e-4f39-48ac-9f7e-81e40f041264-inventory" (OuterVolumeSpecName: "inventory") pod "1058ab3e-4f39-48ac-9f7e-81e40f041264" (UID: "1058ab3e-4f39-48ac-9f7e-81e40f041264"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:07:53 crc kubenswrapper[4715]: I1009 08:07:53.540496 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1058ab3e-4f39-48ac-9f7e-81e40f041264-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1058ab3e-4f39-48ac-9f7e-81e40f041264" (UID: "1058ab3e-4f39-48ac-9f7e-81e40f041264"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:07:53 crc kubenswrapper[4715]: I1009 08:07:53.605367 4715 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1058ab3e-4f39-48ac-9f7e-81e40f041264-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:07:53 crc kubenswrapper[4715]: I1009 08:07:53.605619 4715 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1058ab3e-4f39-48ac-9f7e-81e40f041264-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 08:07:53 crc kubenswrapper[4715]: I1009 08:07:53.605736 4715 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1058ab3e-4f39-48ac-9f7e-81e40f041264-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 08:07:53 crc kubenswrapper[4715]: I1009 08:07:53.605867 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncvxr\" (UniqueName: \"kubernetes.io/projected/1058ab3e-4f39-48ac-9f7e-81e40f041264-kube-api-access-ncvxr\") on node \"crc\" DevicePath \"\"" Oct 09 08:07:53 crc kubenswrapper[4715]: I1009 08:07:53.965721 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t6ccc" event={"ID":"1058ab3e-4f39-48ac-9f7e-81e40f041264","Type":"ContainerDied","Data":"d6bdf50a0b0172e39d103e056f3720f34d6f92c0233add056e2c834b234b3e35"} Oct 09 08:07:53 crc kubenswrapper[4715]: I1009 08:07:53.966027 4715 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="d6bdf50a0b0172e39d103e056f3720f34d6f92c0233add056e2c834b234b3e35" Oct 09 08:07:53 crc kubenswrapper[4715]: I1009 08:07:53.965790 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-t6ccc" Oct 09 08:07:54 crc kubenswrapper[4715]: I1009 08:07:54.152232 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-bzxls"] Oct 09 08:07:54 crc kubenswrapper[4715]: E1009 08:07:54.152715 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1058ab3e-4f39-48ac-9f7e-81e40f041264" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 09 08:07:54 crc kubenswrapper[4715]: I1009 08:07:54.152732 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="1058ab3e-4f39-48ac-9f7e-81e40f041264" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 09 08:07:54 crc kubenswrapper[4715]: I1009 08:07:54.152956 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="1058ab3e-4f39-48ac-9f7e-81e40f041264" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 09 08:07:54 crc kubenswrapper[4715]: I1009 08:07:54.153592 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bzxls" Oct 09 08:07:54 crc kubenswrapper[4715]: I1009 08:07:54.157974 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 08:07:54 crc kubenswrapper[4715]: I1009 08:07:54.158211 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 08:07:54 crc kubenswrapper[4715]: I1009 08:07:54.157990 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 08:07:54 crc kubenswrapper[4715]: I1009 08:07:54.160710 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fjb" Oct 09 08:07:54 crc kubenswrapper[4715]: I1009 08:07:54.171881 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-bzxls"] Oct 09 08:07:54 crc kubenswrapper[4715]: I1009 08:07:54.317711 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwj5x\" (UniqueName: \"kubernetes.io/projected/aa76af72-003a-4682-bbee-0ef470ecef9a-kube-api-access-mwj5x\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-bzxls\" (UID: \"aa76af72-003a-4682-bbee-0ef470ecef9a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bzxls" Oct 09 08:07:54 crc kubenswrapper[4715]: I1009 08:07:54.318125 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa76af72-003a-4682-bbee-0ef470ecef9a-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-bzxls\" (UID: \"aa76af72-003a-4682-bbee-0ef470ecef9a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bzxls" Oct 09 08:07:54 crc kubenswrapper[4715]: I1009 08:07:54.318355 4715 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa76af72-003a-4682-bbee-0ef470ecef9a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-bzxls\" (UID: \"aa76af72-003a-4682-bbee-0ef470ecef9a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bzxls" Oct 09 08:07:54 crc kubenswrapper[4715]: I1009 08:07:54.425968 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa76af72-003a-4682-bbee-0ef470ecef9a-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-bzxls\" (UID: \"aa76af72-003a-4682-bbee-0ef470ecef9a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bzxls" Oct 09 08:07:54 crc kubenswrapper[4715]: I1009 08:07:54.426144 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa76af72-003a-4682-bbee-0ef470ecef9a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-bzxls\" (UID: \"aa76af72-003a-4682-bbee-0ef470ecef9a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bzxls" Oct 09 08:07:54 crc kubenswrapper[4715]: I1009 08:07:54.426273 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwj5x\" (UniqueName: \"kubernetes.io/projected/aa76af72-003a-4682-bbee-0ef470ecef9a-kube-api-access-mwj5x\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-bzxls\" (UID: \"aa76af72-003a-4682-bbee-0ef470ecef9a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bzxls" Oct 09 08:07:54 crc kubenswrapper[4715]: I1009 08:07:54.432322 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa76af72-003a-4682-bbee-0ef470ecef9a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-bzxls\" (UID: \"aa76af72-003a-4682-bbee-0ef470ecef9a\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bzxls" Oct 09 08:07:54 crc kubenswrapper[4715]: I1009 08:07:54.432345 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa76af72-003a-4682-bbee-0ef470ecef9a-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-bzxls\" (UID: \"aa76af72-003a-4682-bbee-0ef470ecef9a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bzxls" Oct 09 08:07:54 crc kubenswrapper[4715]: I1009 08:07:54.448704 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwj5x\" (UniqueName: \"kubernetes.io/projected/aa76af72-003a-4682-bbee-0ef470ecef9a-kube-api-access-mwj5x\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-bzxls\" (UID: \"aa76af72-003a-4682-bbee-0ef470ecef9a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bzxls" Oct 09 08:07:54 crc kubenswrapper[4715]: I1009 08:07:54.469597 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bzxls" Oct 09 08:07:55 crc kubenswrapper[4715]: I1009 08:07:55.555459 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-bzxls"] Oct 09 08:07:55 crc kubenswrapper[4715]: W1009 08:07:55.562689 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa76af72_003a_4682_bbee_0ef470ecef9a.slice/crio-4f68aced5f84229617b16e0ce2e7b541dc376a20925da611529ec680e16065ed WatchSource:0}: Error finding container 4f68aced5f84229617b16e0ce2e7b541dc376a20925da611529ec680e16065ed: Status 404 returned error can't find the container with id 4f68aced5f84229617b16e0ce2e7b541dc376a20925da611529ec680e16065ed Oct 09 08:07:55 crc kubenswrapper[4715]: I1009 08:07:55.986735 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bzxls" event={"ID":"aa76af72-003a-4682-bbee-0ef470ecef9a","Type":"ContainerStarted","Data":"4f68aced5f84229617b16e0ce2e7b541dc376a20925da611529ec680e16065ed"} Oct 09 08:07:57 crc kubenswrapper[4715]: I1009 08:07:57.000537 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bzxls" event={"ID":"aa76af72-003a-4682-bbee-0ef470ecef9a","Type":"ContainerStarted","Data":"fc2eec0ac5a48222922c3551d6cfc738c7b21752bf2d095e4921ea38c9d2de59"} Oct 09 08:07:57 crc kubenswrapper[4715]: I1009 08:07:57.022229 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bzxls" podStartSLOduration=2.5493831179999997 podStartE2EDuration="3.022212826s" podCreationTimestamp="2025-10-09 08:07:54 +0000 UTC" firstStartedPulling="2025-10-09 08:07:55.565053005 +0000 UTC m=+1306.257857013" lastFinishedPulling="2025-10-09 08:07:56.037882713 +0000 UTC m=+1306.730686721" 
observedRunningTime="2025-10-09 08:07:57.018137651 +0000 UTC m=+1307.710941689" watchObservedRunningTime="2025-10-09 08:07:57.022212826 +0000 UTC m=+1307.715016834" Oct 09 08:07:59 crc kubenswrapper[4715]: I1009 08:07:59.023839 4715 generic.go:334] "Generic (PLEG): container finished" podID="aa76af72-003a-4682-bbee-0ef470ecef9a" containerID="fc2eec0ac5a48222922c3551d6cfc738c7b21752bf2d095e4921ea38c9d2de59" exitCode=0 Oct 09 08:07:59 crc kubenswrapper[4715]: I1009 08:07:59.023974 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bzxls" event={"ID":"aa76af72-003a-4682-bbee-0ef470ecef9a","Type":"ContainerDied","Data":"fc2eec0ac5a48222922c3551d6cfc738c7b21752bf2d095e4921ea38c9d2de59"} Oct 09 08:08:00 crc kubenswrapper[4715]: I1009 08:08:00.447338 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bzxls" Oct 09 08:08:00 crc kubenswrapper[4715]: I1009 08:08:00.548308 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa76af72-003a-4682-bbee-0ef470ecef9a-ssh-key\") pod \"aa76af72-003a-4682-bbee-0ef470ecef9a\" (UID: \"aa76af72-003a-4682-bbee-0ef470ecef9a\") " Oct 09 08:08:00 crc kubenswrapper[4715]: I1009 08:08:00.548671 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa76af72-003a-4682-bbee-0ef470ecef9a-inventory\") pod \"aa76af72-003a-4682-bbee-0ef470ecef9a\" (UID: \"aa76af72-003a-4682-bbee-0ef470ecef9a\") " Oct 09 08:08:00 crc kubenswrapper[4715]: I1009 08:08:00.548864 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwj5x\" (UniqueName: \"kubernetes.io/projected/aa76af72-003a-4682-bbee-0ef470ecef9a-kube-api-access-mwj5x\") pod \"aa76af72-003a-4682-bbee-0ef470ecef9a\" (UID: 
\"aa76af72-003a-4682-bbee-0ef470ecef9a\") " Oct 09 08:08:00 crc kubenswrapper[4715]: I1009 08:08:00.554236 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa76af72-003a-4682-bbee-0ef470ecef9a-kube-api-access-mwj5x" (OuterVolumeSpecName: "kube-api-access-mwj5x") pod "aa76af72-003a-4682-bbee-0ef470ecef9a" (UID: "aa76af72-003a-4682-bbee-0ef470ecef9a"). InnerVolumeSpecName "kube-api-access-mwj5x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:08:00 crc kubenswrapper[4715]: I1009 08:08:00.576466 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa76af72-003a-4682-bbee-0ef470ecef9a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "aa76af72-003a-4682-bbee-0ef470ecef9a" (UID: "aa76af72-003a-4682-bbee-0ef470ecef9a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:08:00 crc kubenswrapper[4715]: I1009 08:08:00.581905 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa76af72-003a-4682-bbee-0ef470ecef9a-inventory" (OuterVolumeSpecName: "inventory") pod "aa76af72-003a-4682-bbee-0ef470ecef9a" (UID: "aa76af72-003a-4682-bbee-0ef470ecef9a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:08:00 crc kubenswrapper[4715]: I1009 08:08:00.650805 4715 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa76af72-003a-4682-bbee-0ef470ecef9a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 08:08:00 crc kubenswrapper[4715]: I1009 08:08:00.650830 4715 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa76af72-003a-4682-bbee-0ef470ecef9a-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 08:08:00 crc kubenswrapper[4715]: I1009 08:08:00.650839 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwj5x\" (UniqueName: \"kubernetes.io/projected/aa76af72-003a-4682-bbee-0ef470ecef9a-kube-api-access-mwj5x\") on node \"crc\" DevicePath \"\"" Oct 09 08:08:01 crc kubenswrapper[4715]: I1009 08:08:01.046664 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bzxls" event={"ID":"aa76af72-003a-4682-bbee-0ef470ecef9a","Type":"ContainerDied","Data":"4f68aced5f84229617b16e0ce2e7b541dc376a20925da611529ec680e16065ed"} Oct 09 08:08:01 crc kubenswrapper[4715]: I1009 08:08:01.047017 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f68aced5f84229617b16e0ce2e7b541dc376a20925da611529ec680e16065ed" Oct 09 08:08:01 crc kubenswrapper[4715]: I1009 08:08:01.046741 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-bzxls" Oct 09 08:08:01 crc kubenswrapper[4715]: I1009 08:08:01.112399 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jphqg"] Oct 09 08:08:01 crc kubenswrapper[4715]: E1009 08:08:01.112810 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa76af72-003a-4682-bbee-0ef470ecef9a" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 09 08:08:01 crc kubenswrapper[4715]: I1009 08:08:01.112827 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa76af72-003a-4682-bbee-0ef470ecef9a" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 09 08:08:01 crc kubenswrapper[4715]: I1009 08:08:01.113004 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa76af72-003a-4682-bbee-0ef470ecef9a" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 09 08:08:01 crc kubenswrapper[4715]: I1009 08:08:01.113668 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jphqg" Oct 09 08:08:01 crc kubenswrapper[4715]: I1009 08:08:01.116103 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 08:08:01 crc kubenswrapper[4715]: I1009 08:08:01.116450 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fjb" Oct 09 08:08:01 crc kubenswrapper[4715]: I1009 08:08:01.116970 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 08:08:01 crc kubenswrapper[4715]: I1009 08:08:01.117221 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 08:08:01 crc kubenswrapper[4715]: I1009 08:08:01.122845 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jphqg"] Oct 09 08:08:01 crc kubenswrapper[4715]: I1009 08:08:01.263619 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/21e35629-c64d-4ef6-a570-7603aa8358fb-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jphqg\" (UID: \"21e35629-c64d-4ef6-a570-7603aa8358fb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jphqg" Oct 09 08:08:01 crc kubenswrapper[4715]: I1009 08:08:01.263682 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21e35629-c64d-4ef6-a570-7603aa8358fb-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jphqg\" (UID: \"21e35629-c64d-4ef6-a570-7603aa8358fb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jphqg" Oct 09 08:08:01 crc kubenswrapper[4715]: I1009 08:08:01.263734 4715 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21e35629-c64d-4ef6-a570-7603aa8358fb-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jphqg\" (UID: \"21e35629-c64d-4ef6-a570-7603aa8358fb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jphqg" Oct 09 08:08:01 crc kubenswrapper[4715]: I1009 08:08:01.263880 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr7tp\" (UniqueName: \"kubernetes.io/projected/21e35629-c64d-4ef6-a570-7603aa8358fb-kube-api-access-jr7tp\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jphqg\" (UID: \"21e35629-c64d-4ef6-a570-7603aa8358fb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jphqg" Oct 09 08:08:01 crc kubenswrapper[4715]: I1009 08:08:01.366081 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/21e35629-c64d-4ef6-a570-7603aa8358fb-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jphqg\" (UID: \"21e35629-c64d-4ef6-a570-7603aa8358fb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jphqg" Oct 09 08:08:01 crc kubenswrapper[4715]: I1009 08:08:01.366833 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21e35629-c64d-4ef6-a570-7603aa8358fb-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jphqg\" (UID: \"21e35629-c64d-4ef6-a570-7603aa8358fb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jphqg" Oct 09 08:08:01 crc kubenswrapper[4715]: I1009 08:08:01.366951 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21e35629-c64d-4ef6-a570-7603aa8358fb-inventory\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-jphqg\" (UID: \"21e35629-c64d-4ef6-a570-7603aa8358fb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jphqg" Oct 09 08:08:01 crc kubenswrapper[4715]: I1009 08:08:01.367188 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jr7tp\" (UniqueName: \"kubernetes.io/projected/21e35629-c64d-4ef6-a570-7603aa8358fb-kube-api-access-jr7tp\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jphqg\" (UID: \"21e35629-c64d-4ef6-a570-7603aa8358fb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jphqg" Oct 09 08:08:01 crc kubenswrapper[4715]: I1009 08:08:01.371177 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21e35629-c64d-4ef6-a570-7603aa8358fb-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jphqg\" (UID: \"21e35629-c64d-4ef6-a570-7603aa8358fb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jphqg" Oct 09 08:08:01 crc kubenswrapper[4715]: I1009 08:08:01.371240 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21e35629-c64d-4ef6-a570-7603aa8358fb-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jphqg\" (UID: \"21e35629-c64d-4ef6-a570-7603aa8358fb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jphqg" Oct 09 08:08:01 crc kubenswrapper[4715]: I1009 08:08:01.372302 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/21e35629-c64d-4ef6-a570-7603aa8358fb-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jphqg\" (UID: \"21e35629-c64d-4ef6-a570-7603aa8358fb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jphqg" Oct 09 08:08:01 crc kubenswrapper[4715]: I1009 08:08:01.381712 4715 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-jr7tp\" (UniqueName: \"kubernetes.io/projected/21e35629-c64d-4ef6-a570-7603aa8358fb-kube-api-access-jr7tp\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-jphqg\" (UID: \"21e35629-c64d-4ef6-a570-7603aa8358fb\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jphqg" Oct 09 08:08:01 crc kubenswrapper[4715]: I1009 08:08:01.434569 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jphqg" Oct 09 08:08:01 crc kubenswrapper[4715]: I1009 08:08:01.906784 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 09 08:08:01 crc kubenswrapper[4715]: I1009 08:08:01.957842 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jphqg"] Oct 09 08:08:02 crc kubenswrapper[4715]: I1009 08:08:02.057693 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jphqg" event={"ID":"21e35629-c64d-4ef6-a570-7603aa8358fb","Type":"ContainerStarted","Data":"c4fd930a50e48d438841f1fc916cbebdf87e0d47823ab42180d00ccaeef4eb19"} Oct 09 08:08:02 crc kubenswrapper[4715]: I1009 08:08:02.331644 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 09 08:08:03 crc kubenswrapper[4715]: I1009 08:08:03.067653 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jphqg" event={"ID":"21e35629-c64d-4ef6-a570-7603aa8358fb","Type":"ContainerStarted","Data":"bba1234324b4097bb0be21ee76654b7a5183a04882eef97cf3e1666b724f1578"} Oct 09 08:08:03 crc kubenswrapper[4715]: I1009 08:08:03.089935 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jphqg" podStartSLOduration=1.645148071 
podStartE2EDuration="2.089910261s" podCreationTimestamp="2025-10-09 08:08:01 +0000 UTC" firstStartedPulling="2025-10-09 08:08:02.00212262 +0000 UTC m=+1312.694926638" lastFinishedPulling="2025-10-09 08:08:02.44688482 +0000 UTC m=+1313.139688828" observedRunningTime="2025-10-09 08:08:03.089094859 +0000 UTC m=+1313.781898887" watchObservedRunningTime="2025-10-09 08:08:03.089910261 +0000 UTC m=+1313.782714269" Oct 09 08:08:11 crc kubenswrapper[4715]: I1009 08:08:11.591715 4715 scope.go:117] "RemoveContainer" containerID="75a982e1d44680c2b96f722a9924ebc5c281f5c8f11940251e6f50af37404454" Oct 09 08:09:11 crc kubenswrapper[4715]: I1009 08:09:11.683572 4715 scope.go:117] "RemoveContainer" containerID="d0fb21496c5e66aeecbf6a5984fed7d8dab239ddd4fbcf0ff397c5d9c9acb9e6" Oct 09 08:09:11 crc kubenswrapper[4715]: I1009 08:09:11.716116 4715 scope.go:117] "RemoveContainer" containerID="352c27b9fcd0e70c6baf453f0729ce49aad3113cf7887641013079ce070f4d4b" Oct 09 08:09:11 crc kubenswrapper[4715]: I1009 08:09:11.793297 4715 scope.go:117] "RemoveContainer" containerID="96111493a6fbd5d69aad7e7bb948021003031322bc366f71e9e97e9d2a638bcc" Oct 09 08:09:16 crc kubenswrapper[4715]: I1009 08:09:16.754626 4715 patch_prober.go:28] interesting pod/machine-config-daemon-k7vwx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 08:09:16 crc kubenswrapper[4715]: I1009 08:09:16.755207 4715 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 08:09:46 crc kubenswrapper[4715]: I1009 08:09:46.753648 4715 patch_prober.go:28] interesting 
pod/machine-config-daemon-k7vwx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 08:09:46 crc kubenswrapper[4715]: I1009 08:09:46.754321 4715 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 08:10:16 crc kubenswrapper[4715]: I1009 08:10:16.753334 4715 patch_prober.go:28] interesting pod/machine-config-daemon-k7vwx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 08:10:16 crc kubenswrapper[4715]: I1009 08:10:16.753902 4715 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 08:10:16 crc kubenswrapper[4715]: I1009 08:10:16.753950 4715 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" Oct 09 08:10:16 crc kubenswrapper[4715]: I1009 08:10:16.754777 4715 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c420b2a8eda30fb3123058bb99d74df66c4ad029fca95601a27c380e6d5834c9"} pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Oct 09 08:10:16 crc kubenswrapper[4715]: I1009 08:10:16.754839 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" containerID="cri-o://c420b2a8eda30fb3123058bb99d74df66c4ad029fca95601a27c380e6d5834c9" gracePeriod=600 Oct 09 08:10:17 crc kubenswrapper[4715]: I1009 08:10:17.401844 4715 generic.go:334] "Generic (PLEG): container finished" podID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerID="c420b2a8eda30fb3123058bb99d74df66c4ad029fca95601a27c380e6d5834c9" exitCode=0 Oct 09 08:10:17 crc kubenswrapper[4715]: I1009 08:10:17.401889 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" event={"ID":"acafd807-8875-4b4f-aba9-4f807ca336e7","Type":"ContainerDied","Data":"c420b2a8eda30fb3123058bb99d74df66c4ad029fca95601a27c380e6d5834c9"} Oct 09 08:10:17 crc kubenswrapper[4715]: I1009 08:10:17.402193 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" event={"ID":"acafd807-8875-4b4f-aba9-4f807ca336e7","Type":"ContainerStarted","Data":"603ae8e76a989f73d5fee395ce2ebf8db256706e70cbcec215884dc9ad047a0c"} Oct 09 08:10:17 crc kubenswrapper[4715]: I1009 08:10:17.402219 4715 scope.go:117] "RemoveContainer" containerID="d50fe5031fb9148bf6f3fa221f298d39e3a132b33b247b66e3d4fb59f3f0c771" Oct 09 08:10:29 crc kubenswrapper[4715]: I1009 08:10:29.128036 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mf4rg"] Oct 09 08:10:29 crc kubenswrapper[4715]: I1009 08:10:29.130805 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mf4rg" Oct 09 08:10:29 crc kubenswrapper[4715]: I1009 08:10:29.152296 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mf4rg"] Oct 09 08:10:29 crc kubenswrapper[4715]: I1009 08:10:29.258161 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xbbz\" (UniqueName: \"kubernetes.io/projected/63a0cac4-cbce-4404-8f7c-45cdfd6335fa-kube-api-access-9xbbz\") pod \"redhat-marketplace-mf4rg\" (UID: \"63a0cac4-cbce-4404-8f7c-45cdfd6335fa\") " pod="openshift-marketplace/redhat-marketplace-mf4rg" Oct 09 08:10:29 crc kubenswrapper[4715]: I1009 08:10:29.258356 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63a0cac4-cbce-4404-8f7c-45cdfd6335fa-catalog-content\") pod \"redhat-marketplace-mf4rg\" (UID: \"63a0cac4-cbce-4404-8f7c-45cdfd6335fa\") " pod="openshift-marketplace/redhat-marketplace-mf4rg" Oct 09 08:10:29 crc kubenswrapper[4715]: I1009 08:10:29.258401 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63a0cac4-cbce-4404-8f7c-45cdfd6335fa-utilities\") pod \"redhat-marketplace-mf4rg\" (UID: \"63a0cac4-cbce-4404-8f7c-45cdfd6335fa\") " pod="openshift-marketplace/redhat-marketplace-mf4rg" Oct 09 08:10:29 crc kubenswrapper[4715]: I1009 08:10:29.360876 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xbbz\" (UniqueName: \"kubernetes.io/projected/63a0cac4-cbce-4404-8f7c-45cdfd6335fa-kube-api-access-9xbbz\") pod \"redhat-marketplace-mf4rg\" (UID: \"63a0cac4-cbce-4404-8f7c-45cdfd6335fa\") " pod="openshift-marketplace/redhat-marketplace-mf4rg" Oct 09 08:10:29 crc kubenswrapper[4715]: I1009 08:10:29.361168 4715 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63a0cac4-cbce-4404-8f7c-45cdfd6335fa-catalog-content\") pod \"redhat-marketplace-mf4rg\" (UID: \"63a0cac4-cbce-4404-8f7c-45cdfd6335fa\") " pod="openshift-marketplace/redhat-marketplace-mf4rg" Oct 09 08:10:29 crc kubenswrapper[4715]: I1009 08:10:29.361277 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63a0cac4-cbce-4404-8f7c-45cdfd6335fa-utilities\") pod \"redhat-marketplace-mf4rg\" (UID: \"63a0cac4-cbce-4404-8f7c-45cdfd6335fa\") " pod="openshift-marketplace/redhat-marketplace-mf4rg" Oct 09 08:10:29 crc kubenswrapper[4715]: I1009 08:10:29.361709 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63a0cac4-cbce-4404-8f7c-45cdfd6335fa-catalog-content\") pod \"redhat-marketplace-mf4rg\" (UID: \"63a0cac4-cbce-4404-8f7c-45cdfd6335fa\") " pod="openshift-marketplace/redhat-marketplace-mf4rg" Oct 09 08:10:29 crc kubenswrapper[4715]: I1009 08:10:29.361769 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63a0cac4-cbce-4404-8f7c-45cdfd6335fa-utilities\") pod \"redhat-marketplace-mf4rg\" (UID: \"63a0cac4-cbce-4404-8f7c-45cdfd6335fa\") " pod="openshift-marketplace/redhat-marketplace-mf4rg" Oct 09 08:10:29 crc kubenswrapper[4715]: I1009 08:10:29.380707 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xbbz\" (UniqueName: \"kubernetes.io/projected/63a0cac4-cbce-4404-8f7c-45cdfd6335fa-kube-api-access-9xbbz\") pod \"redhat-marketplace-mf4rg\" (UID: \"63a0cac4-cbce-4404-8f7c-45cdfd6335fa\") " pod="openshift-marketplace/redhat-marketplace-mf4rg" Oct 09 08:10:29 crc kubenswrapper[4715]: I1009 08:10:29.455966 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mf4rg" Oct 09 08:10:29 crc kubenswrapper[4715]: I1009 08:10:29.914884 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mf4rg"] Oct 09 08:10:30 crc kubenswrapper[4715]: I1009 08:10:30.517045 4715 generic.go:334] "Generic (PLEG): container finished" podID="63a0cac4-cbce-4404-8f7c-45cdfd6335fa" containerID="87d3c14437f7d8d05d7725930d60efe8a441e1dff0da75aac2381cbac20c95bf" exitCode=0 Oct 09 08:10:30 crc kubenswrapper[4715]: I1009 08:10:30.517294 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mf4rg" event={"ID":"63a0cac4-cbce-4404-8f7c-45cdfd6335fa","Type":"ContainerDied","Data":"87d3c14437f7d8d05d7725930d60efe8a441e1dff0da75aac2381cbac20c95bf"} Oct 09 08:10:30 crc kubenswrapper[4715]: I1009 08:10:30.517317 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mf4rg" event={"ID":"63a0cac4-cbce-4404-8f7c-45cdfd6335fa","Type":"ContainerStarted","Data":"7ce5ab056e94757965f5e5dfb0f3a0a0323257b90c1b34844966c58ff7f2ba2a"} Oct 09 08:10:32 crc kubenswrapper[4715]: I1009 08:10:32.505576 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9z69x"] Oct 09 08:10:32 crc kubenswrapper[4715]: I1009 08:10:32.508082 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9z69x" Oct 09 08:10:32 crc kubenswrapper[4715]: I1009 08:10:32.514889 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9z69x"] Oct 09 08:10:32 crc kubenswrapper[4715]: I1009 08:10:32.541878 4715 generic.go:334] "Generic (PLEG): container finished" podID="63a0cac4-cbce-4404-8f7c-45cdfd6335fa" containerID="d070198d0ddcacbbabfa004e7a8844711afc6d90ebc3a9c0b993732fe23be7d6" exitCode=0 Oct 09 08:10:32 crc kubenswrapper[4715]: I1009 08:10:32.541930 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mf4rg" event={"ID":"63a0cac4-cbce-4404-8f7c-45cdfd6335fa","Type":"ContainerDied","Data":"d070198d0ddcacbbabfa004e7a8844711afc6d90ebc3a9c0b993732fe23be7d6"} Oct 09 08:10:32 crc kubenswrapper[4715]: I1009 08:10:32.625177 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5s7z\" (UniqueName: \"kubernetes.io/projected/a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab-kube-api-access-h5s7z\") pod \"redhat-operators-9z69x\" (UID: \"a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab\") " pod="openshift-marketplace/redhat-operators-9z69x" Oct 09 08:10:32 crc kubenswrapper[4715]: I1009 08:10:32.625256 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab-catalog-content\") pod \"redhat-operators-9z69x\" (UID: \"a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab\") " pod="openshift-marketplace/redhat-operators-9z69x" Oct 09 08:10:32 crc kubenswrapper[4715]: I1009 08:10:32.625564 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab-utilities\") pod \"redhat-operators-9z69x\" (UID: \"a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab\") 
" pod="openshift-marketplace/redhat-operators-9z69x" Oct 09 08:10:32 crc kubenswrapper[4715]: I1009 08:10:32.727832 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab-utilities\") pod \"redhat-operators-9z69x\" (UID: \"a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab\") " pod="openshift-marketplace/redhat-operators-9z69x" Oct 09 08:10:32 crc kubenswrapper[4715]: I1009 08:10:32.727954 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5s7z\" (UniqueName: \"kubernetes.io/projected/a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab-kube-api-access-h5s7z\") pod \"redhat-operators-9z69x\" (UID: \"a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab\") " pod="openshift-marketplace/redhat-operators-9z69x" Oct 09 08:10:32 crc kubenswrapper[4715]: I1009 08:10:32.727996 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab-catalog-content\") pod \"redhat-operators-9z69x\" (UID: \"a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab\") " pod="openshift-marketplace/redhat-operators-9z69x" Oct 09 08:10:32 crc kubenswrapper[4715]: I1009 08:10:32.728774 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab-utilities\") pod \"redhat-operators-9z69x\" (UID: \"a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab\") " pod="openshift-marketplace/redhat-operators-9z69x" Oct 09 08:10:32 crc kubenswrapper[4715]: I1009 08:10:32.728800 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab-catalog-content\") pod \"redhat-operators-9z69x\" (UID: \"a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab\") " pod="openshift-marketplace/redhat-operators-9z69x" Oct 09 08:10:32 crc 
kubenswrapper[4715]: I1009 08:10:32.749442 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5s7z\" (UniqueName: \"kubernetes.io/projected/a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab-kube-api-access-h5s7z\") pod \"redhat-operators-9z69x\" (UID: \"a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab\") " pod="openshift-marketplace/redhat-operators-9z69x" Oct 09 08:10:32 crc kubenswrapper[4715]: I1009 08:10:32.831054 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9z69x" Oct 09 08:10:33 crc kubenswrapper[4715]: I1009 08:10:33.292628 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9z69x"] Oct 09 08:10:33 crc kubenswrapper[4715]: W1009 08:10:33.294781 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda91099e7_a9fa_4bc8_b2cd_2b6dde66e5ab.slice/crio-0119dc6edbb084542d433d58a7138cb4d4808c67e7813b8899f513bfcf4f8067 WatchSource:0}: Error finding container 0119dc6edbb084542d433d58a7138cb4d4808c67e7813b8899f513bfcf4f8067: Status 404 returned error can't find the container with id 0119dc6edbb084542d433d58a7138cb4d4808c67e7813b8899f513bfcf4f8067 Oct 09 08:10:33 crc kubenswrapper[4715]: I1009 08:10:33.556351 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mf4rg" event={"ID":"63a0cac4-cbce-4404-8f7c-45cdfd6335fa","Type":"ContainerStarted","Data":"27e11095ce34137b7ddf308c999767c4d8cd3f4b414bf9e0ad5c4d47e40ba443"} Oct 09 08:10:33 crc kubenswrapper[4715]: I1009 08:10:33.569463 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9z69x" event={"ID":"a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab","Type":"ContainerStarted","Data":"d3bab605e4c68c6a8277dd2eee92769658902ac7bbd54fb6b25006d1e6955491"} Oct 09 08:10:33 crc kubenswrapper[4715]: I1009 08:10:33.569515 4715 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9z69x" event={"ID":"a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab","Type":"ContainerStarted","Data":"0119dc6edbb084542d433d58a7138cb4d4808c67e7813b8899f513bfcf4f8067"} Oct 09 08:10:33 crc kubenswrapper[4715]: I1009 08:10:33.575513 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mf4rg" podStartSLOduration=2.020438199 podStartE2EDuration="4.575500882s" podCreationTimestamp="2025-10-09 08:10:29 +0000 UTC" firstStartedPulling="2025-10-09 08:10:30.519837301 +0000 UTC m=+1461.212641309" lastFinishedPulling="2025-10-09 08:10:33.074899984 +0000 UTC m=+1463.767703992" observedRunningTime="2025-10-09 08:10:33.574600236 +0000 UTC m=+1464.267404244" watchObservedRunningTime="2025-10-09 08:10:33.575500882 +0000 UTC m=+1464.268304890" Oct 09 08:10:34 crc kubenswrapper[4715]: I1009 08:10:34.580627 4715 generic.go:334] "Generic (PLEG): container finished" podID="a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab" containerID="d3bab605e4c68c6a8277dd2eee92769658902ac7bbd54fb6b25006d1e6955491" exitCode=0 Oct 09 08:10:34 crc kubenswrapper[4715]: I1009 08:10:34.580687 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9z69x" event={"ID":"a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab","Type":"ContainerDied","Data":"d3bab605e4c68c6a8277dd2eee92769658902ac7bbd54fb6b25006d1e6955491"} Oct 09 08:10:34 crc kubenswrapper[4715]: I1009 08:10:34.581048 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9z69x" event={"ID":"a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab","Type":"ContainerStarted","Data":"1554ebf1d790280dc1ea4e659370c1f8b6b08dc5098acba495fba7f9fbe9ed5f"} Oct 09 08:10:35 crc kubenswrapper[4715]: I1009 08:10:35.592968 4715 generic.go:334] "Generic (PLEG): container finished" podID="a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab" 
containerID="1554ebf1d790280dc1ea4e659370c1f8b6b08dc5098acba495fba7f9fbe9ed5f" exitCode=0 Oct 09 08:10:35 crc kubenswrapper[4715]: I1009 08:10:35.593039 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9z69x" event={"ID":"a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab","Type":"ContainerDied","Data":"1554ebf1d790280dc1ea4e659370c1f8b6b08dc5098acba495fba7f9fbe9ed5f"} Oct 09 08:10:36 crc kubenswrapper[4715]: I1009 08:10:36.611725 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9z69x" event={"ID":"a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab","Type":"ContainerStarted","Data":"ec12f3ae6358f8ca8deb3baeaf7a4ea53944a12e4ddbb37ced8105679dcfa3d0"} Oct 09 08:10:36 crc kubenswrapper[4715]: I1009 08:10:36.648269 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9z69x" podStartSLOduration=2.009664475 podStartE2EDuration="4.648253186s" podCreationTimestamp="2025-10-09 08:10:32 +0000 UTC" firstStartedPulling="2025-10-09 08:10:33.571966221 +0000 UTC m=+1464.264770229" lastFinishedPulling="2025-10-09 08:10:36.210554932 +0000 UTC m=+1466.903358940" observedRunningTime="2025-10-09 08:10:36.642612986 +0000 UTC m=+1467.335417004" watchObservedRunningTime="2025-10-09 08:10:36.648253186 +0000 UTC m=+1467.341057194" Oct 09 08:10:36 crc kubenswrapper[4715]: I1009 08:10:36.714608 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hll64"] Oct 09 08:10:36 crc kubenswrapper[4715]: I1009 08:10:36.717002 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hll64" Oct 09 08:10:36 crc kubenswrapper[4715]: I1009 08:10:36.728059 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hll64"] Oct 09 08:10:36 crc kubenswrapper[4715]: I1009 08:10:36.806092 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/589c8e63-9e89-4e92-8f88-7d14fde116e5-utilities\") pod \"community-operators-hll64\" (UID: \"589c8e63-9e89-4e92-8f88-7d14fde116e5\") " pod="openshift-marketplace/community-operators-hll64" Oct 09 08:10:36 crc kubenswrapper[4715]: I1009 08:10:36.806175 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/589c8e63-9e89-4e92-8f88-7d14fde116e5-catalog-content\") pod \"community-operators-hll64\" (UID: \"589c8e63-9e89-4e92-8f88-7d14fde116e5\") " pod="openshift-marketplace/community-operators-hll64" Oct 09 08:10:36 crc kubenswrapper[4715]: I1009 08:10:36.806210 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5klhk\" (UniqueName: \"kubernetes.io/projected/589c8e63-9e89-4e92-8f88-7d14fde116e5-kube-api-access-5klhk\") pod \"community-operators-hll64\" (UID: \"589c8e63-9e89-4e92-8f88-7d14fde116e5\") " pod="openshift-marketplace/community-operators-hll64" Oct 09 08:10:36 crc kubenswrapper[4715]: I1009 08:10:36.908544 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/589c8e63-9e89-4e92-8f88-7d14fde116e5-utilities\") pod \"community-operators-hll64\" (UID: \"589c8e63-9e89-4e92-8f88-7d14fde116e5\") " pod="openshift-marketplace/community-operators-hll64" Oct 09 08:10:36 crc kubenswrapper[4715]: I1009 08:10:36.908642 4715 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/589c8e63-9e89-4e92-8f88-7d14fde116e5-catalog-content\") pod \"community-operators-hll64\" (UID: \"589c8e63-9e89-4e92-8f88-7d14fde116e5\") " pod="openshift-marketplace/community-operators-hll64" Oct 09 08:10:36 crc kubenswrapper[4715]: I1009 08:10:36.908679 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5klhk\" (UniqueName: \"kubernetes.io/projected/589c8e63-9e89-4e92-8f88-7d14fde116e5-kube-api-access-5klhk\") pod \"community-operators-hll64\" (UID: \"589c8e63-9e89-4e92-8f88-7d14fde116e5\") " pod="openshift-marketplace/community-operators-hll64" Oct 09 08:10:36 crc kubenswrapper[4715]: I1009 08:10:36.908997 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/589c8e63-9e89-4e92-8f88-7d14fde116e5-utilities\") pod \"community-operators-hll64\" (UID: \"589c8e63-9e89-4e92-8f88-7d14fde116e5\") " pod="openshift-marketplace/community-operators-hll64" Oct 09 08:10:36 crc kubenswrapper[4715]: I1009 08:10:36.909182 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/589c8e63-9e89-4e92-8f88-7d14fde116e5-catalog-content\") pod \"community-operators-hll64\" (UID: \"589c8e63-9e89-4e92-8f88-7d14fde116e5\") " pod="openshift-marketplace/community-operators-hll64" Oct 09 08:10:36 crc kubenswrapper[4715]: I1009 08:10:36.931279 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5klhk\" (UniqueName: \"kubernetes.io/projected/589c8e63-9e89-4e92-8f88-7d14fde116e5-kube-api-access-5klhk\") pod \"community-operators-hll64\" (UID: \"589c8e63-9e89-4e92-8f88-7d14fde116e5\") " pod="openshift-marketplace/community-operators-hll64" Oct 09 08:10:37 crc kubenswrapper[4715]: I1009 08:10:37.047321 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hll64" Oct 09 08:10:37 crc kubenswrapper[4715]: I1009 08:10:37.548534 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hll64"] Oct 09 08:10:37 crc kubenswrapper[4715]: W1009 08:10:37.565582 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod589c8e63_9e89_4e92_8f88_7d14fde116e5.slice/crio-f782a522c91ed4a82074cede8ef18c1b34f4f776581e89f2387cd113fa3fb50a WatchSource:0}: Error finding container f782a522c91ed4a82074cede8ef18c1b34f4f776581e89f2387cd113fa3fb50a: Status 404 returned error can't find the container with id f782a522c91ed4a82074cede8ef18c1b34f4f776581e89f2387cd113fa3fb50a Oct 09 08:10:37 crc kubenswrapper[4715]: I1009 08:10:37.622780 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hll64" event={"ID":"589c8e63-9e89-4e92-8f88-7d14fde116e5","Type":"ContainerStarted","Data":"f782a522c91ed4a82074cede8ef18c1b34f4f776581e89f2387cd113fa3fb50a"} Oct 09 08:10:38 crc kubenswrapper[4715]: I1009 08:10:38.633316 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hll64" event={"ID":"589c8e63-9e89-4e92-8f88-7d14fde116e5","Type":"ContainerStarted","Data":"d4d4a54d4e7ddd3adc60a079c3d1e6170ae1c17e53775bd914d92bcf39825f0d"} Oct 09 08:10:39 crc kubenswrapper[4715]: I1009 08:10:39.456887 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mf4rg" Oct 09 08:10:39 crc kubenswrapper[4715]: I1009 08:10:39.457222 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mf4rg" Oct 09 08:10:39 crc kubenswrapper[4715]: I1009 08:10:39.510102 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-mf4rg" Oct 09 08:10:39 crc kubenswrapper[4715]: I1009 08:10:39.644895 4715 generic.go:334] "Generic (PLEG): container finished" podID="589c8e63-9e89-4e92-8f88-7d14fde116e5" containerID="d4d4a54d4e7ddd3adc60a079c3d1e6170ae1c17e53775bd914d92bcf39825f0d" exitCode=0 Oct 09 08:10:39 crc kubenswrapper[4715]: I1009 08:10:39.644974 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hll64" event={"ID":"589c8e63-9e89-4e92-8f88-7d14fde116e5","Type":"ContainerDied","Data":"d4d4a54d4e7ddd3adc60a079c3d1e6170ae1c17e53775bd914d92bcf39825f0d"} Oct 09 08:10:39 crc kubenswrapper[4715]: I1009 08:10:39.690070 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mf4rg" Oct 09 08:10:40 crc kubenswrapper[4715]: I1009 08:10:40.657920 4715 generic.go:334] "Generic (PLEG): container finished" podID="589c8e63-9e89-4e92-8f88-7d14fde116e5" containerID="8d5ce2760e840d8299e5a6904e6f0fd25f437bbaf73405c67efef273b04e7d02" exitCode=0 Oct 09 08:10:40 crc kubenswrapper[4715]: I1009 08:10:40.657978 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hll64" event={"ID":"589c8e63-9e89-4e92-8f88-7d14fde116e5","Type":"ContainerDied","Data":"8d5ce2760e840d8299e5a6904e6f0fd25f437bbaf73405c67efef273b04e7d02"} Oct 09 08:10:41 crc kubenswrapper[4715]: I1009 08:10:41.669858 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hll64" event={"ID":"589c8e63-9e89-4e92-8f88-7d14fde116e5","Type":"ContainerStarted","Data":"3da349d8d5474ca0a6bbc1668bbae890cec8ed319e04a683ad7955224602c2fb"} Oct 09 08:10:41 crc kubenswrapper[4715]: I1009 08:10:41.691199 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hll64" podStartSLOduration=3.17097245 podStartE2EDuration="5.691178354s" 
podCreationTimestamp="2025-10-09 08:10:36 +0000 UTC" firstStartedPulling="2025-10-09 08:10:38.635097733 +0000 UTC m=+1469.327901741" lastFinishedPulling="2025-10-09 08:10:41.155303637 +0000 UTC m=+1471.848107645" observedRunningTime="2025-10-09 08:10:41.685768821 +0000 UTC m=+1472.378572839" watchObservedRunningTime="2025-10-09 08:10:41.691178354 +0000 UTC m=+1472.383982362" Oct 09 08:10:42 crc kubenswrapper[4715]: I1009 08:10:42.679892 4715 generic.go:334] "Generic (PLEG): container finished" podID="21e35629-c64d-4ef6-a570-7603aa8358fb" containerID="bba1234324b4097bb0be21ee76654b7a5183a04882eef97cf3e1666b724f1578" exitCode=0 Oct 09 08:10:42 crc kubenswrapper[4715]: I1009 08:10:42.679984 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jphqg" event={"ID":"21e35629-c64d-4ef6-a570-7603aa8358fb","Type":"ContainerDied","Data":"bba1234324b4097bb0be21ee76654b7a5183a04882eef97cf3e1666b724f1578"} Oct 09 08:10:42 crc kubenswrapper[4715]: I1009 08:10:42.832049 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9z69x" Oct 09 08:10:42 crc kubenswrapper[4715]: I1009 08:10:42.832121 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9z69x" Oct 09 08:10:42 crc kubenswrapper[4715]: I1009 08:10:42.894408 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9z69x" Oct 09 08:10:42 crc kubenswrapper[4715]: I1009 08:10:42.900640 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mf4rg"] Oct 09 08:10:42 crc kubenswrapper[4715]: I1009 08:10:42.900902 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mf4rg" podUID="63a0cac4-cbce-4404-8f7c-45cdfd6335fa" containerName="registry-server" 
containerID="cri-o://27e11095ce34137b7ddf308c999767c4d8cd3f4b414bf9e0ad5c4d47e40ba443" gracePeriod=2 Oct 09 08:10:43 crc kubenswrapper[4715]: I1009 08:10:43.375464 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mf4rg" Oct 09 08:10:43 crc kubenswrapper[4715]: I1009 08:10:43.541668 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63a0cac4-cbce-4404-8f7c-45cdfd6335fa-catalog-content\") pod \"63a0cac4-cbce-4404-8f7c-45cdfd6335fa\" (UID: \"63a0cac4-cbce-4404-8f7c-45cdfd6335fa\") " Oct 09 08:10:43 crc kubenswrapper[4715]: I1009 08:10:43.541730 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xbbz\" (UniqueName: \"kubernetes.io/projected/63a0cac4-cbce-4404-8f7c-45cdfd6335fa-kube-api-access-9xbbz\") pod \"63a0cac4-cbce-4404-8f7c-45cdfd6335fa\" (UID: \"63a0cac4-cbce-4404-8f7c-45cdfd6335fa\") " Oct 09 08:10:43 crc kubenswrapper[4715]: I1009 08:10:43.541824 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63a0cac4-cbce-4404-8f7c-45cdfd6335fa-utilities\") pod \"63a0cac4-cbce-4404-8f7c-45cdfd6335fa\" (UID: \"63a0cac4-cbce-4404-8f7c-45cdfd6335fa\") " Oct 09 08:10:43 crc kubenswrapper[4715]: I1009 08:10:43.542525 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63a0cac4-cbce-4404-8f7c-45cdfd6335fa-utilities" (OuterVolumeSpecName: "utilities") pod "63a0cac4-cbce-4404-8f7c-45cdfd6335fa" (UID: "63a0cac4-cbce-4404-8f7c-45cdfd6335fa"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:10:43 crc kubenswrapper[4715]: I1009 08:10:43.548255 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63a0cac4-cbce-4404-8f7c-45cdfd6335fa-kube-api-access-9xbbz" (OuterVolumeSpecName: "kube-api-access-9xbbz") pod "63a0cac4-cbce-4404-8f7c-45cdfd6335fa" (UID: "63a0cac4-cbce-4404-8f7c-45cdfd6335fa"). InnerVolumeSpecName "kube-api-access-9xbbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:10:43 crc kubenswrapper[4715]: I1009 08:10:43.554282 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63a0cac4-cbce-4404-8f7c-45cdfd6335fa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "63a0cac4-cbce-4404-8f7c-45cdfd6335fa" (UID: "63a0cac4-cbce-4404-8f7c-45cdfd6335fa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:10:43 crc kubenswrapper[4715]: I1009 08:10:43.643964 4715 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63a0cac4-cbce-4404-8f7c-45cdfd6335fa-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 08:10:43 crc kubenswrapper[4715]: I1009 08:10:43.644027 4715 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63a0cac4-cbce-4404-8f7c-45cdfd6335fa-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 08:10:43 crc kubenswrapper[4715]: I1009 08:10:43.644052 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xbbz\" (UniqueName: \"kubernetes.io/projected/63a0cac4-cbce-4404-8f7c-45cdfd6335fa-kube-api-access-9xbbz\") on node \"crc\" DevicePath \"\"" Oct 09 08:10:43 crc kubenswrapper[4715]: I1009 08:10:43.734548 4715 generic.go:334] "Generic (PLEG): container finished" podID="63a0cac4-cbce-4404-8f7c-45cdfd6335fa" 
containerID="27e11095ce34137b7ddf308c999767c4d8cd3f4b414bf9e0ad5c4d47e40ba443" exitCode=0 Oct 09 08:10:43 crc kubenswrapper[4715]: I1009 08:10:43.734877 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mf4rg" event={"ID":"63a0cac4-cbce-4404-8f7c-45cdfd6335fa","Type":"ContainerDied","Data":"27e11095ce34137b7ddf308c999767c4d8cd3f4b414bf9e0ad5c4d47e40ba443"} Oct 09 08:10:43 crc kubenswrapper[4715]: I1009 08:10:43.734923 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mf4rg" event={"ID":"63a0cac4-cbce-4404-8f7c-45cdfd6335fa","Type":"ContainerDied","Data":"7ce5ab056e94757965f5e5dfb0f3a0a0323257b90c1b34844966c58ff7f2ba2a"} Oct 09 08:10:43 crc kubenswrapper[4715]: I1009 08:10:43.734945 4715 scope.go:117] "RemoveContainer" containerID="27e11095ce34137b7ddf308c999767c4d8cd3f4b414bf9e0ad5c4d47e40ba443" Oct 09 08:10:43 crc kubenswrapper[4715]: I1009 08:10:43.735137 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mf4rg" Oct 09 08:10:43 crc kubenswrapper[4715]: I1009 08:10:43.776844 4715 scope.go:117] "RemoveContainer" containerID="d070198d0ddcacbbabfa004e7a8844711afc6d90ebc3a9c0b993732fe23be7d6" Oct 09 08:10:43 crc kubenswrapper[4715]: I1009 08:10:43.778313 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mf4rg"] Oct 09 08:10:43 crc kubenswrapper[4715]: I1009 08:10:43.785924 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mf4rg"] Oct 09 08:10:43 crc kubenswrapper[4715]: I1009 08:10:43.792252 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9z69x" Oct 09 08:10:43 crc kubenswrapper[4715]: I1009 08:10:43.809783 4715 scope.go:117] "RemoveContainer" containerID="87d3c14437f7d8d05d7725930d60efe8a441e1dff0da75aac2381cbac20c95bf" Oct 09 08:10:43 crc kubenswrapper[4715]: I1009 08:10:43.855302 4715 scope.go:117] "RemoveContainer" containerID="27e11095ce34137b7ddf308c999767c4d8cd3f4b414bf9e0ad5c4d47e40ba443" Oct 09 08:10:43 crc kubenswrapper[4715]: E1009 08:10:43.855752 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27e11095ce34137b7ddf308c999767c4d8cd3f4b414bf9e0ad5c4d47e40ba443\": container with ID starting with 27e11095ce34137b7ddf308c999767c4d8cd3f4b414bf9e0ad5c4d47e40ba443 not found: ID does not exist" containerID="27e11095ce34137b7ddf308c999767c4d8cd3f4b414bf9e0ad5c4d47e40ba443" Oct 09 08:10:43 crc kubenswrapper[4715]: I1009 08:10:43.855789 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27e11095ce34137b7ddf308c999767c4d8cd3f4b414bf9e0ad5c4d47e40ba443"} err="failed to get container status \"27e11095ce34137b7ddf308c999767c4d8cd3f4b414bf9e0ad5c4d47e40ba443\": rpc error: code = NotFound desc = could not find 
container \"27e11095ce34137b7ddf308c999767c4d8cd3f4b414bf9e0ad5c4d47e40ba443\": container with ID starting with 27e11095ce34137b7ddf308c999767c4d8cd3f4b414bf9e0ad5c4d47e40ba443 not found: ID does not exist" Oct 09 08:10:43 crc kubenswrapper[4715]: I1009 08:10:43.855809 4715 scope.go:117] "RemoveContainer" containerID="d070198d0ddcacbbabfa004e7a8844711afc6d90ebc3a9c0b993732fe23be7d6" Oct 09 08:10:43 crc kubenswrapper[4715]: E1009 08:10:43.856165 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d070198d0ddcacbbabfa004e7a8844711afc6d90ebc3a9c0b993732fe23be7d6\": container with ID starting with d070198d0ddcacbbabfa004e7a8844711afc6d90ebc3a9c0b993732fe23be7d6 not found: ID does not exist" containerID="d070198d0ddcacbbabfa004e7a8844711afc6d90ebc3a9c0b993732fe23be7d6" Oct 09 08:10:43 crc kubenswrapper[4715]: I1009 08:10:43.856190 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d070198d0ddcacbbabfa004e7a8844711afc6d90ebc3a9c0b993732fe23be7d6"} err="failed to get container status \"d070198d0ddcacbbabfa004e7a8844711afc6d90ebc3a9c0b993732fe23be7d6\": rpc error: code = NotFound desc = could not find container \"d070198d0ddcacbbabfa004e7a8844711afc6d90ebc3a9c0b993732fe23be7d6\": container with ID starting with d070198d0ddcacbbabfa004e7a8844711afc6d90ebc3a9c0b993732fe23be7d6 not found: ID does not exist" Oct 09 08:10:43 crc kubenswrapper[4715]: I1009 08:10:43.856207 4715 scope.go:117] "RemoveContainer" containerID="87d3c14437f7d8d05d7725930d60efe8a441e1dff0da75aac2381cbac20c95bf" Oct 09 08:10:43 crc kubenswrapper[4715]: E1009 08:10:43.856604 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87d3c14437f7d8d05d7725930d60efe8a441e1dff0da75aac2381cbac20c95bf\": container with ID starting with 87d3c14437f7d8d05d7725930d60efe8a441e1dff0da75aac2381cbac20c95bf not found: ID does 
not exist" containerID="87d3c14437f7d8d05d7725930d60efe8a441e1dff0da75aac2381cbac20c95bf" Oct 09 08:10:43 crc kubenswrapper[4715]: I1009 08:10:43.856625 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87d3c14437f7d8d05d7725930d60efe8a441e1dff0da75aac2381cbac20c95bf"} err="failed to get container status \"87d3c14437f7d8d05d7725930d60efe8a441e1dff0da75aac2381cbac20c95bf\": rpc error: code = NotFound desc = could not find container \"87d3c14437f7d8d05d7725930d60efe8a441e1dff0da75aac2381cbac20c95bf\": container with ID starting with 87d3c14437f7d8d05d7725930d60efe8a441e1dff0da75aac2381cbac20c95bf not found: ID does not exist" Oct 09 08:10:44 crc kubenswrapper[4715]: I1009 08:10:44.098116 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jphqg" Oct 09 08:10:44 crc kubenswrapper[4715]: I1009 08:10:44.147791 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63a0cac4-cbce-4404-8f7c-45cdfd6335fa" path="/var/lib/kubelet/pods/63a0cac4-cbce-4404-8f7c-45cdfd6335fa/volumes" Oct 09 08:10:44 crc kubenswrapper[4715]: I1009 08:10:44.255850 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21e35629-c64d-4ef6-a570-7603aa8358fb-bootstrap-combined-ca-bundle\") pod \"21e35629-c64d-4ef6-a570-7603aa8358fb\" (UID: \"21e35629-c64d-4ef6-a570-7603aa8358fb\") " Oct 09 08:10:44 crc kubenswrapper[4715]: I1009 08:10:44.256267 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21e35629-c64d-4ef6-a570-7603aa8358fb-inventory\") pod \"21e35629-c64d-4ef6-a570-7603aa8358fb\" (UID: \"21e35629-c64d-4ef6-a570-7603aa8358fb\") " Oct 09 08:10:44 crc kubenswrapper[4715]: I1009 08:10:44.256457 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/21e35629-c64d-4ef6-a570-7603aa8358fb-ssh-key\") pod \"21e35629-c64d-4ef6-a570-7603aa8358fb\" (UID: \"21e35629-c64d-4ef6-a570-7603aa8358fb\") " Oct 09 08:10:44 crc kubenswrapper[4715]: I1009 08:10:44.256486 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jr7tp\" (UniqueName: \"kubernetes.io/projected/21e35629-c64d-4ef6-a570-7603aa8358fb-kube-api-access-jr7tp\") pod \"21e35629-c64d-4ef6-a570-7603aa8358fb\" (UID: \"21e35629-c64d-4ef6-a570-7603aa8358fb\") " Oct 09 08:10:44 crc kubenswrapper[4715]: I1009 08:10:44.260117 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21e35629-c64d-4ef6-a570-7603aa8358fb-kube-api-access-jr7tp" (OuterVolumeSpecName: "kube-api-access-jr7tp") pod "21e35629-c64d-4ef6-a570-7603aa8358fb" (UID: "21e35629-c64d-4ef6-a570-7603aa8358fb"). InnerVolumeSpecName "kube-api-access-jr7tp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:10:44 crc kubenswrapper[4715]: I1009 08:10:44.260333 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21e35629-c64d-4ef6-a570-7603aa8358fb-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "21e35629-c64d-4ef6-a570-7603aa8358fb" (UID: "21e35629-c64d-4ef6-a570-7603aa8358fb"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:10:44 crc kubenswrapper[4715]: I1009 08:10:44.283465 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21e35629-c64d-4ef6-a570-7603aa8358fb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "21e35629-c64d-4ef6-a570-7603aa8358fb" (UID: "21e35629-c64d-4ef6-a570-7603aa8358fb"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:10:44 crc kubenswrapper[4715]: I1009 08:10:44.290667 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21e35629-c64d-4ef6-a570-7603aa8358fb-inventory" (OuterVolumeSpecName: "inventory") pod "21e35629-c64d-4ef6-a570-7603aa8358fb" (UID: "21e35629-c64d-4ef6-a570-7603aa8358fb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:10:44 crc kubenswrapper[4715]: I1009 08:10:44.361741 4715 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/21e35629-c64d-4ef6-a570-7603aa8358fb-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 08:10:44 crc kubenswrapper[4715]: I1009 08:10:44.361789 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jr7tp\" (UniqueName: \"kubernetes.io/projected/21e35629-c64d-4ef6-a570-7603aa8358fb-kube-api-access-jr7tp\") on node \"crc\" DevicePath \"\"" Oct 09 08:10:44 crc kubenswrapper[4715]: I1009 08:10:44.361806 4715 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21e35629-c64d-4ef6-a570-7603aa8358fb-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:10:44 crc kubenswrapper[4715]: I1009 08:10:44.361821 4715 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21e35629-c64d-4ef6-a570-7603aa8358fb-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 08:10:44 crc kubenswrapper[4715]: I1009 08:10:44.761681 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jphqg" Oct 09 08:10:44 crc kubenswrapper[4715]: I1009 08:10:44.761674 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-jphqg" event={"ID":"21e35629-c64d-4ef6-a570-7603aa8358fb","Type":"ContainerDied","Data":"c4fd930a50e48d438841f1fc916cbebdf87e0d47823ab42180d00ccaeef4eb19"} Oct 09 08:10:44 crc kubenswrapper[4715]: I1009 08:10:44.761895 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4fd930a50e48d438841f1fc916cbebdf87e0d47823ab42180d00ccaeef4eb19" Oct 09 08:10:44 crc kubenswrapper[4715]: I1009 08:10:44.898079 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-hngw7"] Oct 09 08:10:44 crc kubenswrapper[4715]: E1009 08:10:44.898638 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63a0cac4-cbce-4404-8f7c-45cdfd6335fa" containerName="registry-server" Oct 09 08:10:44 crc kubenswrapper[4715]: I1009 08:10:44.898665 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="63a0cac4-cbce-4404-8f7c-45cdfd6335fa" containerName="registry-server" Oct 09 08:10:44 crc kubenswrapper[4715]: E1009 08:10:44.898688 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21e35629-c64d-4ef6-a570-7603aa8358fb" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 09 08:10:44 crc kubenswrapper[4715]: I1009 08:10:44.898703 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="21e35629-c64d-4ef6-a570-7603aa8358fb" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 09 08:10:44 crc kubenswrapper[4715]: E1009 08:10:44.898747 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63a0cac4-cbce-4404-8f7c-45cdfd6335fa" containerName="extract-utilities" Oct 09 08:10:44 crc kubenswrapper[4715]: I1009 08:10:44.898762 4715 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="63a0cac4-cbce-4404-8f7c-45cdfd6335fa" containerName="extract-utilities" Oct 09 08:10:44 crc kubenswrapper[4715]: E1009 08:10:44.898785 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63a0cac4-cbce-4404-8f7c-45cdfd6335fa" containerName="extract-content" Oct 09 08:10:44 crc kubenswrapper[4715]: I1009 08:10:44.898797 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="63a0cac4-cbce-4404-8f7c-45cdfd6335fa" containerName="extract-content" Oct 09 08:10:44 crc kubenswrapper[4715]: I1009 08:10:44.899173 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="21e35629-c64d-4ef6-a570-7603aa8358fb" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 09 08:10:44 crc kubenswrapper[4715]: I1009 08:10:44.899243 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="63a0cac4-cbce-4404-8f7c-45cdfd6335fa" containerName="registry-server" Oct 09 08:10:44 crc kubenswrapper[4715]: I1009 08:10:44.900677 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-hngw7" Oct 09 08:10:44 crc kubenswrapper[4715]: I1009 08:10:44.904284 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 08:10:44 crc kubenswrapper[4715]: I1009 08:10:44.904471 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 08:10:44 crc kubenswrapper[4715]: I1009 08:10:44.905727 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fjb" Oct 09 08:10:44 crc kubenswrapper[4715]: I1009 08:10:44.913024 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 08:10:44 crc kubenswrapper[4715]: I1009 08:10:44.918548 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-hngw7"] Oct 09 08:10:45 crc kubenswrapper[4715]: I1009 08:10:45.075238 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a796e5b4-3af9-4286-8e0f-44f5a026dc47-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-hngw7\" (UID: \"a796e5b4-3af9-4286-8e0f-44f5a026dc47\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-hngw7" Oct 09 08:10:45 crc kubenswrapper[4715]: I1009 08:10:45.075714 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9t7v\" (UniqueName: \"kubernetes.io/projected/a796e5b4-3af9-4286-8e0f-44f5a026dc47-kube-api-access-s9t7v\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-hngw7\" (UID: \"a796e5b4-3af9-4286-8e0f-44f5a026dc47\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-hngw7" Oct 09 08:10:45 crc kubenswrapper[4715]: I1009 08:10:45.075946 
4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a796e5b4-3af9-4286-8e0f-44f5a026dc47-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-hngw7\" (UID: \"a796e5b4-3af9-4286-8e0f-44f5a026dc47\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-hngw7" Oct 09 08:10:45 crc kubenswrapper[4715]: I1009 08:10:45.177552 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a796e5b4-3af9-4286-8e0f-44f5a026dc47-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-hngw7\" (UID: \"a796e5b4-3af9-4286-8e0f-44f5a026dc47\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-hngw7" Oct 09 08:10:45 crc kubenswrapper[4715]: I1009 08:10:45.177870 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9t7v\" (UniqueName: \"kubernetes.io/projected/a796e5b4-3af9-4286-8e0f-44f5a026dc47-kube-api-access-s9t7v\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-hngw7\" (UID: \"a796e5b4-3af9-4286-8e0f-44f5a026dc47\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-hngw7" Oct 09 08:10:45 crc kubenswrapper[4715]: I1009 08:10:45.178070 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a796e5b4-3af9-4286-8e0f-44f5a026dc47-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-hngw7\" (UID: \"a796e5b4-3af9-4286-8e0f-44f5a026dc47\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-hngw7" Oct 09 08:10:45 crc kubenswrapper[4715]: I1009 08:10:45.188733 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a796e5b4-3af9-4286-8e0f-44f5a026dc47-ssh-key\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-hngw7\" (UID: \"a796e5b4-3af9-4286-8e0f-44f5a026dc47\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-hngw7" Oct 09 08:10:45 crc kubenswrapper[4715]: I1009 08:10:45.190813 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a796e5b4-3af9-4286-8e0f-44f5a026dc47-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-hngw7\" (UID: \"a796e5b4-3af9-4286-8e0f-44f5a026dc47\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-hngw7" Oct 09 08:10:45 crc kubenswrapper[4715]: I1009 08:10:45.200291 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9t7v\" (UniqueName: \"kubernetes.io/projected/a796e5b4-3af9-4286-8e0f-44f5a026dc47-kube-api-access-s9t7v\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-hngw7\" (UID: \"a796e5b4-3af9-4286-8e0f-44f5a026dc47\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-hngw7" Oct 09 08:10:45 crc kubenswrapper[4715]: I1009 08:10:45.225004 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-hngw7" Oct 09 08:10:45 crc kubenswrapper[4715]: I1009 08:10:45.813554 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-hngw7"] Oct 09 08:10:45 crc kubenswrapper[4715]: W1009 08:10:45.821704 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda796e5b4_3af9_4286_8e0f_44f5a026dc47.slice/crio-ed04ec6959ebb5370c3c0c3e831feb453b986a4344146f9c0bebfb2efb242354 WatchSource:0}: Error finding container ed04ec6959ebb5370c3c0c3e831feb453b986a4344146f9c0bebfb2efb242354: Status 404 returned error can't find the container with id ed04ec6959ebb5370c3c0c3e831feb453b986a4344146f9c0bebfb2efb242354 Oct 09 08:10:46 crc kubenswrapper[4715]: I1009 08:10:46.783639 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-hngw7" event={"ID":"a796e5b4-3af9-4286-8e0f-44f5a026dc47","Type":"ContainerStarted","Data":"520a6c3913936562c7af2560b9f42c7a6c07b675f6ce6556667b79afc9e83ea9"} Oct 09 08:10:46 crc kubenswrapper[4715]: I1009 08:10:46.783976 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-hngw7" event={"ID":"a796e5b4-3af9-4286-8e0f-44f5a026dc47","Type":"ContainerStarted","Data":"ed04ec6959ebb5370c3c0c3e831feb453b986a4344146f9c0bebfb2efb242354"} Oct 09 08:10:46 crc kubenswrapper[4715]: I1009 08:10:46.810908 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-hngw7" podStartSLOduration=2.343484716 podStartE2EDuration="2.810889252s" podCreationTimestamp="2025-10-09 08:10:44 +0000 UTC" firstStartedPulling="2025-10-09 08:10:45.826271477 +0000 UTC m=+1476.519075495" lastFinishedPulling="2025-10-09 08:10:46.293676013 +0000 UTC 
m=+1476.986480031" observedRunningTime="2025-10-09 08:10:46.806026064 +0000 UTC m=+1477.498830082" watchObservedRunningTime="2025-10-09 08:10:46.810889252 +0000 UTC m=+1477.503693260" Oct 09 08:10:47 crc kubenswrapper[4715]: I1009 08:10:47.048317 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hll64" Oct 09 08:10:47 crc kubenswrapper[4715]: I1009 08:10:47.048379 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hll64" Oct 09 08:10:47 crc kubenswrapper[4715]: I1009 08:10:47.099833 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9z69x"] Oct 09 08:10:47 crc kubenswrapper[4715]: I1009 08:10:47.100184 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9z69x" podUID="a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab" containerName="registry-server" containerID="cri-o://ec12f3ae6358f8ca8deb3baeaf7a4ea53944a12e4ddbb37ced8105679dcfa3d0" gracePeriod=2 Oct 09 08:10:47 crc kubenswrapper[4715]: I1009 08:10:47.125613 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hll64" Oct 09 08:10:47 crc kubenswrapper[4715]: I1009 08:10:47.577354 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9z69x" Oct 09 08:10:47 crc kubenswrapper[4715]: I1009 08:10:47.729263 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5s7z\" (UniqueName: \"kubernetes.io/projected/a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab-kube-api-access-h5s7z\") pod \"a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab\" (UID: \"a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab\") " Oct 09 08:10:47 crc kubenswrapper[4715]: I1009 08:10:47.729551 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab-utilities\") pod \"a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab\" (UID: \"a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab\") " Oct 09 08:10:47 crc kubenswrapper[4715]: I1009 08:10:47.729622 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab-catalog-content\") pod \"a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab\" (UID: \"a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab\") " Oct 09 08:10:47 crc kubenswrapper[4715]: I1009 08:10:47.730372 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab-utilities" (OuterVolumeSpecName: "utilities") pod "a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab" (UID: "a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:10:47 crc kubenswrapper[4715]: I1009 08:10:47.734487 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab-kube-api-access-h5s7z" (OuterVolumeSpecName: "kube-api-access-h5s7z") pod "a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab" (UID: "a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab"). InnerVolumeSpecName "kube-api-access-h5s7z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:10:47 crc kubenswrapper[4715]: I1009 08:10:47.798761 4715 generic.go:334] "Generic (PLEG): container finished" podID="a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab" containerID="ec12f3ae6358f8ca8deb3baeaf7a4ea53944a12e4ddbb37ced8105679dcfa3d0" exitCode=0 Oct 09 08:10:47 crc kubenswrapper[4715]: I1009 08:10:47.798828 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9z69x" Oct 09 08:10:47 crc kubenswrapper[4715]: I1009 08:10:47.798881 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9z69x" event={"ID":"a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab","Type":"ContainerDied","Data":"ec12f3ae6358f8ca8deb3baeaf7a4ea53944a12e4ddbb37ced8105679dcfa3d0"} Oct 09 08:10:47 crc kubenswrapper[4715]: I1009 08:10:47.798908 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9z69x" event={"ID":"a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab","Type":"ContainerDied","Data":"0119dc6edbb084542d433d58a7138cb4d4808c67e7813b8899f513bfcf4f8067"} Oct 09 08:10:47 crc kubenswrapper[4715]: I1009 08:10:47.798925 4715 scope.go:117] "RemoveContainer" containerID="ec12f3ae6358f8ca8deb3baeaf7a4ea53944a12e4ddbb37ced8105679dcfa3d0" Oct 09 08:10:47 crc kubenswrapper[4715]: I1009 08:10:47.832081 4715 scope.go:117] "RemoveContainer" containerID="1554ebf1d790280dc1ea4e659370c1f8b6b08dc5098acba495fba7f9fbe9ed5f" Oct 09 08:10:47 crc kubenswrapper[4715]: I1009 08:10:47.832370 4715 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 08:10:47 crc kubenswrapper[4715]: I1009 08:10:47.832402 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5s7z\" (UniqueName: 
\"kubernetes.io/projected/a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab-kube-api-access-h5s7z\") on node \"crc\" DevicePath \"\"" Oct 09 08:10:47 crc kubenswrapper[4715]: I1009 08:10:47.838985 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab" (UID: "a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:10:47 crc kubenswrapper[4715]: I1009 08:10:47.854272 4715 scope.go:117] "RemoveContainer" containerID="d3bab605e4c68c6a8277dd2eee92769658902ac7bbd54fb6b25006d1e6955491" Oct 09 08:10:47 crc kubenswrapper[4715]: I1009 08:10:47.854683 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hll64" Oct 09 08:10:47 crc kubenswrapper[4715]: I1009 08:10:47.901185 4715 scope.go:117] "RemoveContainer" containerID="ec12f3ae6358f8ca8deb3baeaf7a4ea53944a12e4ddbb37ced8105679dcfa3d0" Oct 09 08:10:47 crc kubenswrapper[4715]: E1009 08:10:47.901666 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec12f3ae6358f8ca8deb3baeaf7a4ea53944a12e4ddbb37ced8105679dcfa3d0\": container with ID starting with ec12f3ae6358f8ca8deb3baeaf7a4ea53944a12e4ddbb37ced8105679dcfa3d0 not found: ID does not exist" containerID="ec12f3ae6358f8ca8deb3baeaf7a4ea53944a12e4ddbb37ced8105679dcfa3d0" Oct 09 08:10:47 crc kubenswrapper[4715]: I1009 08:10:47.901696 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec12f3ae6358f8ca8deb3baeaf7a4ea53944a12e4ddbb37ced8105679dcfa3d0"} err="failed to get container status \"ec12f3ae6358f8ca8deb3baeaf7a4ea53944a12e4ddbb37ced8105679dcfa3d0\": rpc error: code = NotFound desc = could not find container 
\"ec12f3ae6358f8ca8deb3baeaf7a4ea53944a12e4ddbb37ced8105679dcfa3d0\": container with ID starting with ec12f3ae6358f8ca8deb3baeaf7a4ea53944a12e4ddbb37ced8105679dcfa3d0 not found: ID does not exist" Oct 09 08:10:47 crc kubenswrapper[4715]: I1009 08:10:47.901716 4715 scope.go:117] "RemoveContainer" containerID="1554ebf1d790280dc1ea4e659370c1f8b6b08dc5098acba495fba7f9fbe9ed5f" Oct 09 08:10:47 crc kubenswrapper[4715]: E1009 08:10:47.902059 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1554ebf1d790280dc1ea4e659370c1f8b6b08dc5098acba495fba7f9fbe9ed5f\": container with ID starting with 1554ebf1d790280dc1ea4e659370c1f8b6b08dc5098acba495fba7f9fbe9ed5f not found: ID does not exist" containerID="1554ebf1d790280dc1ea4e659370c1f8b6b08dc5098acba495fba7f9fbe9ed5f" Oct 09 08:10:47 crc kubenswrapper[4715]: I1009 08:10:47.902089 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1554ebf1d790280dc1ea4e659370c1f8b6b08dc5098acba495fba7f9fbe9ed5f"} err="failed to get container status \"1554ebf1d790280dc1ea4e659370c1f8b6b08dc5098acba495fba7f9fbe9ed5f\": rpc error: code = NotFound desc = could not find container \"1554ebf1d790280dc1ea4e659370c1f8b6b08dc5098acba495fba7f9fbe9ed5f\": container with ID starting with 1554ebf1d790280dc1ea4e659370c1f8b6b08dc5098acba495fba7f9fbe9ed5f not found: ID does not exist" Oct 09 08:10:47 crc kubenswrapper[4715]: I1009 08:10:47.902105 4715 scope.go:117] "RemoveContainer" containerID="d3bab605e4c68c6a8277dd2eee92769658902ac7bbd54fb6b25006d1e6955491" Oct 09 08:10:47 crc kubenswrapper[4715]: E1009 08:10:47.902358 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3bab605e4c68c6a8277dd2eee92769658902ac7bbd54fb6b25006d1e6955491\": container with ID starting with d3bab605e4c68c6a8277dd2eee92769658902ac7bbd54fb6b25006d1e6955491 not found: ID does not exist" 
containerID="d3bab605e4c68c6a8277dd2eee92769658902ac7bbd54fb6b25006d1e6955491" Oct 09 08:10:47 crc kubenswrapper[4715]: I1009 08:10:47.902377 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3bab605e4c68c6a8277dd2eee92769658902ac7bbd54fb6b25006d1e6955491"} err="failed to get container status \"d3bab605e4c68c6a8277dd2eee92769658902ac7bbd54fb6b25006d1e6955491\": rpc error: code = NotFound desc = could not find container \"d3bab605e4c68c6a8277dd2eee92769658902ac7bbd54fb6b25006d1e6955491\": container with ID starting with d3bab605e4c68c6a8277dd2eee92769658902ac7bbd54fb6b25006d1e6955491 not found: ID does not exist" Oct 09 08:10:47 crc kubenswrapper[4715]: I1009 08:10:47.934563 4715 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 08:10:48 crc kubenswrapper[4715]: I1009 08:10:48.134545 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9z69x"] Oct 09 08:10:48 crc kubenswrapper[4715]: I1009 08:10:48.160222 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9z69x"] Oct 09 08:10:49 crc kubenswrapper[4715]: I1009 08:10:49.497464 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hll64"] Oct 09 08:10:49 crc kubenswrapper[4715]: I1009 08:10:49.818743 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hll64" podUID="589c8e63-9e89-4e92-8f88-7d14fde116e5" containerName="registry-server" containerID="cri-o://3da349d8d5474ca0a6bbc1668bbae890cec8ed319e04a683ad7955224602c2fb" gracePeriod=2 Oct 09 08:10:50 crc kubenswrapper[4715]: I1009 08:10:50.148578 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab" path="/var/lib/kubelet/pods/a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab/volumes" Oct 09 08:10:50 crc kubenswrapper[4715]: I1009 08:10:50.234836 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hll64" Oct 09 08:10:50 crc kubenswrapper[4715]: I1009 08:10:50.381146 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/589c8e63-9e89-4e92-8f88-7d14fde116e5-utilities\") pod \"589c8e63-9e89-4e92-8f88-7d14fde116e5\" (UID: \"589c8e63-9e89-4e92-8f88-7d14fde116e5\") " Oct 09 08:10:50 crc kubenswrapper[4715]: I1009 08:10:50.381316 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5klhk\" (UniqueName: \"kubernetes.io/projected/589c8e63-9e89-4e92-8f88-7d14fde116e5-kube-api-access-5klhk\") pod \"589c8e63-9e89-4e92-8f88-7d14fde116e5\" (UID: \"589c8e63-9e89-4e92-8f88-7d14fde116e5\") " Oct 09 08:10:50 crc kubenswrapper[4715]: I1009 08:10:50.381414 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/589c8e63-9e89-4e92-8f88-7d14fde116e5-catalog-content\") pod \"589c8e63-9e89-4e92-8f88-7d14fde116e5\" (UID: \"589c8e63-9e89-4e92-8f88-7d14fde116e5\") " Oct 09 08:10:50 crc kubenswrapper[4715]: I1009 08:10:50.383331 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/589c8e63-9e89-4e92-8f88-7d14fde116e5-utilities" (OuterVolumeSpecName: "utilities") pod "589c8e63-9e89-4e92-8f88-7d14fde116e5" (UID: "589c8e63-9e89-4e92-8f88-7d14fde116e5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:10:50 crc kubenswrapper[4715]: I1009 08:10:50.387472 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/589c8e63-9e89-4e92-8f88-7d14fde116e5-kube-api-access-5klhk" (OuterVolumeSpecName: "kube-api-access-5klhk") pod "589c8e63-9e89-4e92-8f88-7d14fde116e5" (UID: "589c8e63-9e89-4e92-8f88-7d14fde116e5"). InnerVolumeSpecName "kube-api-access-5klhk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:10:50 crc kubenswrapper[4715]: I1009 08:10:50.446256 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/589c8e63-9e89-4e92-8f88-7d14fde116e5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "589c8e63-9e89-4e92-8f88-7d14fde116e5" (UID: "589c8e63-9e89-4e92-8f88-7d14fde116e5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:10:50 crc kubenswrapper[4715]: I1009 08:10:50.484632 4715 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/589c8e63-9e89-4e92-8f88-7d14fde116e5-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 08:10:50 crc kubenswrapper[4715]: I1009 08:10:50.484677 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5klhk\" (UniqueName: \"kubernetes.io/projected/589c8e63-9e89-4e92-8f88-7d14fde116e5-kube-api-access-5klhk\") on node \"crc\" DevicePath \"\"" Oct 09 08:10:50 crc kubenswrapper[4715]: I1009 08:10:50.484696 4715 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/589c8e63-9e89-4e92-8f88-7d14fde116e5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 08:10:50 crc kubenswrapper[4715]: I1009 08:10:50.832606 4715 generic.go:334] "Generic (PLEG): container finished" podID="589c8e63-9e89-4e92-8f88-7d14fde116e5" 
containerID="3da349d8d5474ca0a6bbc1668bbae890cec8ed319e04a683ad7955224602c2fb" exitCode=0 Oct 09 08:10:50 crc kubenswrapper[4715]: I1009 08:10:50.832651 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hll64" event={"ID":"589c8e63-9e89-4e92-8f88-7d14fde116e5","Type":"ContainerDied","Data":"3da349d8d5474ca0a6bbc1668bbae890cec8ed319e04a683ad7955224602c2fb"} Oct 09 08:10:50 crc kubenswrapper[4715]: I1009 08:10:50.832682 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hll64" event={"ID":"589c8e63-9e89-4e92-8f88-7d14fde116e5","Type":"ContainerDied","Data":"f782a522c91ed4a82074cede8ef18c1b34f4f776581e89f2387cd113fa3fb50a"} Oct 09 08:10:50 crc kubenswrapper[4715]: I1009 08:10:50.832702 4715 scope.go:117] "RemoveContainer" containerID="3da349d8d5474ca0a6bbc1668bbae890cec8ed319e04a683ad7955224602c2fb" Oct 09 08:10:50 crc kubenswrapper[4715]: I1009 08:10:50.832865 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hll64" Oct 09 08:10:50 crc kubenswrapper[4715]: I1009 08:10:50.865580 4715 scope.go:117] "RemoveContainer" containerID="8d5ce2760e840d8299e5a6904e6f0fd25f437bbaf73405c67efef273b04e7d02" Oct 09 08:10:50 crc kubenswrapper[4715]: I1009 08:10:50.867789 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hll64"] Oct 09 08:10:50 crc kubenswrapper[4715]: I1009 08:10:50.875960 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hll64"] Oct 09 08:10:50 crc kubenswrapper[4715]: I1009 08:10:50.904990 4715 scope.go:117] "RemoveContainer" containerID="d4d4a54d4e7ddd3adc60a079c3d1e6170ae1c17e53775bd914d92bcf39825f0d" Oct 09 08:10:50 crc kubenswrapper[4715]: I1009 08:10:50.947543 4715 scope.go:117] "RemoveContainer" containerID="3da349d8d5474ca0a6bbc1668bbae890cec8ed319e04a683ad7955224602c2fb" Oct 09 08:10:50 crc kubenswrapper[4715]: E1009 08:10:50.948385 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3da349d8d5474ca0a6bbc1668bbae890cec8ed319e04a683ad7955224602c2fb\": container with ID starting with 3da349d8d5474ca0a6bbc1668bbae890cec8ed319e04a683ad7955224602c2fb not found: ID does not exist" containerID="3da349d8d5474ca0a6bbc1668bbae890cec8ed319e04a683ad7955224602c2fb" Oct 09 08:10:50 crc kubenswrapper[4715]: I1009 08:10:50.948459 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3da349d8d5474ca0a6bbc1668bbae890cec8ed319e04a683ad7955224602c2fb"} err="failed to get container status \"3da349d8d5474ca0a6bbc1668bbae890cec8ed319e04a683ad7955224602c2fb\": rpc error: code = NotFound desc = could not find container \"3da349d8d5474ca0a6bbc1668bbae890cec8ed319e04a683ad7955224602c2fb\": container with ID starting with 3da349d8d5474ca0a6bbc1668bbae890cec8ed319e04a683ad7955224602c2fb not 
found: ID does not exist" Oct 09 08:10:50 crc kubenswrapper[4715]: I1009 08:10:50.948492 4715 scope.go:117] "RemoveContainer" containerID="8d5ce2760e840d8299e5a6904e6f0fd25f437bbaf73405c67efef273b04e7d02" Oct 09 08:10:50 crc kubenswrapper[4715]: E1009 08:10:50.949011 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d5ce2760e840d8299e5a6904e6f0fd25f437bbaf73405c67efef273b04e7d02\": container with ID starting with 8d5ce2760e840d8299e5a6904e6f0fd25f437bbaf73405c67efef273b04e7d02 not found: ID does not exist" containerID="8d5ce2760e840d8299e5a6904e6f0fd25f437bbaf73405c67efef273b04e7d02" Oct 09 08:10:50 crc kubenswrapper[4715]: I1009 08:10:50.949041 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d5ce2760e840d8299e5a6904e6f0fd25f437bbaf73405c67efef273b04e7d02"} err="failed to get container status \"8d5ce2760e840d8299e5a6904e6f0fd25f437bbaf73405c67efef273b04e7d02\": rpc error: code = NotFound desc = could not find container \"8d5ce2760e840d8299e5a6904e6f0fd25f437bbaf73405c67efef273b04e7d02\": container with ID starting with 8d5ce2760e840d8299e5a6904e6f0fd25f437bbaf73405c67efef273b04e7d02 not found: ID does not exist" Oct 09 08:10:50 crc kubenswrapper[4715]: I1009 08:10:50.949061 4715 scope.go:117] "RemoveContainer" containerID="d4d4a54d4e7ddd3adc60a079c3d1e6170ae1c17e53775bd914d92bcf39825f0d" Oct 09 08:10:50 crc kubenswrapper[4715]: E1009 08:10:50.949477 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4d4a54d4e7ddd3adc60a079c3d1e6170ae1c17e53775bd914d92bcf39825f0d\": container with ID starting with d4d4a54d4e7ddd3adc60a079c3d1e6170ae1c17e53775bd914d92bcf39825f0d not found: ID does not exist" containerID="d4d4a54d4e7ddd3adc60a079c3d1e6170ae1c17e53775bd914d92bcf39825f0d" Oct 09 08:10:50 crc kubenswrapper[4715]: I1009 08:10:50.949531 4715 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4d4a54d4e7ddd3adc60a079c3d1e6170ae1c17e53775bd914d92bcf39825f0d"} err="failed to get container status \"d4d4a54d4e7ddd3adc60a079c3d1e6170ae1c17e53775bd914d92bcf39825f0d\": rpc error: code = NotFound desc = could not find container \"d4d4a54d4e7ddd3adc60a079c3d1e6170ae1c17e53775bd914d92bcf39825f0d\": container with ID starting with d4d4a54d4e7ddd3adc60a079c3d1e6170ae1c17e53775bd914d92bcf39825f0d not found: ID does not exist" Oct 09 08:10:52 crc kubenswrapper[4715]: I1009 08:10:52.157784 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="589c8e63-9e89-4e92-8f88-7d14fde116e5" path="/var/lib/kubelet/pods/589c8e63-9e89-4e92-8f88-7d14fde116e5/volumes" Oct 09 08:11:05 crc kubenswrapper[4715]: I1009 08:11:05.861545 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zxbqp"] Oct 09 08:11:05 crc kubenswrapper[4715]: E1009 08:11:05.864290 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="589c8e63-9e89-4e92-8f88-7d14fde116e5" containerName="extract-utilities" Oct 09 08:11:05 crc kubenswrapper[4715]: I1009 08:11:05.864310 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="589c8e63-9e89-4e92-8f88-7d14fde116e5" containerName="extract-utilities" Oct 09 08:11:05 crc kubenswrapper[4715]: E1009 08:11:05.864345 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab" containerName="extract-utilities" Oct 09 08:11:05 crc kubenswrapper[4715]: I1009 08:11:05.864356 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab" containerName="extract-utilities" Oct 09 08:11:05 crc kubenswrapper[4715]: E1009 08:11:05.864383 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab" containerName="registry-server" Oct 09 08:11:05 crc kubenswrapper[4715]: I1009 
08:11:05.864395 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab" containerName="registry-server" Oct 09 08:11:05 crc kubenswrapper[4715]: E1009 08:11:05.864413 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="589c8e63-9e89-4e92-8f88-7d14fde116e5" containerName="extract-content" Oct 09 08:11:05 crc kubenswrapper[4715]: I1009 08:11:05.864449 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="589c8e63-9e89-4e92-8f88-7d14fde116e5" containerName="extract-content" Oct 09 08:11:05 crc kubenswrapper[4715]: E1009 08:11:05.864465 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="589c8e63-9e89-4e92-8f88-7d14fde116e5" containerName="registry-server" Oct 09 08:11:05 crc kubenswrapper[4715]: I1009 08:11:05.864473 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="589c8e63-9e89-4e92-8f88-7d14fde116e5" containerName="registry-server" Oct 09 08:11:05 crc kubenswrapper[4715]: E1009 08:11:05.864503 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab" containerName="extract-content" Oct 09 08:11:05 crc kubenswrapper[4715]: I1009 08:11:05.864513 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab" containerName="extract-content" Oct 09 08:11:05 crc kubenswrapper[4715]: I1009 08:11:05.864766 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="589c8e63-9e89-4e92-8f88-7d14fde116e5" containerName="registry-server" Oct 09 08:11:05 crc kubenswrapper[4715]: I1009 08:11:05.864808 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="a91099e7-a9fa-4bc8-b2cd-2b6dde66e5ab" containerName="registry-server" Oct 09 08:11:05 crc kubenswrapper[4715]: I1009 08:11:05.866515 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zxbqp" Oct 09 08:11:05 crc kubenswrapper[4715]: I1009 08:11:05.876236 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zxbqp"] Oct 09 08:11:05 crc kubenswrapper[4715]: I1009 08:11:05.951921 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bb83ec3-8754-4a19-a79a-5b4cd2041b34-utilities\") pod \"certified-operators-zxbqp\" (UID: \"9bb83ec3-8754-4a19-a79a-5b4cd2041b34\") " pod="openshift-marketplace/certified-operators-zxbqp" Oct 09 08:11:05 crc kubenswrapper[4715]: I1009 08:11:05.952085 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bb83ec3-8754-4a19-a79a-5b4cd2041b34-catalog-content\") pod \"certified-operators-zxbqp\" (UID: \"9bb83ec3-8754-4a19-a79a-5b4cd2041b34\") " pod="openshift-marketplace/certified-operators-zxbqp" Oct 09 08:11:05 crc kubenswrapper[4715]: I1009 08:11:05.952142 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgdkr\" (UniqueName: \"kubernetes.io/projected/9bb83ec3-8754-4a19-a79a-5b4cd2041b34-kube-api-access-bgdkr\") pod \"certified-operators-zxbqp\" (UID: \"9bb83ec3-8754-4a19-a79a-5b4cd2041b34\") " pod="openshift-marketplace/certified-operators-zxbqp" Oct 09 08:11:06 crc kubenswrapper[4715]: I1009 08:11:06.053328 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bb83ec3-8754-4a19-a79a-5b4cd2041b34-catalog-content\") pod \"certified-operators-zxbqp\" (UID: \"9bb83ec3-8754-4a19-a79a-5b4cd2041b34\") " pod="openshift-marketplace/certified-operators-zxbqp" Oct 09 08:11:06 crc kubenswrapper[4715]: I1009 08:11:06.053760 4715 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bgdkr\" (UniqueName: \"kubernetes.io/projected/9bb83ec3-8754-4a19-a79a-5b4cd2041b34-kube-api-access-bgdkr\") pod \"certified-operators-zxbqp\" (UID: \"9bb83ec3-8754-4a19-a79a-5b4cd2041b34\") " pod="openshift-marketplace/certified-operators-zxbqp" Oct 09 08:11:06 crc kubenswrapper[4715]: I1009 08:11:06.053866 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bb83ec3-8754-4a19-a79a-5b4cd2041b34-utilities\") pod \"certified-operators-zxbqp\" (UID: \"9bb83ec3-8754-4a19-a79a-5b4cd2041b34\") " pod="openshift-marketplace/certified-operators-zxbqp" Oct 09 08:11:06 crc kubenswrapper[4715]: I1009 08:11:06.054406 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bb83ec3-8754-4a19-a79a-5b4cd2041b34-utilities\") pod \"certified-operators-zxbqp\" (UID: \"9bb83ec3-8754-4a19-a79a-5b4cd2041b34\") " pod="openshift-marketplace/certified-operators-zxbqp" Oct 09 08:11:06 crc kubenswrapper[4715]: I1009 08:11:06.054439 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bb83ec3-8754-4a19-a79a-5b4cd2041b34-catalog-content\") pod \"certified-operators-zxbqp\" (UID: \"9bb83ec3-8754-4a19-a79a-5b4cd2041b34\") " pod="openshift-marketplace/certified-operators-zxbqp" Oct 09 08:11:06 crc kubenswrapper[4715]: I1009 08:11:06.073740 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgdkr\" (UniqueName: \"kubernetes.io/projected/9bb83ec3-8754-4a19-a79a-5b4cd2041b34-kube-api-access-bgdkr\") pod \"certified-operators-zxbqp\" (UID: \"9bb83ec3-8754-4a19-a79a-5b4cd2041b34\") " pod="openshift-marketplace/certified-operators-zxbqp" Oct 09 08:11:06 crc kubenswrapper[4715]: I1009 08:11:06.199741 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zxbqp" Oct 09 08:11:06 crc kubenswrapper[4715]: I1009 08:11:06.696405 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zxbqp"] Oct 09 08:11:06 crc kubenswrapper[4715]: W1009 08:11:06.699806 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bb83ec3_8754_4a19_a79a_5b4cd2041b34.slice/crio-6676b3c1188753c406eb3b276e9d1d94b21bc198a7d4355e9394bdc22de7aaa4 WatchSource:0}: Error finding container 6676b3c1188753c406eb3b276e9d1d94b21bc198a7d4355e9394bdc22de7aaa4: Status 404 returned error can't find the container with id 6676b3c1188753c406eb3b276e9d1d94b21bc198a7d4355e9394bdc22de7aaa4 Oct 09 08:11:06 crc kubenswrapper[4715]: I1009 08:11:06.988571 4715 generic.go:334] "Generic (PLEG): container finished" podID="9bb83ec3-8754-4a19-a79a-5b4cd2041b34" containerID="a927a65f3b859031800df548b94007d428f15daed1b857fd1436badf9ff29188" exitCode=0 Oct 09 08:11:06 crc kubenswrapper[4715]: I1009 08:11:06.988643 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zxbqp" event={"ID":"9bb83ec3-8754-4a19-a79a-5b4cd2041b34","Type":"ContainerDied","Data":"a927a65f3b859031800df548b94007d428f15daed1b857fd1436badf9ff29188"} Oct 09 08:11:06 crc kubenswrapper[4715]: I1009 08:11:06.988744 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zxbqp" event={"ID":"9bb83ec3-8754-4a19-a79a-5b4cd2041b34","Type":"ContainerStarted","Data":"6676b3c1188753c406eb3b276e9d1d94b21bc198a7d4355e9394bdc22de7aaa4"} Oct 09 08:11:08 crc kubenswrapper[4715]: I1009 08:11:08.012952 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zxbqp" 
event={"ID":"9bb83ec3-8754-4a19-a79a-5b4cd2041b34","Type":"ContainerStarted","Data":"b9135fea5e343ae42123a34c3f9b27720dc8cdfd9bcc93345e0fdc7334180ba3"} Oct 09 08:11:09 crc kubenswrapper[4715]: I1009 08:11:09.032836 4715 generic.go:334] "Generic (PLEG): container finished" podID="9bb83ec3-8754-4a19-a79a-5b4cd2041b34" containerID="b9135fea5e343ae42123a34c3f9b27720dc8cdfd9bcc93345e0fdc7334180ba3" exitCode=0 Oct 09 08:11:09 crc kubenswrapper[4715]: I1009 08:11:09.032994 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zxbqp" event={"ID":"9bb83ec3-8754-4a19-a79a-5b4cd2041b34","Type":"ContainerDied","Data":"b9135fea5e343ae42123a34c3f9b27720dc8cdfd9bcc93345e0fdc7334180ba3"} Oct 09 08:11:10 crc kubenswrapper[4715]: I1009 08:11:10.047205 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zxbqp" event={"ID":"9bb83ec3-8754-4a19-a79a-5b4cd2041b34","Type":"ContainerStarted","Data":"d8ffe683256dc115c7276f0e0496e97c5f0be01b9ff1378d7ef699a85e504998"} Oct 09 08:11:10 crc kubenswrapper[4715]: I1009 08:11:10.070937 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zxbqp" podStartSLOduration=2.478417319 podStartE2EDuration="5.07091521s" podCreationTimestamp="2025-10-09 08:11:05 +0000 UTC" firstStartedPulling="2025-10-09 08:11:06.990008383 +0000 UTC m=+1497.682812391" lastFinishedPulling="2025-10-09 08:11:09.582506264 +0000 UTC m=+1500.275310282" observedRunningTime="2025-10-09 08:11:10.064560519 +0000 UTC m=+1500.757364557" watchObservedRunningTime="2025-10-09 08:11:10.07091521 +0000 UTC m=+1500.763719218" Oct 09 08:11:11 crc kubenswrapper[4715]: I1009 08:11:11.890713 4715 scope.go:117] "RemoveContainer" containerID="edc3751b9ca29e525c3cfbf9387b6fcbcca2c661e93118c7132634c8144fb48c" Oct 09 08:11:11 crc kubenswrapper[4715]: I1009 08:11:11.918954 4715 scope.go:117] "RemoveContainer" 
containerID="fc614e5e991ca86933f54913c851b597dbc1cbfde4ce77363a916f0b909fe727" Oct 09 08:11:16 crc kubenswrapper[4715]: I1009 08:11:16.200180 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zxbqp" Oct 09 08:11:16 crc kubenswrapper[4715]: I1009 08:11:16.200718 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zxbqp" Oct 09 08:11:16 crc kubenswrapper[4715]: I1009 08:11:16.243786 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zxbqp" Oct 09 08:11:17 crc kubenswrapper[4715]: I1009 08:11:17.155581 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zxbqp" Oct 09 08:11:17 crc kubenswrapper[4715]: I1009 08:11:17.208358 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zxbqp"] Oct 09 08:11:19 crc kubenswrapper[4715]: I1009 08:11:19.131330 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zxbqp" podUID="9bb83ec3-8754-4a19-a79a-5b4cd2041b34" containerName="registry-server" containerID="cri-o://d8ffe683256dc115c7276f0e0496e97c5f0be01b9ff1378d7ef699a85e504998" gracePeriod=2 Oct 09 08:11:19 crc kubenswrapper[4715]: I1009 08:11:19.583457 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zxbqp" Oct 09 08:11:19 crc kubenswrapper[4715]: I1009 08:11:19.731227 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgdkr\" (UniqueName: \"kubernetes.io/projected/9bb83ec3-8754-4a19-a79a-5b4cd2041b34-kube-api-access-bgdkr\") pod \"9bb83ec3-8754-4a19-a79a-5b4cd2041b34\" (UID: \"9bb83ec3-8754-4a19-a79a-5b4cd2041b34\") " Oct 09 08:11:19 crc kubenswrapper[4715]: I1009 08:11:19.731351 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bb83ec3-8754-4a19-a79a-5b4cd2041b34-catalog-content\") pod \"9bb83ec3-8754-4a19-a79a-5b4cd2041b34\" (UID: \"9bb83ec3-8754-4a19-a79a-5b4cd2041b34\") " Oct 09 08:11:19 crc kubenswrapper[4715]: I1009 08:11:19.731394 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bb83ec3-8754-4a19-a79a-5b4cd2041b34-utilities\") pod \"9bb83ec3-8754-4a19-a79a-5b4cd2041b34\" (UID: \"9bb83ec3-8754-4a19-a79a-5b4cd2041b34\") " Oct 09 08:11:19 crc kubenswrapper[4715]: I1009 08:11:19.732689 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bb83ec3-8754-4a19-a79a-5b4cd2041b34-utilities" (OuterVolumeSpecName: "utilities") pod "9bb83ec3-8754-4a19-a79a-5b4cd2041b34" (UID: "9bb83ec3-8754-4a19-a79a-5b4cd2041b34"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:11:19 crc kubenswrapper[4715]: I1009 08:11:19.739198 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bb83ec3-8754-4a19-a79a-5b4cd2041b34-kube-api-access-bgdkr" (OuterVolumeSpecName: "kube-api-access-bgdkr") pod "9bb83ec3-8754-4a19-a79a-5b4cd2041b34" (UID: "9bb83ec3-8754-4a19-a79a-5b4cd2041b34"). InnerVolumeSpecName "kube-api-access-bgdkr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:11:19 crc kubenswrapper[4715]: I1009 08:11:19.790091 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bb83ec3-8754-4a19-a79a-5b4cd2041b34-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9bb83ec3-8754-4a19-a79a-5b4cd2041b34" (UID: "9bb83ec3-8754-4a19-a79a-5b4cd2041b34"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:11:19 crc kubenswrapper[4715]: I1009 08:11:19.833358 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgdkr\" (UniqueName: \"kubernetes.io/projected/9bb83ec3-8754-4a19-a79a-5b4cd2041b34-kube-api-access-bgdkr\") on node \"crc\" DevicePath \"\"" Oct 09 08:11:19 crc kubenswrapper[4715]: I1009 08:11:19.833391 4715 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bb83ec3-8754-4a19-a79a-5b4cd2041b34-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 08:11:19 crc kubenswrapper[4715]: I1009 08:11:19.833401 4715 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bb83ec3-8754-4a19-a79a-5b4cd2041b34-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 08:11:20 crc kubenswrapper[4715]: I1009 08:11:20.146012 4715 generic.go:334] "Generic (PLEG): container finished" podID="9bb83ec3-8754-4a19-a79a-5b4cd2041b34" containerID="d8ffe683256dc115c7276f0e0496e97c5f0be01b9ff1378d7ef699a85e504998" exitCode=0 Oct 09 08:11:20 crc kubenswrapper[4715]: I1009 08:11:20.148087 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zxbqp" Oct 09 08:11:20 crc kubenswrapper[4715]: I1009 08:11:20.149941 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zxbqp" event={"ID":"9bb83ec3-8754-4a19-a79a-5b4cd2041b34","Type":"ContainerDied","Data":"d8ffe683256dc115c7276f0e0496e97c5f0be01b9ff1378d7ef699a85e504998"} Oct 09 08:11:20 crc kubenswrapper[4715]: I1009 08:11:20.150242 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zxbqp" event={"ID":"9bb83ec3-8754-4a19-a79a-5b4cd2041b34","Type":"ContainerDied","Data":"6676b3c1188753c406eb3b276e9d1d94b21bc198a7d4355e9394bdc22de7aaa4"} Oct 09 08:11:20 crc kubenswrapper[4715]: I1009 08:11:20.150285 4715 scope.go:117] "RemoveContainer" containerID="d8ffe683256dc115c7276f0e0496e97c5f0be01b9ff1378d7ef699a85e504998" Oct 09 08:11:20 crc kubenswrapper[4715]: I1009 08:11:20.184640 4715 scope.go:117] "RemoveContainer" containerID="b9135fea5e343ae42123a34c3f9b27720dc8cdfd9bcc93345e0fdc7334180ba3" Oct 09 08:11:20 crc kubenswrapper[4715]: I1009 08:11:20.195318 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zxbqp"] Oct 09 08:11:20 crc kubenswrapper[4715]: I1009 08:11:20.206656 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zxbqp"] Oct 09 08:11:20 crc kubenswrapper[4715]: I1009 08:11:20.215688 4715 scope.go:117] "RemoveContainer" containerID="a927a65f3b859031800df548b94007d428f15daed1b857fd1436badf9ff29188" Oct 09 08:11:20 crc kubenswrapper[4715]: I1009 08:11:20.268379 4715 scope.go:117] "RemoveContainer" containerID="d8ffe683256dc115c7276f0e0496e97c5f0be01b9ff1378d7ef699a85e504998" Oct 09 08:11:20 crc kubenswrapper[4715]: E1009 08:11:20.268892 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d8ffe683256dc115c7276f0e0496e97c5f0be01b9ff1378d7ef699a85e504998\": container with ID starting with d8ffe683256dc115c7276f0e0496e97c5f0be01b9ff1378d7ef699a85e504998 not found: ID does not exist" containerID="d8ffe683256dc115c7276f0e0496e97c5f0be01b9ff1378d7ef699a85e504998" Oct 09 08:11:20 crc kubenswrapper[4715]: I1009 08:11:20.268928 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8ffe683256dc115c7276f0e0496e97c5f0be01b9ff1378d7ef699a85e504998"} err="failed to get container status \"d8ffe683256dc115c7276f0e0496e97c5f0be01b9ff1378d7ef699a85e504998\": rpc error: code = NotFound desc = could not find container \"d8ffe683256dc115c7276f0e0496e97c5f0be01b9ff1378d7ef699a85e504998\": container with ID starting with d8ffe683256dc115c7276f0e0496e97c5f0be01b9ff1378d7ef699a85e504998 not found: ID does not exist" Oct 09 08:11:20 crc kubenswrapper[4715]: I1009 08:11:20.268955 4715 scope.go:117] "RemoveContainer" containerID="b9135fea5e343ae42123a34c3f9b27720dc8cdfd9bcc93345e0fdc7334180ba3" Oct 09 08:11:20 crc kubenswrapper[4715]: E1009 08:11:20.269406 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9135fea5e343ae42123a34c3f9b27720dc8cdfd9bcc93345e0fdc7334180ba3\": container with ID starting with b9135fea5e343ae42123a34c3f9b27720dc8cdfd9bcc93345e0fdc7334180ba3 not found: ID does not exist" containerID="b9135fea5e343ae42123a34c3f9b27720dc8cdfd9bcc93345e0fdc7334180ba3" Oct 09 08:11:20 crc kubenswrapper[4715]: I1009 08:11:20.269454 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9135fea5e343ae42123a34c3f9b27720dc8cdfd9bcc93345e0fdc7334180ba3"} err="failed to get container status \"b9135fea5e343ae42123a34c3f9b27720dc8cdfd9bcc93345e0fdc7334180ba3\": rpc error: code = NotFound desc = could not find container \"b9135fea5e343ae42123a34c3f9b27720dc8cdfd9bcc93345e0fdc7334180ba3\": container with ID 
starting with b9135fea5e343ae42123a34c3f9b27720dc8cdfd9bcc93345e0fdc7334180ba3 not found: ID does not exist" Oct 09 08:11:20 crc kubenswrapper[4715]: I1009 08:11:20.269472 4715 scope.go:117] "RemoveContainer" containerID="a927a65f3b859031800df548b94007d428f15daed1b857fd1436badf9ff29188" Oct 09 08:11:20 crc kubenswrapper[4715]: E1009 08:11:20.269779 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a927a65f3b859031800df548b94007d428f15daed1b857fd1436badf9ff29188\": container with ID starting with a927a65f3b859031800df548b94007d428f15daed1b857fd1436badf9ff29188 not found: ID does not exist" containerID="a927a65f3b859031800df548b94007d428f15daed1b857fd1436badf9ff29188" Oct 09 08:11:20 crc kubenswrapper[4715]: I1009 08:11:20.269818 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a927a65f3b859031800df548b94007d428f15daed1b857fd1436badf9ff29188"} err="failed to get container status \"a927a65f3b859031800df548b94007d428f15daed1b857fd1436badf9ff29188\": rpc error: code = NotFound desc = could not find container \"a927a65f3b859031800df548b94007d428f15daed1b857fd1436badf9ff29188\": container with ID starting with a927a65f3b859031800df548b94007d428f15daed1b857fd1436badf9ff29188 not found: ID does not exist" Oct 09 08:11:22 crc kubenswrapper[4715]: I1009 08:11:22.152357 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bb83ec3-8754-4a19-a79a-5b4cd2041b34" path="/var/lib/kubelet/pods/9bb83ec3-8754-4a19-a79a-5b4cd2041b34/volumes" Oct 09 08:12:12 crc kubenswrapper[4715]: I1009 08:12:12.024605 4715 scope.go:117] "RemoveContainer" containerID="f57ca25581bd4ae4ec705d3a73279094b06babbe39783df93d6f13176b690e48" Oct 09 08:12:23 crc kubenswrapper[4715]: I1009 08:12:23.047263 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-rxhd5"] Oct 09 08:12:23 crc kubenswrapper[4715]: I1009 08:12:23.061309 
4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-rxhd5"] Oct 09 08:12:24 crc kubenswrapper[4715]: I1009 08:12:24.025016 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-cqpnw"] Oct 09 08:12:24 crc kubenswrapper[4715]: I1009 08:12:24.033276 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-cqpnw"] Oct 09 08:12:24 crc kubenswrapper[4715]: I1009 08:12:24.147483 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4186809-cc3c-4cfd-a6f2-4888990a3251" path="/var/lib/kubelet/pods/b4186809-cc3c-4cfd-a6f2-4888990a3251/volumes" Oct 09 08:12:24 crc kubenswrapper[4715]: I1009 08:12:24.147985 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b82dbc30-e8e0-4256-8af6-6536cf1c07f5" path="/var/lib/kubelet/pods/b82dbc30-e8e0-4256-8af6-6536cf1c07f5/volumes" Oct 09 08:12:26 crc kubenswrapper[4715]: I1009 08:12:26.821130 4715 generic.go:334] "Generic (PLEG): container finished" podID="a796e5b4-3af9-4286-8e0f-44f5a026dc47" containerID="520a6c3913936562c7af2560b9f42c7a6c07b675f6ce6556667b79afc9e83ea9" exitCode=0 Oct 09 08:12:26 crc kubenswrapper[4715]: I1009 08:12:26.821254 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-hngw7" event={"ID":"a796e5b4-3af9-4286-8e0f-44f5a026dc47","Type":"ContainerDied","Data":"520a6c3913936562c7af2560b9f42c7a6c07b675f6ce6556667b79afc9e83ea9"} Oct 09 08:12:28 crc kubenswrapper[4715]: I1009 08:12:28.253776 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-hngw7" Oct 09 08:12:28 crc kubenswrapper[4715]: I1009 08:12:28.322085 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a796e5b4-3af9-4286-8e0f-44f5a026dc47-inventory\") pod \"a796e5b4-3af9-4286-8e0f-44f5a026dc47\" (UID: \"a796e5b4-3af9-4286-8e0f-44f5a026dc47\") " Oct 09 08:12:28 crc kubenswrapper[4715]: I1009 08:12:28.322289 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9t7v\" (UniqueName: \"kubernetes.io/projected/a796e5b4-3af9-4286-8e0f-44f5a026dc47-kube-api-access-s9t7v\") pod \"a796e5b4-3af9-4286-8e0f-44f5a026dc47\" (UID: \"a796e5b4-3af9-4286-8e0f-44f5a026dc47\") " Oct 09 08:12:28 crc kubenswrapper[4715]: I1009 08:12:28.322370 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a796e5b4-3af9-4286-8e0f-44f5a026dc47-ssh-key\") pod \"a796e5b4-3af9-4286-8e0f-44f5a026dc47\" (UID: \"a796e5b4-3af9-4286-8e0f-44f5a026dc47\") " Oct 09 08:12:28 crc kubenswrapper[4715]: I1009 08:12:28.334648 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a796e5b4-3af9-4286-8e0f-44f5a026dc47-kube-api-access-s9t7v" (OuterVolumeSpecName: "kube-api-access-s9t7v") pod "a796e5b4-3af9-4286-8e0f-44f5a026dc47" (UID: "a796e5b4-3af9-4286-8e0f-44f5a026dc47"). InnerVolumeSpecName "kube-api-access-s9t7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:12:28 crc kubenswrapper[4715]: I1009 08:12:28.350764 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a796e5b4-3af9-4286-8e0f-44f5a026dc47-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a796e5b4-3af9-4286-8e0f-44f5a026dc47" (UID: "a796e5b4-3af9-4286-8e0f-44f5a026dc47"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:12:28 crc kubenswrapper[4715]: I1009 08:12:28.352078 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a796e5b4-3af9-4286-8e0f-44f5a026dc47-inventory" (OuterVolumeSpecName: "inventory") pod "a796e5b4-3af9-4286-8e0f-44f5a026dc47" (UID: "a796e5b4-3af9-4286-8e0f-44f5a026dc47"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:12:28 crc kubenswrapper[4715]: I1009 08:12:28.424891 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9t7v\" (UniqueName: \"kubernetes.io/projected/a796e5b4-3af9-4286-8e0f-44f5a026dc47-kube-api-access-s9t7v\") on node \"crc\" DevicePath \"\"" Oct 09 08:12:28 crc kubenswrapper[4715]: I1009 08:12:28.424925 4715 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a796e5b4-3af9-4286-8e0f-44f5a026dc47-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 08:12:28 crc kubenswrapper[4715]: I1009 08:12:28.424935 4715 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a796e5b4-3af9-4286-8e0f-44f5a026dc47-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 08:12:28 crc kubenswrapper[4715]: I1009 08:12:28.850299 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-hngw7" event={"ID":"a796e5b4-3af9-4286-8e0f-44f5a026dc47","Type":"ContainerDied","Data":"ed04ec6959ebb5370c3c0c3e831feb453b986a4344146f9c0bebfb2efb242354"} Oct 09 08:12:28 crc kubenswrapper[4715]: I1009 08:12:28.850350 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed04ec6959ebb5370c3c0c3e831feb453b986a4344146f9c0bebfb2efb242354" Oct 09 08:12:28 crc kubenswrapper[4715]: I1009 08:12:28.850404 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-hngw7" Oct 09 08:12:28 crc kubenswrapper[4715]: I1009 08:12:28.933537 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-j7tdx"] Oct 09 08:12:28 crc kubenswrapper[4715]: E1009 08:12:28.934328 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bb83ec3-8754-4a19-a79a-5b4cd2041b34" containerName="extract-utilities" Oct 09 08:12:28 crc kubenswrapper[4715]: I1009 08:12:28.934369 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bb83ec3-8754-4a19-a79a-5b4cd2041b34" containerName="extract-utilities" Oct 09 08:12:28 crc kubenswrapper[4715]: E1009 08:12:28.934401 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bb83ec3-8754-4a19-a79a-5b4cd2041b34" containerName="registry-server" Oct 09 08:12:28 crc kubenswrapper[4715]: I1009 08:12:28.934417 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bb83ec3-8754-4a19-a79a-5b4cd2041b34" containerName="registry-server" Oct 09 08:12:28 crc kubenswrapper[4715]: E1009 08:12:28.934570 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a796e5b4-3af9-4286-8e0f-44f5a026dc47" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 09 08:12:28 crc kubenswrapper[4715]: I1009 08:12:28.934585 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="a796e5b4-3af9-4286-8e0f-44f5a026dc47" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 09 08:12:28 crc kubenswrapper[4715]: E1009 08:12:28.934612 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bb83ec3-8754-4a19-a79a-5b4cd2041b34" containerName="extract-content" Oct 09 08:12:28 crc kubenswrapper[4715]: I1009 08:12:28.934625 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bb83ec3-8754-4a19-a79a-5b4cd2041b34" containerName="extract-content" Oct 09 08:12:28 crc kubenswrapper[4715]: I1009 
08:12:28.934978 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bb83ec3-8754-4a19-a79a-5b4cd2041b34" containerName="registry-server" Oct 09 08:12:28 crc kubenswrapper[4715]: I1009 08:12:28.935066 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="a796e5b4-3af9-4286-8e0f-44f5a026dc47" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 09 08:12:28 crc kubenswrapper[4715]: I1009 08:12:28.936331 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-j7tdx" Oct 09 08:12:28 crc kubenswrapper[4715]: I1009 08:12:28.938826 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fjb" Oct 09 08:12:28 crc kubenswrapper[4715]: I1009 08:12:28.938956 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 08:12:28 crc kubenswrapper[4715]: I1009 08:12:28.939755 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 08:12:28 crc kubenswrapper[4715]: I1009 08:12:28.941323 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 08:12:28 crc kubenswrapper[4715]: I1009 08:12:28.945356 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-j7tdx"] Oct 09 08:12:29 crc kubenswrapper[4715]: I1009 08:12:29.036206 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/69340195-a6f5-4e04-823d-9d61548a14b9-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-j7tdx\" (UID: \"69340195-a6f5-4e04-823d-9d61548a14b9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-j7tdx" Oct 09 08:12:29 crc 
kubenswrapper[4715]: I1009 08:12:29.036609 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69340195-a6f5-4e04-823d-9d61548a14b9-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-j7tdx\" (UID: \"69340195-a6f5-4e04-823d-9d61548a14b9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-j7tdx" Oct 09 08:12:29 crc kubenswrapper[4715]: I1009 08:12:29.036644 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr82q\" (UniqueName: \"kubernetes.io/projected/69340195-a6f5-4e04-823d-9d61548a14b9-kube-api-access-jr82q\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-j7tdx\" (UID: \"69340195-a6f5-4e04-823d-9d61548a14b9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-j7tdx" Oct 09 08:12:29 crc kubenswrapper[4715]: I1009 08:12:29.037573 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-5gv9h"] Oct 09 08:12:29 crc kubenswrapper[4715]: I1009 08:12:29.049581 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-5gv9h"] Oct 09 08:12:29 crc kubenswrapper[4715]: I1009 08:12:29.137796 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/69340195-a6f5-4e04-823d-9d61548a14b9-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-j7tdx\" (UID: \"69340195-a6f5-4e04-823d-9d61548a14b9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-j7tdx" Oct 09 08:12:29 crc kubenswrapper[4715]: I1009 08:12:29.137860 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69340195-a6f5-4e04-823d-9d61548a14b9-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-j7tdx\" (UID: 
\"69340195-a6f5-4e04-823d-9d61548a14b9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-j7tdx" Oct 09 08:12:29 crc kubenswrapper[4715]: I1009 08:12:29.137883 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jr82q\" (UniqueName: \"kubernetes.io/projected/69340195-a6f5-4e04-823d-9d61548a14b9-kube-api-access-jr82q\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-j7tdx\" (UID: \"69340195-a6f5-4e04-823d-9d61548a14b9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-j7tdx" Oct 09 08:12:29 crc kubenswrapper[4715]: I1009 08:12:29.142783 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/69340195-a6f5-4e04-823d-9d61548a14b9-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-j7tdx\" (UID: \"69340195-a6f5-4e04-823d-9d61548a14b9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-j7tdx" Oct 09 08:12:29 crc kubenswrapper[4715]: I1009 08:12:29.146219 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69340195-a6f5-4e04-823d-9d61548a14b9-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-j7tdx\" (UID: \"69340195-a6f5-4e04-823d-9d61548a14b9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-j7tdx" Oct 09 08:12:29 crc kubenswrapper[4715]: I1009 08:12:29.158712 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr82q\" (UniqueName: \"kubernetes.io/projected/69340195-a6f5-4e04-823d-9d61548a14b9-kube-api-access-jr82q\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-j7tdx\" (UID: \"69340195-a6f5-4e04-823d-9d61548a14b9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-j7tdx" Oct 09 08:12:29 crc kubenswrapper[4715]: I1009 08:12:29.264662 4715 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-j7tdx" Oct 09 08:12:29 crc kubenswrapper[4715]: I1009 08:12:29.824931 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-j7tdx"] Oct 09 08:12:29 crc kubenswrapper[4715]: W1009 08:12:29.835835 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69340195_a6f5_4e04_823d_9d61548a14b9.slice/crio-cc80e5da579066922b19f05004792744f611f307946787b33ab03fd8fe696776 WatchSource:0}: Error finding container cc80e5da579066922b19f05004792744f611f307946787b33ab03fd8fe696776: Status 404 returned error can't find the container with id cc80e5da579066922b19f05004792744f611f307946787b33ab03fd8fe696776 Oct 09 08:12:29 crc kubenswrapper[4715]: I1009 08:12:29.860118 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-j7tdx" event={"ID":"69340195-a6f5-4e04-823d-9d61548a14b9","Type":"ContainerStarted","Data":"cc80e5da579066922b19f05004792744f611f307946787b33ab03fd8fe696776"} Oct 09 08:12:30 crc kubenswrapper[4715]: I1009 08:12:30.148674 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79e51441-6fc1-4841-849a-d17051e1769e" path="/var/lib/kubelet/pods/79e51441-6fc1-4841-849a-d17051e1769e/volumes" Oct 09 08:12:30 crc kubenswrapper[4715]: I1009 08:12:30.869223 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-j7tdx" event={"ID":"69340195-a6f5-4e04-823d-9d61548a14b9","Type":"ContainerStarted","Data":"7fc09f8411c82f01d08719e55e7823d13dc63eeafa80f3127a3d69f0c01e57bc"} Oct 09 08:12:30 crc kubenswrapper[4715]: I1009 08:12:30.886355 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-j7tdx" podStartSLOduration=2.468476956 podStartE2EDuration="2.886331913s" podCreationTimestamp="2025-10-09 08:12:28 +0000 UTC" firstStartedPulling="2025-10-09 08:12:29.840528993 +0000 UTC m=+1580.533333011" lastFinishedPulling="2025-10-09 08:12:30.25838395 +0000 UTC m=+1580.951187968" observedRunningTime="2025-10-09 08:12:30.883302507 +0000 UTC m=+1581.576106525" watchObservedRunningTime="2025-10-09 08:12:30.886331913 +0000 UTC m=+1581.579135931" Oct 09 08:12:34 crc kubenswrapper[4715]: I1009 08:12:34.027504 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-4ed7-account-create-s26dw"] Oct 09 08:12:34 crc kubenswrapper[4715]: I1009 08:12:34.034612 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5432-account-create-9gr67"] Oct 09 08:12:34 crc kubenswrapper[4715]: I1009 08:12:34.043289 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-4ed7-account-create-s26dw"] Oct 09 08:12:34 crc kubenswrapper[4715]: I1009 08:12:34.050365 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-5432-account-create-9gr67"] Oct 09 08:12:34 crc kubenswrapper[4715]: I1009 08:12:34.147256 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6da55d5b-0f99-4b96-9e08-628c1961d8e8" path="/var/lib/kubelet/pods/6da55d5b-0f99-4b96-9e08-628c1961d8e8/volumes" Oct 09 08:12:34 crc kubenswrapper[4715]: I1009 08:12:34.148063 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0c78197-391b-4ebc-bc19-5bd09c64f99c" path="/var/lib/kubelet/pods/f0c78197-391b-4ebc-bc19-5bd09c64f99c/volumes" Oct 09 08:12:39 crc kubenswrapper[4715]: I1009 08:12:39.025684 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-52c5-account-create-c6qsf"] Oct 09 08:12:39 crc kubenswrapper[4715]: I1009 08:12:39.034685 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/glance-52c5-account-create-c6qsf"] Oct 09 08:12:40 crc kubenswrapper[4715]: I1009 08:12:40.149875 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c98c9b5-1e23-4f9c-927f-b9b33da85410" path="/var/lib/kubelet/pods/3c98c9b5-1e23-4f9c-927f-b9b33da85410/volumes" Oct 09 08:12:46 crc kubenswrapper[4715]: I1009 08:12:46.753556 4715 patch_prober.go:28] interesting pod/machine-config-daemon-k7vwx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 08:12:46 crc kubenswrapper[4715]: I1009 08:12:46.754189 4715 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 08:12:59 crc kubenswrapper[4715]: I1009 08:12:59.056861 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-p7kb6"] Oct 09 08:12:59 crc kubenswrapper[4715]: I1009 08:12:59.066557 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-nzm2b"] Oct 09 08:12:59 crc kubenswrapper[4715]: I1009 08:12:59.078652 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-gjhcq"] Oct 09 08:12:59 crc kubenswrapper[4715]: I1009 08:12:59.092220 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-p7kb6"] Oct 09 08:12:59 crc kubenswrapper[4715]: I1009 08:12:59.107826 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-nzm2b"] Oct 09 08:12:59 crc kubenswrapper[4715]: I1009 08:12:59.119617 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/neutron-db-create-gjhcq"] Oct 09 08:13:00 crc kubenswrapper[4715]: I1009 08:13:00.156864 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44824bfc-2e5a-435c-82f5-7a0b29dca4c3" path="/var/lib/kubelet/pods/44824bfc-2e5a-435c-82f5-7a0b29dca4c3/volumes" Oct 09 08:13:00 crc kubenswrapper[4715]: I1009 08:13:00.157495 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7930b9ef-e4b6-4cb6-a269-d10a9194abfa" path="/var/lib/kubelet/pods/7930b9ef-e4b6-4cb6-a269-d10a9194abfa/volumes" Oct 09 08:13:00 crc kubenswrapper[4715]: I1009 08:13:00.157949 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aac330e9-1d1a-4fc8-acb9-03b85148eb00" path="/var/lib/kubelet/pods/aac330e9-1d1a-4fc8-acb9-03b85148eb00/volumes" Oct 09 08:13:05 crc kubenswrapper[4715]: I1009 08:13:05.022484 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-wn48b"] Oct 09 08:13:05 crc kubenswrapper[4715]: I1009 08:13:05.029747 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-wn48b"] Oct 09 08:13:06 crc kubenswrapper[4715]: I1009 08:13:06.030068 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-nkdxc"] Oct 09 08:13:06 crc kubenswrapper[4715]: I1009 08:13:06.039521 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-nkdxc"] Oct 09 08:13:06 crc kubenswrapper[4715]: I1009 08:13:06.147668 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ad22838-08c7-4400-b4b0-9cb6d6df6653" path="/var/lib/kubelet/pods/3ad22838-08c7-4400-b4b0-9cb6d6df6653/volumes" Oct 09 08:13:06 crc kubenswrapper[4715]: I1009 08:13:06.148491 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac15af92-271f-424c-bf54-e42f7771bf99" path="/var/lib/kubelet/pods/ac15af92-271f-424c-bf54-e42f7771bf99/volumes" Oct 09 08:13:11 crc kubenswrapper[4715]: I1009 08:13:11.041666 4715 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-b61f-account-create-pprts"] Oct 09 08:13:11 crc kubenswrapper[4715]: I1009 08:13:11.053196 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-b61f-account-create-pprts"] Oct 09 08:13:12 crc kubenswrapper[4715]: I1009 08:13:12.090493 4715 scope.go:117] "RemoveContainer" containerID="e013b45b1aac53ddebccc894959f24e90b5344201a8a01b38545a2e92e4e90cc" Oct 09 08:13:12 crc kubenswrapper[4715]: I1009 08:13:12.113175 4715 scope.go:117] "RemoveContainer" containerID="ec0d3a8d53ca533d5e6d71c32953b95716c0d242d989bc1a226499e1438bca8b" Oct 09 08:13:12 crc kubenswrapper[4715]: I1009 08:13:12.150320 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cca10de2-6fab-417e-ac4a-ba3b73383432" path="/var/lib/kubelet/pods/cca10de2-6fab-417e-ac4a-ba3b73383432/volumes" Oct 09 08:13:12 crc kubenswrapper[4715]: I1009 08:13:12.157741 4715 scope.go:117] "RemoveContainer" containerID="239e1b00cdd05cc0d43cede7d8bc3a6bed56bedcc714f44c3b2f80b05cbc8cb4" Oct 09 08:13:12 crc kubenswrapper[4715]: I1009 08:13:12.194787 4715 scope.go:117] "RemoveContainer" containerID="5362293e53bcae54507d375fe4f884bf60e37eea4b24605f70f5c7963d927220" Oct 09 08:13:12 crc kubenswrapper[4715]: I1009 08:13:12.240372 4715 scope.go:117] "RemoveContainer" containerID="52d9f7a6990ca3073074b27997407797407e64523142ff946c7a0e14163f10e2" Oct 09 08:13:12 crc kubenswrapper[4715]: I1009 08:13:12.289281 4715 scope.go:117] "RemoveContainer" containerID="94122c7ed6b6a46d7853a8e268fa84c9314242446de34cd0a7b378533ebf86f9" Oct 09 08:13:12 crc kubenswrapper[4715]: I1009 08:13:12.337486 4715 scope.go:117] "RemoveContainer" containerID="8efb562820210438a8461b6a8c0cbd66afea476d7ead5c767d4176201b05d855" Oct 09 08:13:12 crc kubenswrapper[4715]: I1009 08:13:12.356190 4715 scope.go:117] "RemoveContainer" containerID="71f46bcdfeacfda0ed8880092c38bea27ab966f54738066f19ec59c10b218197" Oct 09 08:13:12 crc kubenswrapper[4715]: I1009 
08:13:12.373219 4715 scope.go:117] "RemoveContainer" containerID="6db4654dff7741d9067045ddca947ca3695837dda24e2659c14c16467c313775" Oct 09 08:13:12 crc kubenswrapper[4715]: I1009 08:13:12.404059 4715 scope.go:117] "RemoveContainer" containerID="66a2e41bcbdbff0126fb609b0fcbb414414310e6985549a5b39618c7b777cf81" Oct 09 08:13:12 crc kubenswrapper[4715]: I1009 08:13:12.420615 4715 scope.go:117] "RemoveContainer" containerID="4dd6256ff800cfcb991ebadea1a87783fdf26a641665a212aee1f0caa3ae84a4" Oct 09 08:13:12 crc kubenswrapper[4715]: I1009 08:13:12.447588 4715 scope.go:117] "RemoveContainer" containerID="a077b2411b96192943e7fd4320338b09432b45529a2a7c45615d709a98ed5295" Oct 09 08:13:16 crc kubenswrapper[4715]: I1009 08:13:16.754318 4715 patch_prober.go:28] interesting pod/machine-config-daemon-k7vwx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 08:13:16 crc kubenswrapper[4715]: I1009 08:13:16.754633 4715 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 08:13:30 crc kubenswrapper[4715]: I1009 08:13:30.042221 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-841e-account-create-vcldw"] Oct 09 08:13:30 crc kubenswrapper[4715]: I1009 08:13:30.054926 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-36e0-account-create-4qxbj"] Oct 09 08:13:30 crc kubenswrapper[4715]: I1009 08:13:30.065821 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-36e0-account-create-4qxbj"] Oct 09 08:13:30 crc kubenswrapper[4715]: I1009 08:13:30.075094 4715 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-841e-account-create-vcldw"] Oct 09 08:13:30 crc kubenswrapper[4715]: I1009 08:13:30.150085 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fd00c24-58a0-4107-a811-6d67d3156f68" path="/var/lib/kubelet/pods/3fd00c24-58a0-4107-a811-6d67d3156f68/volumes" Oct 09 08:13:30 crc kubenswrapper[4715]: I1009 08:13:30.150712 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a49e0065-af08-4379-b1e9-ac8998d2e98b" path="/var/lib/kubelet/pods/a49e0065-af08-4379-b1e9-ac8998d2e98b/volumes" Oct 09 08:13:39 crc kubenswrapper[4715]: I1009 08:13:39.583736 4715 generic.go:334] "Generic (PLEG): container finished" podID="69340195-a6f5-4e04-823d-9d61548a14b9" containerID="7fc09f8411c82f01d08719e55e7823d13dc63eeafa80f3127a3d69f0c01e57bc" exitCode=0 Oct 09 08:13:39 crc kubenswrapper[4715]: I1009 08:13:39.583800 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-j7tdx" event={"ID":"69340195-a6f5-4e04-823d-9d61548a14b9","Type":"ContainerDied","Data":"7fc09f8411c82f01d08719e55e7823d13dc63eeafa80f3127a3d69f0c01e57bc"} Oct 09 08:13:41 crc kubenswrapper[4715]: I1009 08:13:41.016183 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-j7tdx" Oct 09 08:13:41 crc kubenswrapper[4715]: I1009 08:13:41.039339 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-nf22z"] Oct 09 08:13:41 crc kubenswrapper[4715]: I1009 08:13:41.049409 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-nf22z"] Oct 09 08:13:41 crc kubenswrapper[4715]: I1009 08:13:41.093574 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69340195-a6f5-4e04-823d-9d61548a14b9-inventory\") pod \"69340195-a6f5-4e04-823d-9d61548a14b9\" (UID: \"69340195-a6f5-4e04-823d-9d61548a14b9\") " Oct 09 08:13:41 crc kubenswrapper[4715]: I1009 08:13:41.093666 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/69340195-a6f5-4e04-823d-9d61548a14b9-ssh-key\") pod \"69340195-a6f5-4e04-823d-9d61548a14b9\" (UID: \"69340195-a6f5-4e04-823d-9d61548a14b9\") " Oct 09 08:13:41 crc kubenswrapper[4715]: I1009 08:13:41.093791 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jr82q\" (UniqueName: \"kubernetes.io/projected/69340195-a6f5-4e04-823d-9d61548a14b9-kube-api-access-jr82q\") pod \"69340195-a6f5-4e04-823d-9d61548a14b9\" (UID: \"69340195-a6f5-4e04-823d-9d61548a14b9\") " Oct 09 08:13:41 crc kubenswrapper[4715]: I1009 08:13:41.107394 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69340195-a6f5-4e04-823d-9d61548a14b9-kube-api-access-jr82q" (OuterVolumeSpecName: "kube-api-access-jr82q") pod "69340195-a6f5-4e04-823d-9d61548a14b9" (UID: "69340195-a6f5-4e04-823d-9d61548a14b9"). InnerVolumeSpecName "kube-api-access-jr82q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:13:41 crc kubenswrapper[4715]: I1009 08:13:41.131374 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69340195-a6f5-4e04-823d-9d61548a14b9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "69340195-a6f5-4e04-823d-9d61548a14b9" (UID: "69340195-a6f5-4e04-823d-9d61548a14b9"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:13:41 crc kubenswrapper[4715]: I1009 08:13:41.139192 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69340195-a6f5-4e04-823d-9d61548a14b9-inventory" (OuterVolumeSpecName: "inventory") pod "69340195-a6f5-4e04-823d-9d61548a14b9" (UID: "69340195-a6f5-4e04-823d-9d61548a14b9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:13:41 crc kubenswrapper[4715]: I1009 08:13:41.196026 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jr82q\" (UniqueName: \"kubernetes.io/projected/69340195-a6f5-4e04-823d-9d61548a14b9-kube-api-access-jr82q\") on node \"crc\" DevicePath \"\"" Oct 09 08:13:41 crc kubenswrapper[4715]: I1009 08:13:41.196065 4715 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69340195-a6f5-4e04-823d-9d61548a14b9-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 08:13:41 crc kubenswrapper[4715]: I1009 08:13:41.196267 4715 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/69340195-a6f5-4e04-823d-9d61548a14b9-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 08:13:41 crc kubenswrapper[4715]: I1009 08:13:41.606111 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-j7tdx" 
event={"ID":"69340195-a6f5-4e04-823d-9d61548a14b9","Type":"ContainerDied","Data":"cc80e5da579066922b19f05004792744f611f307946787b33ab03fd8fe696776"} Oct 09 08:13:41 crc kubenswrapper[4715]: I1009 08:13:41.606171 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc80e5da579066922b19f05004792744f611f307946787b33ab03fd8fe696776" Oct 09 08:13:41 crc kubenswrapper[4715]: I1009 08:13:41.606192 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-j7tdx" Oct 09 08:13:41 crc kubenswrapper[4715]: I1009 08:13:41.728748 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ddnss"] Oct 09 08:13:41 crc kubenswrapper[4715]: E1009 08:13:41.729730 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69340195-a6f5-4e04-823d-9d61548a14b9" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 09 08:13:41 crc kubenswrapper[4715]: I1009 08:13:41.729747 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="69340195-a6f5-4e04-823d-9d61548a14b9" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 09 08:13:41 crc kubenswrapper[4715]: I1009 08:13:41.730356 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="69340195-a6f5-4e04-823d-9d61548a14b9" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 09 08:13:41 crc kubenswrapper[4715]: I1009 08:13:41.731602 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ddnss" Oct 09 08:13:41 crc kubenswrapper[4715]: I1009 08:13:41.736127 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 08:13:41 crc kubenswrapper[4715]: I1009 08:13:41.736329 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 08:13:41 crc kubenswrapper[4715]: I1009 08:13:41.736496 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fjb" Oct 09 08:13:41 crc kubenswrapper[4715]: I1009 08:13:41.736675 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 08:13:41 crc kubenswrapper[4715]: I1009 08:13:41.752165 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ddnss"] Oct 09 08:13:41 crc kubenswrapper[4715]: I1009 08:13:41.828286 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3487ef30-efc9-46c5-8ed3-8146c9498ff0-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ddnss\" (UID: \"3487ef30-efc9-46c5-8ed3-8146c9498ff0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ddnss" Oct 09 08:13:41 crc kubenswrapper[4715]: I1009 08:13:41.828467 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h8kw\" (UniqueName: \"kubernetes.io/projected/3487ef30-efc9-46c5-8ed3-8146c9498ff0-kube-api-access-5h8kw\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ddnss\" (UID: \"3487ef30-efc9-46c5-8ed3-8146c9498ff0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ddnss" Oct 09 08:13:41 crc kubenswrapper[4715]: I1009 
08:13:41.828569 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3487ef30-efc9-46c5-8ed3-8146c9498ff0-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ddnss\" (UID: \"3487ef30-efc9-46c5-8ed3-8146c9498ff0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ddnss" Oct 09 08:13:41 crc kubenswrapper[4715]: I1009 08:13:41.929902 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3487ef30-efc9-46c5-8ed3-8146c9498ff0-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ddnss\" (UID: \"3487ef30-efc9-46c5-8ed3-8146c9498ff0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ddnss" Oct 09 08:13:41 crc kubenswrapper[4715]: I1009 08:13:41.929993 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3487ef30-efc9-46c5-8ed3-8146c9498ff0-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ddnss\" (UID: \"3487ef30-efc9-46c5-8ed3-8146c9498ff0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ddnss" Oct 09 08:13:41 crc kubenswrapper[4715]: I1009 08:13:41.930101 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5h8kw\" (UniqueName: \"kubernetes.io/projected/3487ef30-efc9-46c5-8ed3-8146c9498ff0-kube-api-access-5h8kw\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ddnss\" (UID: \"3487ef30-efc9-46c5-8ed3-8146c9498ff0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ddnss" Oct 09 08:13:41 crc kubenswrapper[4715]: I1009 08:13:41.935049 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3487ef30-efc9-46c5-8ed3-8146c9498ff0-ssh-key\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-ddnss\" (UID: \"3487ef30-efc9-46c5-8ed3-8146c9498ff0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ddnss" Oct 09 08:13:41 crc kubenswrapper[4715]: I1009 08:13:41.935112 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3487ef30-efc9-46c5-8ed3-8146c9498ff0-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ddnss\" (UID: \"3487ef30-efc9-46c5-8ed3-8146c9498ff0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ddnss" Oct 09 08:13:41 crc kubenswrapper[4715]: I1009 08:13:41.947130 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h8kw\" (UniqueName: \"kubernetes.io/projected/3487ef30-efc9-46c5-8ed3-8146c9498ff0-kube-api-access-5h8kw\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ddnss\" (UID: \"3487ef30-efc9-46c5-8ed3-8146c9498ff0\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ddnss" Oct 09 08:13:42 crc kubenswrapper[4715]: I1009 08:13:42.059634 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ddnss" Oct 09 08:13:42 crc kubenswrapper[4715]: I1009 08:13:42.151319 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="189bfb70-4185-415b-a3d9-5d0ed1a76cb0" path="/var/lib/kubelet/pods/189bfb70-4185-415b-a3d9-5d0ed1a76cb0/volumes" Oct 09 08:13:42 crc kubenswrapper[4715]: I1009 08:13:42.639134 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ddnss"] Oct 09 08:13:42 crc kubenswrapper[4715]: I1009 08:13:42.645533 4715 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 08:13:43 crc kubenswrapper[4715]: I1009 08:13:43.624911 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ddnss" event={"ID":"3487ef30-efc9-46c5-8ed3-8146c9498ff0","Type":"ContainerStarted","Data":"a74d135b7429d44a1ca84f5dd94a047292e2c73d4940b644401a9dd951799185"} Oct 09 08:13:43 crc kubenswrapper[4715]: I1009 08:13:43.624957 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ddnss" event={"ID":"3487ef30-efc9-46c5-8ed3-8146c9498ff0","Type":"ContainerStarted","Data":"4ad92047d8463774e6f38f6786652bd277a897c922e46c10c00cab6b16371d7a"} Oct 09 08:13:46 crc kubenswrapper[4715]: I1009 08:13:46.753976 4715 patch_prober.go:28] interesting pod/machine-config-daemon-k7vwx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 08:13:46 crc kubenswrapper[4715]: I1009 08:13:46.754619 4715 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 08:13:46 crc kubenswrapper[4715]: I1009 08:13:46.754722 4715 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" Oct 09 08:13:46 crc kubenswrapper[4715]: I1009 08:13:46.756233 4715 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"603ae8e76a989f73d5fee395ce2ebf8db256706e70cbcec215884dc9ad047a0c"} pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 08:13:46 crc kubenswrapper[4715]: I1009 08:13:46.756327 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" containerID="cri-o://603ae8e76a989f73d5fee395ce2ebf8db256706e70cbcec215884dc9ad047a0c" gracePeriod=600 Oct 09 08:13:46 crc kubenswrapper[4715]: E1009 08:13:46.887492 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:13:47 crc kubenswrapper[4715]: I1009 08:13:47.659019 4715 generic.go:334] "Generic (PLEG): container finished" podID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerID="603ae8e76a989f73d5fee395ce2ebf8db256706e70cbcec215884dc9ad047a0c" exitCode=0 Oct 09 08:13:47 crc kubenswrapper[4715]: I1009 
08:13:47.659095 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" event={"ID":"acafd807-8875-4b4f-aba9-4f807ca336e7","Type":"ContainerDied","Data":"603ae8e76a989f73d5fee395ce2ebf8db256706e70cbcec215884dc9ad047a0c"} Oct 09 08:13:47 crc kubenswrapper[4715]: I1009 08:13:47.659376 4715 scope.go:117] "RemoveContainer" containerID="c420b2a8eda30fb3123058bb99d74df66c4ad029fca95601a27c380e6d5834c9" Oct 09 08:13:47 crc kubenswrapper[4715]: I1009 08:13:47.660009 4715 scope.go:117] "RemoveContainer" containerID="603ae8e76a989f73d5fee395ce2ebf8db256706e70cbcec215884dc9ad047a0c" Oct 09 08:13:47 crc kubenswrapper[4715]: E1009 08:13:47.660338 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:13:47 crc kubenswrapper[4715]: I1009 08:13:47.689315 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ddnss" podStartSLOduration=6.157760718 podStartE2EDuration="6.689293168s" podCreationTimestamp="2025-10-09 08:13:41 +0000 UTC" firstStartedPulling="2025-10-09 08:13:42.645248687 +0000 UTC m=+1653.338052695" lastFinishedPulling="2025-10-09 08:13:43.176781147 +0000 UTC m=+1653.869585145" observedRunningTime="2025-10-09 08:13:43.64815753 +0000 UTC m=+1654.340961568" watchObservedRunningTime="2025-10-09 08:13:47.689293168 +0000 UTC m=+1658.382097176" Oct 09 08:13:48 crc kubenswrapper[4715]: I1009 08:13:48.670410 4715 generic.go:334] "Generic (PLEG): container finished" podID="3487ef30-efc9-46c5-8ed3-8146c9498ff0" 
containerID="a74d135b7429d44a1ca84f5dd94a047292e2c73d4940b644401a9dd951799185" exitCode=0 Oct 09 08:13:48 crc kubenswrapper[4715]: I1009 08:13:48.670494 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ddnss" event={"ID":"3487ef30-efc9-46c5-8ed3-8146c9498ff0","Type":"ContainerDied","Data":"a74d135b7429d44a1ca84f5dd94a047292e2c73d4940b644401a9dd951799185"} Oct 09 08:13:50 crc kubenswrapper[4715]: I1009 08:13:50.223567 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ddnss" Oct 09 08:13:50 crc kubenswrapper[4715]: I1009 08:13:50.283757 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3487ef30-efc9-46c5-8ed3-8146c9498ff0-inventory\") pod \"3487ef30-efc9-46c5-8ed3-8146c9498ff0\" (UID: \"3487ef30-efc9-46c5-8ed3-8146c9498ff0\") " Oct 09 08:13:50 crc kubenswrapper[4715]: I1009 08:13:50.283895 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3487ef30-efc9-46c5-8ed3-8146c9498ff0-ssh-key\") pod \"3487ef30-efc9-46c5-8ed3-8146c9498ff0\" (UID: \"3487ef30-efc9-46c5-8ed3-8146c9498ff0\") " Oct 09 08:13:50 crc kubenswrapper[4715]: I1009 08:13:50.283970 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5h8kw\" (UniqueName: \"kubernetes.io/projected/3487ef30-efc9-46c5-8ed3-8146c9498ff0-kube-api-access-5h8kw\") pod \"3487ef30-efc9-46c5-8ed3-8146c9498ff0\" (UID: \"3487ef30-efc9-46c5-8ed3-8146c9498ff0\") " Oct 09 08:13:50 crc kubenswrapper[4715]: I1009 08:13:50.295769 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3487ef30-efc9-46c5-8ed3-8146c9498ff0-kube-api-access-5h8kw" (OuterVolumeSpecName: "kube-api-access-5h8kw") pod 
"3487ef30-efc9-46c5-8ed3-8146c9498ff0" (UID: "3487ef30-efc9-46c5-8ed3-8146c9498ff0"). InnerVolumeSpecName "kube-api-access-5h8kw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:13:50 crc kubenswrapper[4715]: I1009 08:13:50.311928 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3487ef30-efc9-46c5-8ed3-8146c9498ff0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3487ef30-efc9-46c5-8ed3-8146c9498ff0" (UID: "3487ef30-efc9-46c5-8ed3-8146c9498ff0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:13:50 crc kubenswrapper[4715]: I1009 08:13:50.311952 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3487ef30-efc9-46c5-8ed3-8146c9498ff0-inventory" (OuterVolumeSpecName: "inventory") pod "3487ef30-efc9-46c5-8ed3-8146c9498ff0" (UID: "3487ef30-efc9-46c5-8ed3-8146c9498ff0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:13:50 crc kubenswrapper[4715]: I1009 08:13:50.385894 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5h8kw\" (UniqueName: \"kubernetes.io/projected/3487ef30-efc9-46c5-8ed3-8146c9498ff0-kube-api-access-5h8kw\") on node \"crc\" DevicePath \"\"" Oct 09 08:13:50 crc kubenswrapper[4715]: I1009 08:13:50.385922 4715 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3487ef30-efc9-46c5-8ed3-8146c9498ff0-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 08:13:50 crc kubenswrapper[4715]: I1009 08:13:50.385931 4715 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3487ef30-efc9-46c5-8ed3-8146c9498ff0-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 08:13:50 crc kubenswrapper[4715]: I1009 08:13:50.693144 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ddnss" event={"ID":"3487ef30-efc9-46c5-8ed3-8146c9498ff0","Type":"ContainerDied","Data":"4ad92047d8463774e6f38f6786652bd277a897c922e46c10c00cab6b16371d7a"} Oct 09 08:13:50 crc kubenswrapper[4715]: I1009 08:13:50.693186 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ad92047d8463774e6f38f6786652bd277a897c922e46c10c00cab6b16371d7a" Oct 09 08:13:50 crc kubenswrapper[4715]: I1009 08:13:50.693195 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ddnss" Oct 09 08:13:50 crc kubenswrapper[4715]: I1009 08:13:50.761483 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-7ps4g"] Oct 09 08:13:50 crc kubenswrapper[4715]: E1009 08:13:50.761899 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3487ef30-efc9-46c5-8ed3-8146c9498ff0" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 09 08:13:50 crc kubenswrapper[4715]: I1009 08:13:50.761923 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="3487ef30-efc9-46c5-8ed3-8146c9498ff0" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 09 08:13:50 crc kubenswrapper[4715]: I1009 08:13:50.762133 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="3487ef30-efc9-46c5-8ed3-8146c9498ff0" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 09 08:13:50 crc kubenswrapper[4715]: I1009 08:13:50.763199 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7ps4g" Oct 09 08:13:50 crc kubenswrapper[4715]: I1009 08:13:50.766452 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 08:13:50 crc kubenswrapper[4715]: I1009 08:13:50.766591 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 08:13:50 crc kubenswrapper[4715]: I1009 08:13:50.767587 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 08:13:50 crc kubenswrapper[4715]: I1009 08:13:50.767708 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fjb" Oct 09 08:13:50 crc kubenswrapper[4715]: I1009 08:13:50.778229 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-7ps4g"] Oct 09 08:13:50 crc kubenswrapper[4715]: I1009 08:13:50.895751 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bf57a3ae-e445-4a20-9bc1-c5c8480f9158-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7ps4g\" (UID: \"bf57a3ae-e445-4a20-9bc1-c5c8480f9158\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7ps4g" Oct 09 08:13:50 crc kubenswrapper[4715]: I1009 08:13:50.895832 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf57a3ae-e445-4a20-9bc1-c5c8480f9158-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7ps4g\" (UID: \"bf57a3ae-e445-4a20-9bc1-c5c8480f9158\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7ps4g" Oct 09 08:13:50 crc kubenswrapper[4715]: I1009 08:13:50.895967 4715 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4cpx\" (UniqueName: \"kubernetes.io/projected/bf57a3ae-e445-4a20-9bc1-c5c8480f9158-kube-api-access-r4cpx\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7ps4g\" (UID: \"bf57a3ae-e445-4a20-9bc1-c5c8480f9158\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7ps4g" Oct 09 08:13:50 crc kubenswrapper[4715]: I1009 08:13:50.997626 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bf57a3ae-e445-4a20-9bc1-c5c8480f9158-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7ps4g\" (UID: \"bf57a3ae-e445-4a20-9bc1-c5c8480f9158\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7ps4g" Oct 09 08:13:50 crc kubenswrapper[4715]: I1009 08:13:50.997694 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf57a3ae-e445-4a20-9bc1-c5c8480f9158-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7ps4g\" (UID: \"bf57a3ae-e445-4a20-9bc1-c5c8480f9158\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7ps4g" Oct 09 08:13:50 crc kubenswrapper[4715]: I1009 08:13:50.997727 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4cpx\" (UniqueName: \"kubernetes.io/projected/bf57a3ae-e445-4a20-9bc1-c5c8480f9158-kube-api-access-r4cpx\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7ps4g\" (UID: \"bf57a3ae-e445-4a20-9bc1-c5c8480f9158\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7ps4g" Oct 09 08:13:51 crc kubenswrapper[4715]: I1009 08:13:51.002021 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf57a3ae-e445-4a20-9bc1-c5c8480f9158-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7ps4g\" (UID: 
\"bf57a3ae-e445-4a20-9bc1-c5c8480f9158\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7ps4g" Oct 09 08:13:51 crc kubenswrapper[4715]: I1009 08:13:51.015761 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bf57a3ae-e445-4a20-9bc1-c5c8480f9158-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7ps4g\" (UID: \"bf57a3ae-e445-4a20-9bc1-c5c8480f9158\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7ps4g" Oct 09 08:13:51 crc kubenswrapper[4715]: I1009 08:13:51.033672 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4cpx\" (UniqueName: \"kubernetes.io/projected/bf57a3ae-e445-4a20-9bc1-c5c8480f9158-kube-api-access-r4cpx\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7ps4g\" (UID: \"bf57a3ae-e445-4a20-9bc1-c5c8480f9158\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7ps4g" Oct 09 08:13:51 crc kubenswrapper[4715]: I1009 08:13:51.088829 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7ps4g" Oct 09 08:13:51 crc kubenswrapper[4715]: I1009 08:13:51.611636 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-7ps4g"] Oct 09 08:13:51 crc kubenswrapper[4715]: I1009 08:13:51.702598 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7ps4g" event={"ID":"bf57a3ae-e445-4a20-9bc1-c5c8480f9158","Type":"ContainerStarted","Data":"2c8e290d3448877e85cee4394ac4ce0f352956f134e376cb729e1459e1d23b17"} Oct 09 08:13:52 crc kubenswrapper[4715]: I1009 08:13:52.711084 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7ps4g" event={"ID":"bf57a3ae-e445-4a20-9bc1-c5c8480f9158","Type":"ContainerStarted","Data":"30fa682d8216c66672548a89ebad6223728e1353ecb37e800deb700b22dc63b4"} Oct 09 08:13:52 crc kubenswrapper[4715]: I1009 08:13:52.732409 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7ps4g" podStartSLOduration=2.174940822 podStartE2EDuration="2.732388721s" podCreationTimestamp="2025-10-09 08:13:50 +0000 UTC" firstStartedPulling="2025-10-09 08:13:51.619299725 +0000 UTC m=+1662.312103733" lastFinishedPulling="2025-10-09 08:13:52.176747624 +0000 UTC m=+1662.869551632" observedRunningTime="2025-10-09 08:13:52.725931627 +0000 UTC m=+1663.418735655" watchObservedRunningTime="2025-10-09 08:13:52.732388721 +0000 UTC m=+1663.425192729" Oct 09 08:14:00 crc kubenswrapper[4715]: I1009 08:14:00.136945 4715 scope.go:117] "RemoveContainer" containerID="603ae8e76a989f73d5fee395ce2ebf8db256706e70cbcec215884dc9ad047a0c" Oct 09 08:14:00 crc kubenswrapper[4715]: E1009 08:14:00.137710 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:14:01 crc kubenswrapper[4715]: I1009 08:14:01.053086 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-4tnj8"] Oct 09 08:14:01 crc kubenswrapper[4715]: I1009 08:14:01.062951 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-4tnj8"] Oct 09 08:14:02 crc kubenswrapper[4715]: I1009 08:14:02.150539 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80ba490d-aaff-4579-bc8a-ffaa4924c7b7" path="/var/lib/kubelet/pods/80ba490d-aaff-4579-bc8a-ffaa4924c7b7/volumes" Oct 09 08:14:09 crc kubenswrapper[4715]: I1009 08:14:09.037435 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-cf8st"] Oct 09 08:14:09 crc kubenswrapper[4715]: I1009 08:14:09.044448 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-cf8st"] Oct 09 08:14:10 crc kubenswrapper[4715]: I1009 08:14:10.147165 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4e65240-0972-4024-a140-425dda8cfa12" path="/var/lib/kubelet/pods/f4e65240-0972-4024-a140-425dda8cfa12/volumes" Oct 09 08:14:12 crc kubenswrapper[4715]: I1009 08:14:12.644810 4715 scope.go:117] "RemoveContainer" containerID="5da445758a62da0974a02ce3720b58b9a61359d0cec40acebf20e3f8fac21a3f" Oct 09 08:14:12 crc kubenswrapper[4715]: I1009 08:14:12.711728 4715 scope.go:117] "RemoveContainer" containerID="eebf4904b85681a1515e9c7904f9fc8fe93f2b1f2f3291223dc4c8637ea7c5c9" Oct 09 08:14:12 crc kubenswrapper[4715]: I1009 08:14:12.739868 4715 scope.go:117] "RemoveContainer" containerID="e2be2a811cd6f5a203c67031da13097a046626e4d8037f8db0c8d330f3721c3b" Oct 09 08:14:12 crc kubenswrapper[4715]: 
I1009 08:14:12.840703 4715 scope.go:117] "RemoveContainer" containerID="4ac55fb52f747d08dc4e6bc380b797c8d2df593c94212a82f0b90bdf9650ecc9" Oct 09 08:14:12 crc kubenswrapper[4715]: I1009 08:14:12.905115 4715 scope.go:117] "RemoveContainer" containerID="f7d0fb161ef47e63c7e2bbb5c808b7c1eea2f502998dab8cd2fc974db01f91dc" Oct 09 08:14:13 crc kubenswrapper[4715]: I1009 08:14:13.137128 4715 scope.go:117] "RemoveContainer" containerID="603ae8e76a989f73d5fee395ce2ebf8db256706e70cbcec215884dc9ad047a0c" Oct 09 08:14:13 crc kubenswrapper[4715]: E1009 08:14:13.137617 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:14:14 crc kubenswrapper[4715]: I1009 08:14:14.034244 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-jgbfr"] Oct 09 08:14:14 crc kubenswrapper[4715]: I1009 08:14:14.045222 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-jgbfr"] Oct 09 08:14:14 crc kubenswrapper[4715]: I1009 08:14:14.148113 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f4dee6e-f935-4bdd-9138-d414e86c0fa2" path="/var/lib/kubelet/pods/2f4dee6e-f935-4bdd-9138-d414e86c0fa2/volumes" Oct 09 08:14:17 crc kubenswrapper[4715]: I1009 08:14:17.028825 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-l7r5w"] Oct 09 08:14:17 crc kubenswrapper[4715]: I1009 08:14:17.040831 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-l7r5w"] Oct 09 08:14:18 crc kubenswrapper[4715]: I1009 08:14:18.147859 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="027df64c-f87d-401f-965c-88c874a854f8" path="/var/lib/kubelet/pods/027df64c-f87d-401f-965c-88c874a854f8/volumes" Oct 09 08:14:26 crc kubenswrapper[4715]: I1009 08:14:26.137877 4715 scope.go:117] "RemoveContainer" containerID="603ae8e76a989f73d5fee395ce2ebf8db256706e70cbcec215884dc9ad047a0c" Oct 09 08:14:26 crc kubenswrapper[4715]: E1009 08:14:26.138906 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:14:29 crc kubenswrapper[4715]: I1009 08:14:29.073694 4715 generic.go:334] "Generic (PLEG): container finished" podID="bf57a3ae-e445-4a20-9bc1-c5c8480f9158" containerID="30fa682d8216c66672548a89ebad6223728e1353ecb37e800deb700b22dc63b4" exitCode=0 Oct 09 08:14:29 crc kubenswrapper[4715]: I1009 08:14:29.073812 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7ps4g" event={"ID":"bf57a3ae-e445-4a20-9bc1-c5c8480f9158","Type":"ContainerDied","Data":"30fa682d8216c66672548a89ebad6223728e1353ecb37e800deb700b22dc63b4"} Oct 09 08:14:30 crc kubenswrapper[4715]: I1009 08:14:30.505530 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7ps4g" Oct 09 08:14:30 crc kubenswrapper[4715]: I1009 08:14:30.550008 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf57a3ae-e445-4a20-9bc1-c5c8480f9158-inventory\") pod \"bf57a3ae-e445-4a20-9bc1-c5c8480f9158\" (UID: \"bf57a3ae-e445-4a20-9bc1-c5c8480f9158\") " Oct 09 08:14:30 crc kubenswrapper[4715]: I1009 08:14:30.550120 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bf57a3ae-e445-4a20-9bc1-c5c8480f9158-ssh-key\") pod \"bf57a3ae-e445-4a20-9bc1-c5c8480f9158\" (UID: \"bf57a3ae-e445-4a20-9bc1-c5c8480f9158\") " Oct 09 08:14:30 crc kubenswrapper[4715]: I1009 08:14:30.550280 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4cpx\" (UniqueName: \"kubernetes.io/projected/bf57a3ae-e445-4a20-9bc1-c5c8480f9158-kube-api-access-r4cpx\") pod \"bf57a3ae-e445-4a20-9bc1-c5c8480f9158\" (UID: \"bf57a3ae-e445-4a20-9bc1-c5c8480f9158\") " Oct 09 08:14:30 crc kubenswrapper[4715]: I1009 08:14:30.560658 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf57a3ae-e445-4a20-9bc1-c5c8480f9158-kube-api-access-r4cpx" (OuterVolumeSpecName: "kube-api-access-r4cpx") pod "bf57a3ae-e445-4a20-9bc1-c5c8480f9158" (UID: "bf57a3ae-e445-4a20-9bc1-c5c8480f9158"). InnerVolumeSpecName "kube-api-access-r4cpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:14:30 crc kubenswrapper[4715]: I1009 08:14:30.579168 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf57a3ae-e445-4a20-9bc1-c5c8480f9158-inventory" (OuterVolumeSpecName: "inventory") pod "bf57a3ae-e445-4a20-9bc1-c5c8480f9158" (UID: "bf57a3ae-e445-4a20-9bc1-c5c8480f9158"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:14:30 crc kubenswrapper[4715]: I1009 08:14:30.579594 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf57a3ae-e445-4a20-9bc1-c5c8480f9158-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bf57a3ae-e445-4a20-9bc1-c5c8480f9158" (UID: "bf57a3ae-e445-4a20-9bc1-c5c8480f9158"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:14:30 crc kubenswrapper[4715]: I1009 08:14:30.653343 4715 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bf57a3ae-e445-4a20-9bc1-c5c8480f9158-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 08:14:30 crc kubenswrapper[4715]: I1009 08:14:30.653401 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4cpx\" (UniqueName: \"kubernetes.io/projected/bf57a3ae-e445-4a20-9bc1-c5c8480f9158-kube-api-access-r4cpx\") on node \"crc\" DevicePath \"\"" Oct 09 08:14:30 crc kubenswrapper[4715]: I1009 08:14:30.653426 4715 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf57a3ae-e445-4a20-9bc1-c5c8480f9158-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 08:14:31 crc kubenswrapper[4715]: I1009 08:14:31.090589 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7ps4g" event={"ID":"bf57a3ae-e445-4a20-9bc1-c5c8480f9158","Type":"ContainerDied","Data":"2c8e290d3448877e85cee4394ac4ce0f352956f134e376cb729e1459e1d23b17"} Oct 09 08:14:31 crc kubenswrapper[4715]: I1009 08:14:31.090915 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c8e290d3448877e85cee4394ac4ce0f352956f134e376cb729e1459e1d23b17" Oct 09 08:14:31 crc kubenswrapper[4715]: I1009 08:14:31.090635 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7ps4g" Oct 09 08:14:31 crc kubenswrapper[4715]: I1009 08:14:31.176135 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5rwzv"] Oct 09 08:14:31 crc kubenswrapper[4715]: E1009 08:14:31.176542 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf57a3ae-e445-4a20-9bc1-c5c8480f9158" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 09 08:14:31 crc kubenswrapper[4715]: I1009 08:14:31.176559 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf57a3ae-e445-4a20-9bc1-c5c8480f9158" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 09 08:14:31 crc kubenswrapper[4715]: I1009 08:14:31.176772 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf57a3ae-e445-4a20-9bc1-c5c8480f9158" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 09 08:14:31 crc kubenswrapper[4715]: I1009 08:14:31.177463 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5rwzv" Oct 09 08:14:31 crc kubenswrapper[4715]: I1009 08:14:31.180040 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fjb" Oct 09 08:14:31 crc kubenswrapper[4715]: I1009 08:14:31.180068 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 08:14:31 crc kubenswrapper[4715]: I1009 08:14:31.180179 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 08:14:31 crc kubenswrapper[4715]: I1009 08:14:31.180516 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 08:14:31 crc kubenswrapper[4715]: I1009 08:14:31.185704 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5rwzv"] Oct 09 08:14:31 crc kubenswrapper[4715]: I1009 08:14:31.269323 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4220a571-fb05-4098-901b-a00a6c79efe8-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5rwzv\" (UID: \"4220a571-fb05-4098-901b-a00a6c79efe8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5rwzv" Oct 09 08:14:31 crc kubenswrapper[4715]: I1009 08:14:31.269617 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxd8z\" (UniqueName: \"kubernetes.io/projected/4220a571-fb05-4098-901b-a00a6c79efe8-kube-api-access-wxd8z\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5rwzv\" (UID: \"4220a571-fb05-4098-901b-a00a6c79efe8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5rwzv" Oct 09 08:14:31 crc kubenswrapper[4715]: I1009 08:14:31.269719 4715 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4220a571-fb05-4098-901b-a00a6c79efe8-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5rwzv\" (UID: \"4220a571-fb05-4098-901b-a00a6c79efe8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5rwzv" Oct 09 08:14:31 crc kubenswrapper[4715]: I1009 08:14:31.372543 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4220a571-fb05-4098-901b-a00a6c79efe8-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5rwzv\" (UID: \"4220a571-fb05-4098-901b-a00a6c79efe8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5rwzv" Oct 09 08:14:31 crc kubenswrapper[4715]: I1009 08:14:31.373071 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxd8z\" (UniqueName: \"kubernetes.io/projected/4220a571-fb05-4098-901b-a00a6c79efe8-kube-api-access-wxd8z\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5rwzv\" (UID: \"4220a571-fb05-4098-901b-a00a6c79efe8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5rwzv" Oct 09 08:14:31 crc kubenswrapper[4715]: I1009 08:14:31.373344 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4220a571-fb05-4098-901b-a00a6c79efe8-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5rwzv\" (UID: \"4220a571-fb05-4098-901b-a00a6c79efe8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5rwzv" Oct 09 08:14:31 crc kubenswrapper[4715]: I1009 08:14:31.376707 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4220a571-fb05-4098-901b-a00a6c79efe8-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5rwzv\" (UID: 
\"4220a571-fb05-4098-901b-a00a6c79efe8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5rwzv" Oct 09 08:14:31 crc kubenswrapper[4715]: I1009 08:14:31.380272 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4220a571-fb05-4098-901b-a00a6c79efe8-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5rwzv\" (UID: \"4220a571-fb05-4098-901b-a00a6c79efe8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5rwzv" Oct 09 08:14:31 crc kubenswrapper[4715]: I1009 08:14:31.398915 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxd8z\" (UniqueName: \"kubernetes.io/projected/4220a571-fb05-4098-901b-a00a6c79efe8-kube-api-access-wxd8z\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5rwzv\" (UID: \"4220a571-fb05-4098-901b-a00a6c79efe8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5rwzv" Oct 09 08:14:31 crc kubenswrapper[4715]: I1009 08:14:31.498104 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5rwzv" Oct 09 08:14:31 crc kubenswrapper[4715]: I1009 08:14:31.819870 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5rwzv"] Oct 09 08:14:32 crc kubenswrapper[4715]: I1009 08:14:32.103503 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5rwzv" event={"ID":"4220a571-fb05-4098-901b-a00a6c79efe8","Type":"ContainerStarted","Data":"0df9ccb18a50a050d71b96c6ab73fd81503f9b6127b01eed0287da48909471da"} Oct 09 08:14:33 crc kubenswrapper[4715]: I1009 08:14:33.113394 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5rwzv" event={"ID":"4220a571-fb05-4098-901b-a00a6c79efe8","Type":"ContainerStarted","Data":"cb395328d17d211d392588a01cd3b9e1bae88950774d65c4beb48f38d4a579ee"} Oct 09 08:14:33 crc kubenswrapper[4715]: I1009 08:14:33.134392 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5rwzv" podStartSLOduration=1.663571358 podStartE2EDuration="2.134369136s" podCreationTimestamp="2025-10-09 08:14:31 +0000 UTC" firstStartedPulling="2025-10-09 08:14:31.823700624 +0000 UTC m=+1702.516504642" lastFinishedPulling="2025-10-09 08:14:32.294498412 +0000 UTC m=+1702.987302420" observedRunningTime="2025-10-09 08:14:33.127737657 +0000 UTC m=+1703.820541665" watchObservedRunningTime="2025-10-09 08:14:33.134369136 +0000 UTC m=+1703.827173144" Oct 09 08:14:37 crc kubenswrapper[4715]: I1009 08:14:37.137451 4715 scope.go:117] "RemoveContainer" containerID="603ae8e76a989f73d5fee395ce2ebf8db256706e70cbcec215884dc9ad047a0c" Oct 09 08:14:37 crc kubenswrapper[4715]: E1009 08:14:37.138244 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:14:41 crc kubenswrapper[4715]: I1009 08:14:41.062847 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-kp8jh"] Oct 09 08:14:41 crc kubenswrapper[4715]: I1009 08:14:41.082484 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-vtjlz"] Oct 09 08:14:41 crc kubenswrapper[4715]: I1009 08:14:41.092377 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-kp8jh"] Oct 09 08:14:41 crc kubenswrapper[4715]: I1009 08:14:41.099329 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-vtjlz"] Oct 09 08:14:41 crc kubenswrapper[4715]: I1009 08:14:41.105658 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-28p9v"] Oct 09 08:14:41 crc kubenswrapper[4715]: I1009 08:14:41.111860 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-28p9v"] Oct 09 08:14:42 crc kubenswrapper[4715]: I1009 08:14:42.155317 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12f445bb-022c-4dd9-8f91-e9612f526a12" path="/var/lib/kubelet/pods/12f445bb-022c-4dd9-8f91-e9612f526a12/volumes" Oct 09 08:14:42 crc kubenswrapper[4715]: I1009 08:14:42.156375 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="544d96bc-6a19-46c5-8162-64a99e333681" path="/var/lib/kubelet/pods/544d96bc-6a19-46c5-8162-64a99e333681/volumes" Oct 09 08:14:42 crc kubenswrapper[4715]: I1009 08:14:42.157510 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80579efc-c70d-41a5-8a56-922ca09a8bd4" 
path="/var/lib/kubelet/pods/80579efc-c70d-41a5-8a56-922ca09a8bd4/volumes" Oct 09 08:14:50 crc kubenswrapper[4715]: I1009 08:14:50.142047 4715 scope.go:117] "RemoveContainer" containerID="603ae8e76a989f73d5fee395ce2ebf8db256706e70cbcec215884dc9ad047a0c" Oct 09 08:14:50 crc kubenswrapper[4715]: E1009 08:14:50.142834 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:14:56 crc kubenswrapper[4715]: I1009 08:14:56.041565 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-4765-account-create-49j2w"] Oct 09 08:14:56 crc kubenswrapper[4715]: I1009 08:14:56.058156 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-4765-account-create-49j2w"] Oct 09 08:14:56 crc kubenswrapper[4715]: I1009 08:14:56.068372 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-c146-account-create-gbjbj"] Oct 09 08:14:56 crc kubenswrapper[4715]: I1009 08:14:56.081109 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-c146-account-create-gbjbj"] Oct 09 08:14:56 crc kubenswrapper[4715]: I1009 08:14:56.150983 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98e40342-58d1-4d38-9422-ade8716c4a55" path="/var/lib/kubelet/pods/98e40342-58d1-4d38-9422-ade8716c4a55/volumes" Oct 09 08:14:56 crc kubenswrapper[4715]: I1009 08:14:56.152619 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bb3c1ec-082e-4c72-b922-fee185aa0b44" path="/var/lib/kubelet/pods/9bb3c1ec-082e-4c72-b922-fee185aa0b44/volumes" Oct 09 08:14:57 crc kubenswrapper[4715]: I1009 08:14:57.028969 4715 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-af8d-account-create-4sbml"] Oct 09 08:14:57 crc kubenswrapper[4715]: I1009 08:14:57.036806 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-af8d-account-create-4sbml"] Oct 09 08:14:58 crc kubenswrapper[4715]: I1009 08:14:58.150287 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d13eea75-c5e8-49f0-99c0-def950b3e0fa" path="/var/lib/kubelet/pods/d13eea75-c5e8-49f0-99c0-def950b3e0fa/volumes" Oct 09 08:15:00 crc kubenswrapper[4715]: I1009 08:15:00.157620 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333295-4n272"] Oct 09 08:15:00 crc kubenswrapper[4715]: I1009 08:15:00.159145 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333295-4n272" Oct 09 08:15:00 crc kubenswrapper[4715]: I1009 08:15:00.163120 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 09 08:15:00 crc kubenswrapper[4715]: I1009 08:15:00.164557 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333295-4n272"] Oct 09 08:15:00 crc kubenswrapper[4715]: I1009 08:15:00.175279 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 09 08:15:00 crc kubenswrapper[4715]: I1009 08:15:00.325990 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ad7a641d-a83c-439c-8fc0-538811f76494-secret-volume\") pod \"collect-profiles-29333295-4n272\" (UID: \"ad7a641d-a83c-439c-8fc0-538811f76494\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333295-4n272" Oct 09 08:15:00 crc 
kubenswrapper[4715]: I1009 08:15:00.326093 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h4vz\" (UniqueName: \"kubernetes.io/projected/ad7a641d-a83c-439c-8fc0-538811f76494-kube-api-access-2h4vz\") pod \"collect-profiles-29333295-4n272\" (UID: \"ad7a641d-a83c-439c-8fc0-538811f76494\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333295-4n272" Oct 09 08:15:00 crc kubenswrapper[4715]: I1009 08:15:00.326324 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ad7a641d-a83c-439c-8fc0-538811f76494-config-volume\") pod \"collect-profiles-29333295-4n272\" (UID: \"ad7a641d-a83c-439c-8fc0-538811f76494\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333295-4n272" Oct 09 08:15:00 crc kubenswrapper[4715]: I1009 08:15:00.427895 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ad7a641d-a83c-439c-8fc0-538811f76494-secret-volume\") pod \"collect-profiles-29333295-4n272\" (UID: \"ad7a641d-a83c-439c-8fc0-538811f76494\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333295-4n272" Oct 09 08:15:00 crc kubenswrapper[4715]: I1009 08:15:00.427997 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h4vz\" (UniqueName: \"kubernetes.io/projected/ad7a641d-a83c-439c-8fc0-538811f76494-kube-api-access-2h4vz\") pod \"collect-profiles-29333295-4n272\" (UID: \"ad7a641d-a83c-439c-8fc0-538811f76494\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333295-4n272" Oct 09 08:15:00 crc kubenswrapper[4715]: I1009 08:15:00.428062 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ad7a641d-a83c-439c-8fc0-538811f76494-config-volume\") pod 
\"collect-profiles-29333295-4n272\" (UID: \"ad7a641d-a83c-439c-8fc0-538811f76494\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333295-4n272" Oct 09 08:15:00 crc kubenswrapper[4715]: I1009 08:15:00.428951 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ad7a641d-a83c-439c-8fc0-538811f76494-config-volume\") pod \"collect-profiles-29333295-4n272\" (UID: \"ad7a641d-a83c-439c-8fc0-538811f76494\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333295-4n272" Oct 09 08:15:00 crc kubenswrapper[4715]: I1009 08:15:00.437057 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ad7a641d-a83c-439c-8fc0-538811f76494-secret-volume\") pod \"collect-profiles-29333295-4n272\" (UID: \"ad7a641d-a83c-439c-8fc0-538811f76494\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333295-4n272" Oct 09 08:15:00 crc kubenswrapper[4715]: I1009 08:15:00.452592 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h4vz\" (UniqueName: \"kubernetes.io/projected/ad7a641d-a83c-439c-8fc0-538811f76494-kube-api-access-2h4vz\") pod \"collect-profiles-29333295-4n272\" (UID: \"ad7a641d-a83c-439c-8fc0-538811f76494\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333295-4n272" Oct 09 08:15:00 crc kubenswrapper[4715]: I1009 08:15:00.485514 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333295-4n272" Oct 09 08:15:00 crc kubenswrapper[4715]: I1009 08:15:00.941719 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333295-4n272"] Oct 09 08:15:01 crc kubenswrapper[4715]: I1009 08:15:01.137257 4715 scope.go:117] "RemoveContainer" containerID="603ae8e76a989f73d5fee395ce2ebf8db256706e70cbcec215884dc9ad047a0c" Oct 09 08:15:01 crc kubenswrapper[4715]: E1009 08:15:01.138017 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:15:01 crc kubenswrapper[4715]: I1009 08:15:01.366515 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333295-4n272" event={"ID":"ad7a641d-a83c-439c-8fc0-538811f76494","Type":"ContainerStarted","Data":"975bacefaa3b881695711405204b44ce9260d4d0fcffe9987002b0937cf2f049"} Oct 09 08:15:01 crc kubenswrapper[4715]: I1009 08:15:01.367598 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333295-4n272" event={"ID":"ad7a641d-a83c-439c-8fc0-538811f76494","Type":"ContainerStarted","Data":"007ab54662e0ebb68ebdcf6647dd8d5c7d511696abb90b054bb71aca6fd4f938"} Oct 09 08:15:01 crc kubenswrapper[4715]: I1009 08:15:01.394883 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29333295-4n272" podStartSLOduration=1.3948562820000001 podStartE2EDuration="1.394856282s" podCreationTimestamp="2025-10-09 08:15:00 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:15:01.38459357 +0000 UTC m=+1732.077397588" watchObservedRunningTime="2025-10-09 08:15:01.394856282 +0000 UTC m=+1732.087660290" Oct 09 08:15:02 crc kubenswrapper[4715]: I1009 08:15:02.380079 4715 generic.go:334] "Generic (PLEG): container finished" podID="ad7a641d-a83c-439c-8fc0-538811f76494" containerID="975bacefaa3b881695711405204b44ce9260d4d0fcffe9987002b0937cf2f049" exitCode=0 Oct 09 08:15:02 crc kubenswrapper[4715]: I1009 08:15:02.380219 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333295-4n272" event={"ID":"ad7a641d-a83c-439c-8fc0-538811f76494","Type":"ContainerDied","Data":"975bacefaa3b881695711405204b44ce9260d4d0fcffe9987002b0937cf2f049"} Oct 09 08:15:03 crc kubenswrapper[4715]: I1009 08:15:03.709591 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333295-4n272" Oct 09 08:15:03 crc kubenswrapper[4715]: I1009 08:15:03.895025 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ad7a641d-a83c-439c-8fc0-538811f76494-secret-volume\") pod \"ad7a641d-a83c-439c-8fc0-538811f76494\" (UID: \"ad7a641d-a83c-439c-8fc0-538811f76494\") " Oct 09 08:15:03 crc kubenswrapper[4715]: I1009 08:15:03.895121 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ad7a641d-a83c-439c-8fc0-538811f76494-config-volume\") pod \"ad7a641d-a83c-439c-8fc0-538811f76494\" (UID: \"ad7a641d-a83c-439c-8fc0-538811f76494\") " Oct 09 08:15:03 crc kubenswrapper[4715]: I1009 08:15:03.895295 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2h4vz\" (UniqueName: 
\"kubernetes.io/projected/ad7a641d-a83c-439c-8fc0-538811f76494-kube-api-access-2h4vz\") pod \"ad7a641d-a83c-439c-8fc0-538811f76494\" (UID: \"ad7a641d-a83c-439c-8fc0-538811f76494\") " Oct 09 08:15:03 crc kubenswrapper[4715]: I1009 08:15:03.896122 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad7a641d-a83c-439c-8fc0-538811f76494-config-volume" (OuterVolumeSpecName: "config-volume") pod "ad7a641d-a83c-439c-8fc0-538811f76494" (UID: "ad7a641d-a83c-439c-8fc0-538811f76494"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:15:03 crc kubenswrapper[4715]: I1009 08:15:03.901093 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad7a641d-a83c-439c-8fc0-538811f76494-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ad7a641d-a83c-439c-8fc0-538811f76494" (UID: "ad7a641d-a83c-439c-8fc0-538811f76494"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:15:03 crc kubenswrapper[4715]: I1009 08:15:03.901230 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad7a641d-a83c-439c-8fc0-538811f76494-kube-api-access-2h4vz" (OuterVolumeSpecName: "kube-api-access-2h4vz") pod "ad7a641d-a83c-439c-8fc0-538811f76494" (UID: "ad7a641d-a83c-439c-8fc0-538811f76494"). InnerVolumeSpecName "kube-api-access-2h4vz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:15:03 crc kubenswrapper[4715]: I1009 08:15:03.998185 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2h4vz\" (UniqueName: \"kubernetes.io/projected/ad7a641d-a83c-439c-8fc0-538811f76494-kube-api-access-2h4vz\") on node \"crc\" DevicePath \"\"" Oct 09 08:15:03 crc kubenswrapper[4715]: I1009 08:15:03.998238 4715 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ad7a641d-a83c-439c-8fc0-538811f76494-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 09 08:15:03 crc kubenswrapper[4715]: I1009 08:15:03.998255 4715 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ad7a641d-a83c-439c-8fc0-538811f76494-config-volume\") on node \"crc\" DevicePath \"\"" Oct 09 08:15:04 crc kubenswrapper[4715]: I1009 08:15:04.398383 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333295-4n272" event={"ID":"ad7a641d-a83c-439c-8fc0-538811f76494","Type":"ContainerDied","Data":"007ab54662e0ebb68ebdcf6647dd8d5c7d511696abb90b054bb71aca6fd4f938"} Oct 09 08:15:04 crc kubenswrapper[4715]: I1009 08:15:04.398483 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="007ab54662e0ebb68ebdcf6647dd8d5c7d511696abb90b054bb71aca6fd4f938" Oct 09 08:15:04 crc kubenswrapper[4715]: I1009 08:15:04.398568 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333295-4n272" Oct 09 08:15:13 crc kubenswrapper[4715]: I1009 08:15:13.002612 4715 scope.go:117] "RemoveContainer" containerID="d4dfdcdbb128b61403d55abeaa594c743967e826fccde846e76205b760e1824b" Oct 09 08:15:13 crc kubenswrapper[4715]: I1009 08:15:13.037900 4715 scope.go:117] "RemoveContainer" containerID="d9e29a16fff26e5e088cbcc225e4d1032062bd08e096873954e1ba3dec79acd9" Oct 09 08:15:13 crc kubenswrapper[4715]: I1009 08:15:13.120054 4715 scope.go:117] "RemoveContainer" containerID="23d13f798740ab459a6412176221b61828a195e988a13fbe174d4f7a7c8b3bf6" Oct 09 08:15:13 crc kubenswrapper[4715]: I1009 08:15:13.178037 4715 scope.go:117] "RemoveContainer" containerID="e99e3d7744469aa877b76e70adf0f30179857d5cdf8bb9c2392c0aade85d5d58" Oct 09 08:15:13 crc kubenswrapper[4715]: I1009 08:15:13.228167 4715 scope.go:117] "RemoveContainer" containerID="feaef52a52046c5f6a86f65074c198fc365442cade6012619b300e3742d28638" Oct 09 08:15:13 crc kubenswrapper[4715]: I1009 08:15:13.311799 4715 scope.go:117] "RemoveContainer" containerID="44bbd165b0581eccdab58e0d133dd9465cdd387dacb7d18d5d29576511c8478d" Oct 09 08:15:13 crc kubenswrapper[4715]: I1009 08:15:13.341792 4715 scope.go:117] "RemoveContainer" containerID="b8ed99a425aa9c369f9b95b06ae7d8299eff1ee29789b000aa1a34966e3d750b" Oct 09 08:15:13 crc kubenswrapper[4715]: I1009 08:15:13.379701 4715 scope.go:117] "RemoveContainer" containerID="371a8fa3d04735ed76bce22928354da20b59bb0f2ca96fad013668dab63567fd" Oct 09 08:15:14 crc kubenswrapper[4715]: I1009 08:15:14.136918 4715 scope.go:117] "RemoveContainer" containerID="603ae8e76a989f73d5fee395ce2ebf8db256706e70cbcec215884dc9ad047a0c" Oct 09 08:15:14 crc kubenswrapper[4715]: E1009 08:15:14.137521 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:15:19 crc kubenswrapper[4715]: I1009 08:15:19.066112 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4vnss"] Oct 09 08:15:19 crc kubenswrapper[4715]: I1009 08:15:19.081468 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4vnss"] Oct 09 08:15:20 crc kubenswrapper[4715]: I1009 08:15:20.153060 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9997aadd-2c07-463c-b694-657dbd229eaf" path="/var/lib/kubelet/pods/9997aadd-2c07-463c-b694-657dbd229eaf/volumes" Oct 09 08:15:28 crc kubenswrapper[4715]: I1009 08:15:28.137600 4715 scope.go:117] "RemoveContainer" containerID="603ae8e76a989f73d5fee395ce2ebf8db256706e70cbcec215884dc9ad047a0c" Oct 09 08:15:28 crc kubenswrapper[4715]: E1009 08:15:28.139577 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:15:28 crc kubenswrapper[4715]: I1009 08:15:28.666932 4715 generic.go:334] "Generic (PLEG): container finished" podID="4220a571-fb05-4098-901b-a00a6c79efe8" containerID="cb395328d17d211d392588a01cd3b9e1bae88950774d65c4beb48f38d4a579ee" exitCode=2 Oct 09 08:15:28 crc kubenswrapper[4715]: I1009 08:15:28.666982 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5rwzv" 
event={"ID":"4220a571-fb05-4098-901b-a00a6c79efe8","Type":"ContainerDied","Data":"cb395328d17d211d392588a01cd3b9e1bae88950774d65c4beb48f38d4a579ee"} Oct 09 08:15:30 crc kubenswrapper[4715]: I1009 08:15:30.094544 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5rwzv" Oct 09 08:15:30 crc kubenswrapper[4715]: I1009 08:15:30.118012 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxd8z\" (UniqueName: \"kubernetes.io/projected/4220a571-fb05-4098-901b-a00a6c79efe8-kube-api-access-wxd8z\") pod \"4220a571-fb05-4098-901b-a00a6c79efe8\" (UID: \"4220a571-fb05-4098-901b-a00a6c79efe8\") " Oct 09 08:15:30 crc kubenswrapper[4715]: I1009 08:15:30.118204 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4220a571-fb05-4098-901b-a00a6c79efe8-ssh-key\") pod \"4220a571-fb05-4098-901b-a00a6c79efe8\" (UID: \"4220a571-fb05-4098-901b-a00a6c79efe8\") " Oct 09 08:15:30 crc kubenswrapper[4715]: I1009 08:15:30.118383 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4220a571-fb05-4098-901b-a00a6c79efe8-inventory\") pod \"4220a571-fb05-4098-901b-a00a6c79efe8\" (UID: \"4220a571-fb05-4098-901b-a00a6c79efe8\") " Oct 09 08:15:30 crc kubenswrapper[4715]: I1009 08:15:30.124288 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4220a571-fb05-4098-901b-a00a6c79efe8-kube-api-access-wxd8z" (OuterVolumeSpecName: "kube-api-access-wxd8z") pod "4220a571-fb05-4098-901b-a00a6c79efe8" (UID: "4220a571-fb05-4098-901b-a00a6c79efe8"). InnerVolumeSpecName "kube-api-access-wxd8z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:15:30 crc kubenswrapper[4715]: I1009 08:15:30.149626 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4220a571-fb05-4098-901b-a00a6c79efe8-inventory" (OuterVolumeSpecName: "inventory") pod "4220a571-fb05-4098-901b-a00a6c79efe8" (UID: "4220a571-fb05-4098-901b-a00a6c79efe8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:15:30 crc kubenswrapper[4715]: I1009 08:15:30.156210 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4220a571-fb05-4098-901b-a00a6c79efe8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4220a571-fb05-4098-901b-a00a6c79efe8" (UID: "4220a571-fb05-4098-901b-a00a6c79efe8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:15:30 crc kubenswrapper[4715]: I1009 08:15:30.221617 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxd8z\" (UniqueName: \"kubernetes.io/projected/4220a571-fb05-4098-901b-a00a6c79efe8-kube-api-access-wxd8z\") on node \"crc\" DevicePath \"\"" Oct 09 08:15:30 crc kubenswrapper[4715]: I1009 08:15:30.221650 4715 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4220a571-fb05-4098-901b-a00a6c79efe8-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 08:15:30 crc kubenswrapper[4715]: I1009 08:15:30.221664 4715 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4220a571-fb05-4098-901b-a00a6c79efe8-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 08:15:30 crc kubenswrapper[4715]: I1009 08:15:30.691727 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5rwzv" 
event={"ID":"4220a571-fb05-4098-901b-a00a6c79efe8","Type":"ContainerDied","Data":"0df9ccb18a50a050d71b96c6ab73fd81503f9b6127b01eed0287da48909471da"} Oct 09 08:15:30 crc kubenswrapper[4715]: I1009 08:15:30.691803 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0df9ccb18a50a050d71b96c6ab73fd81503f9b6127b01eed0287da48909471da" Oct 09 08:15:30 crc kubenswrapper[4715]: I1009 08:15:30.691803 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5rwzv" Oct 09 08:15:37 crc kubenswrapper[4715]: I1009 08:15:37.027630 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jtk8t"] Oct 09 08:15:37 crc kubenswrapper[4715]: E1009 08:15:37.028604 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4220a571-fb05-4098-901b-a00a6c79efe8" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 09 08:15:37 crc kubenswrapper[4715]: I1009 08:15:37.028621 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="4220a571-fb05-4098-901b-a00a6c79efe8" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 09 08:15:37 crc kubenswrapper[4715]: E1009 08:15:37.028639 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad7a641d-a83c-439c-8fc0-538811f76494" containerName="collect-profiles" Oct 09 08:15:37 crc kubenswrapper[4715]: I1009 08:15:37.028647 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad7a641d-a83c-439c-8fc0-538811f76494" containerName="collect-profiles" Oct 09 08:15:37 crc kubenswrapper[4715]: I1009 08:15:37.028900 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad7a641d-a83c-439c-8fc0-538811f76494" containerName="collect-profiles" Oct 09 08:15:37 crc kubenswrapper[4715]: I1009 08:15:37.028918 4715 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4220a571-fb05-4098-901b-a00a6c79efe8" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 09 08:15:37 crc kubenswrapper[4715]: I1009 08:15:37.029683 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jtk8t" Oct 09 08:15:37 crc kubenswrapper[4715]: I1009 08:15:37.033889 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 08:15:37 crc kubenswrapper[4715]: I1009 08:15:37.034222 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 08:15:37 crc kubenswrapper[4715]: I1009 08:15:37.035027 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 08:15:37 crc kubenswrapper[4715]: I1009 08:15:37.035730 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fjb" Oct 09 08:15:37 crc kubenswrapper[4715]: I1009 08:15:37.038311 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jtk8t"] Oct 09 08:15:37 crc kubenswrapper[4715]: I1009 08:15:37.088892 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d2dbb06-3154-428b-991c-09b567c9136e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jtk8t\" (UID: \"6d2dbb06-3154-428b-991c-09b567c9136e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jtk8t" Oct 09 08:15:37 crc kubenswrapper[4715]: I1009 08:15:37.088952 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9sxp\" (UniqueName: \"kubernetes.io/projected/6d2dbb06-3154-428b-991c-09b567c9136e-kube-api-access-f9sxp\") pod 
\"configure-os-edpm-deployment-openstack-edpm-ipam-jtk8t\" (UID: \"6d2dbb06-3154-428b-991c-09b567c9136e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jtk8t" Oct 09 08:15:37 crc kubenswrapper[4715]: I1009 08:15:37.089355 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6d2dbb06-3154-428b-991c-09b567c9136e-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jtk8t\" (UID: \"6d2dbb06-3154-428b-991c-09b567c9136e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jtk8t" Oct 09 08:15:37 crc kubenswrapper[4715]: I1009 08:15:37.191631 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6d2dbb06-3154-428b-991c-09b567c9136e-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jtk8t\" (UID: \"6d2dbb06-3154-428b-991c-09b567c9136e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jtk8t" Oct 09 08:15:37 crc kubenswrapper[4715]: I1009 08:15:37.191734 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d2dbb06-3154-428b-991c-09b567c9136e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jtk8t\" (UID: \"6d2dbb06-3154-428b-991c-09b567c9136e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jtk8t" Oct 09 08:15:37 crc kubenswrapper[4715]: I1009 08:15:37.191757 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9sxp\" (UniqueName: \"kubernetes.io/projected/6d2dbb06-3154-428b-991c-09b567c9136e-kube-api-access-f9sxp\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jtk8t\" (UID: \"6d2dbb06-3154-428b-991c-09b567c9136e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jtk8t" Oct 09 08:15:37 crc kubenswrapper[4715]: I1009 08:15:37.198260 4715 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6d2dbb06-3154-428b-991c-09b567c9136e-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jtk8t\" (UID: \"6d2dbb06-3154-428b-991c-09b567c9136e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jtk8t" Oct 09 08:15:37 crc kubenswrapper[4715]: I1009 08:15:37.199029 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d2dbb06-3154-428b-991c-09b567c9136e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jtk8t\" (UID: \"6d2dbb06-3154-428b-991c-09b567c9136e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jtk8t" Oct 09 08:15:37 crc kubenswrapper[4715]: I1009 08:15:37.211045 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9sxp\" (UniqueName: \"kubernetes.io/projected/6d2dbb06-3154-428b-991c-09b567c9136e-kube-api-access-f9sxp\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-jtk8t\" (UID: \"6d2dbb06-3154-428b-991c-09b567c9136e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jtk8t" Oct 09 08:15:37 crc kubenswrapper[4715]: I1009 08:15:37.353461 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jtk8t" Oct 09 08:15:37 crc kubenswrapper[4715]: W1009 08:15:37.842476 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d2dbb06_3154_428b_991c_09b567c9136e.slice/crio-2e8be081fda357247d424a41701e14c8ac0582b932e56e756fba1bfeaf0379c2 WatchSource:0}: Error finding container 2e8be081fda357247d424a41701e14c8ac0582b932e56e756fba1bfeaf0379c2: Status 404 returned error can't find the container with id 2e8be081fda357247d424a41701e14c8ac0582b932e56e756fba1bfeaf0379c2 Oct 09 08:15:37 crc kubenswrapper[4715]: I1009 08:15:37.847184 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jtk8t"] Oct 09 08:15:38 crc kubenswrapper[4715]: I1009 08:15:38.037834 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4tzm2"] Oct 09 08:15:38 crc kubenswrapper[4715]: I1009 08:15:38.044291 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-bmknt"] Oct 09 08:15:38 crc kubenswrapper[4715]: I1009 08:15:38.051943 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4tzm2"] Oct 09 08:15:38 crc kubenswrapper[4715]: I1009 08:15:38.059039 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-bmknt"] Oct 09 08:15:38 crc kubenswrapper[4715]: I1009 08:15:38.146023 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02682196-5950-427c-908a-bb791173de68" path="/var/lib/kubelet/pods/02682196-5950-427c-908a-bb791173de68/volumes" Oct 09 08:15:38 crc kubenswrapper[4715]: I1009 08:15:38.146591 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0884103-2ca9-41fa-94ab-19ce6ba49364" path="/var/lib/kubelet/pods/a0884103-2ca9-41fa-94ab-19ce6ba49364/volumes" 
Oct 09 08:15:38 crc kubenswrapper[4715]: I1009 08:15:38.763579 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jtk8t" event={"ID":"6d2dbb06-3154-428b-991c-09b567c9136e","Type":"ContainerStarted","Data":"02b9173251bc4ea5b059d09d9ad88384a06144b60e3465d84519f3d6ac06ba04"} Oct 09 08:15:38 crc kubenswrapper[4715]: I1009 08:15:38.763858 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jtk8t" event={"ID":"6d2dbb06-3154-428b-991c-09b567c9136e","Type":"ContainerStarted","Data":"2e8be081fda357247d424a41701e14c8ac0582b932e56e756fba1bfeaf0379c2"} Oct 09 08:15:38 crc kubenswrapper[4715]: I1009 08:15:38.793172 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jtk8t" podStartSLOduration=1.143878889 podStartE2EDuration="1.793151216s" podCreationTimestamp="2025-10-09 08:15:37 +0000 UTC" firstStartedPulling="2025-10-09 08:15:37.844730878 +0000 UTC m=+1768.537534886" lastFinishedPulling="2025-10-09 08:15:38.494003205 +0000 UTC m=+1769.186807213" observedRunningTime="2025-10-09 08:15:38.780452834 +0000 UTC m=+1769.473256862" watchObservedRunningTime="2025-10-09 08:15:38.793151216 +0000 UTC m=+1769.485955224" Oct 09 08:15:42 crc kubenswrapper[4715]: I1009 08:15:42.137717 4715 scope.go:117] "RemoveContainer" containerID="603ae8e76a989f73d5fee395ce2ebf8db256706e70cbcec215884dc9ad047a0c" Oct 09 08:15:42 crc kubenswrapper[4715]: E1009 08:15:42.138500 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" 
podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:15:56 crc kubenswrapper[4715]: I1009 08:15:56.137378 4715 scope.go:117] "RemoveContainer" containerID="603ae8e76a989f73d5fee395ce2ebf8db256706e70cbcec215884dc9ad047a0c" Oct 09 08:15:56 crc kubenswrapper[4715]: E1009 08:15:56.138312 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:16:07 crc kubenswrapper[4715]: I1009 08:16:07.137643 4715 scope.go:117] "RemoveContainer" containerID="603ae8e76a989f73d5fee395ce2ebf8db256706e70cbcec215884dc9ad047a0c" Oct 09 08:16:07 crc kubenswrapper[4715]: E1009 08:16:07.138360 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:16:13 crc kubenswrapper[4715]: I1009 08:16:13.570183 4715 scope.go:117] "RemoveContainer" containerID="8b9912c6ccfe6d6703f155d3ed91a94aa1ea0876f4422c1628c663a4cb20ff11" Oct 09 08:16:13 crc kubenswrapper[4715]: I1009 08:16:13.622584 4715 scope.go:117] "RemoveContainer" containerID="38e59a25e7417c90e107247d7af630d72c110fe1aebdebd65edc704ec9d7a5ba" Oct 09 08:16:13 crc kubenswrapper[4715]: I1009 08:16:13.751265 4715 scope.go:117] "RemoveContainer" containerID="d6866cc7a0c40e97ff268190f7d249a154e01b5a63f5ef7e0c4ebd812f1cff2f" Oct 09 08:16:21 crc kubenswrapper[4715]: I1009 08:16:21.137945 4715 
scope.go:117] "RemoveContainer" containerID="603ae8e76a989f73d5fee395ce2ebf8db256706e70cbcec215884dc9ad047a0c" Oct 09 08:16:21 crc kubenswrapper[4715]: E1009 08:16:21.138916 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:16:21 crc kubenswrapper[4715]: I1009 08:16:21.160858 4715 generic.go:334] "Generic (PLEG): container finished" podID="6d2dbb06-3154-428b-991c-09b567c9136e" containerID="02b9173251bc4ea5b059d09d9ad88384a06144b60e3465d84519f3d6ac06ba04" exitCode=0 Oct 09 08:16:21 crc kubenswrapper[4715]: I1009 08:16:21.160930 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jtk8t" event={"ID":"6d2dbb06-3154-428b-991c-09b567c9136e","Type":"ContainerDied","Data":"02b9173251bc4ea5b059d09d9ad88384a06144b60e3465d84519f3d6ac06ba04"} Oct 09 08:16:22 crc kubenswrapper[4715]: I1009 08:16:22.620686 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jtk8t" Oct 09 08:16:22 crc kubenswrapper[4715]: I1009 08:16:22.775826 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9sxp\" (UniqueName: \"kubernetes.io/projected/6d2dbb06-3154-428b-991c-09b567c9136e-kube-api-access-f9sxp\") pod \"6d2dbb06-3154-428b-991c-09b567c9136e\" (UID: \"6d2dbb06-3154-428b-991c-09b567c9136e\") " Oct 09 08:16:22 crc kubenswrapper[4715]: I1009 08:16:22.776047 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d2dbb06-3154-428b-991c-09b567c9136e-inventory\") pod \"6d2dbb06-3154-428b-991c-09b567c9136e\" (UID: \"6d2dbb06-3154-428b-991c-09b567c9136e\") " Oct 09 08:16:22 crc kubenswrapper[4715]: I1009 08:16:22.776076 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6d2dbb06-3154-428b-991c-09b567c9136e-ssh-key\") pod \"6d2dbb06-3154-428b-991c-09b567c9136e\" (UID: \"6d2dbb06-3154-428b-991c-09b567c9136e\") " Oct 09 08:16:22 crc kubenswrapper[4715]: I1009 08:16:22.783352 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d2dbb06-3154-428b-991c-09b567c9136e-kube-api-access-f9sxp" (OuterVolumeSpecName: "kube-api-access-f9sxp") pod "6d2dbb06-3154-428b-991c-09b567c9136e" (UID: "6d2dbb06-3154-428b-991c-09b567c9136e"). InnerVolumeSpecName "kube-api-access-f9sxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:16:22 crc kubenswrapper[4715]: I1009 08:16:22.811176 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d2dbb06-3154-428b-991c-09b567c9136e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6d2dbb06-3154-428b-991c-09b567c9136e" (UID: "6d2dbb06-3154-428b-991c-09b567c9136e"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:16:22 crc kubenswrapper[4715]: I1009 08:16:22.826859 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d2dbb06-3154-428b-991c-09b567c9136e-inventory" (OuterVolumeSpecName: "inventory") pod "6d2dbb06-3154-428b-991c-09b567c9136e" (UID: "6d2dbb06-3154-428b-991c-09b567c9136e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:16:22 crc kubenswrapper[4715]: I1009 08:16:22.878612 4715 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d2dbb06-3154-428b-991c-09b567c9136e-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 08:16:22 crc kubenswrapper[4715]: I1009 08:16:22.878836 4715 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6d2dbb06-3154-428b-991c-09b567c9136e-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 08:16:22 crc kubenswrapper[4715]: I1009 08:16:22.878845 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9sxp\" (UniqueName: \"kubernetes.io/projected/6d2dbb06-3154-428b-991c-09b567c9136e-kube-api-access-f9sxp\") on node \"crc\" DevicePath \"\"" Oct 09 08:16:23 crc kubenswrapper[4715]: I1009 08:16:23.044967 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-lwr98"] Oct 09 08:16:23 crc kubenswrapper[4715]: I1009 08:16:23.055254 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-lwr98"] Oct 09 08:16:23 crc kubenswrapper[4715]: I1009 08:16:23.190303 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jtk8t" event={"ID":"6d2dbb06-3154-428b-991c-09b567c9136e","Type":"ContainerDied","Data":"2e8be081fda357247d424a41701e14c8ac0582b932e56e756fba1bfeaf0379c2"} Oct 09 08:16:23 crc kubenswrapper[4715]: I1009 08:16:23.190369 4715 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e8be081fda357247d424a41701e14c8ac0582b932e56e756fba1bfeaf0379c2" Oct 09 08:16:23 crc kubenswrapper[4715]: I1009 08:16:23.190519 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-jtk8t" Oct 09 08:16:23 crc kubenswrapper[4715]: I1009 08:16:23.281145 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-dpkqg"] Oct 09 08:16:23 crc kubenswrapper[4715]: E1009 08:16:23.281980 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d2dbb06-3154-428b-991c-09b567c9136e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 09 08:16:23 crc kubenswrapper[4715]: I1009 08:16:23.281999 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d2dbb06-3154-428b-991c-09b567c9136e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 09 08:16:23 crc kubenswrapper[4715]: I1009 08:16:23.282263 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d2dbb06-3154-428b-991c-09b567c9136e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 09 08:16:23 crc kubenswrapper[4715]: I1009 08:16:23.283063 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-dpkqg" Oct 09 08:16:23 crc kubenswrapper[4715]: I1009 08:16:23.285350 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 08:16:23 crc kubenswrapper[4715]: I1009 08:16:23.285383 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 08:16:23 crc kubenswrapper[4715]: I1009 08:16:23.285612 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 08:16:23 crc kubenswrapper[4715]: I1009 08:16:23.287667 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fjb" Oct 09 08:16:23 crc kubenswrapper[4715]: I1009 08:16:23.291349 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-dpkqg"] Oct 09 08:16:23 crc kubenswrapper[4715]: I1009 08:16:23.388225 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0986c120-0684-4916-b753-677c2d3e6798-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-dpkqg\" (UID: \"0986c120-0684-4916-b753-677c2d3e6798\") " pod="openstack/ssh-known-hosts-edpm-deployment-dpkqg" Oct 09 08:16:23 crc kubenswrapper[4715]: I1009 08:16:23.388277 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/0986c120-0684-4916-b753-677c2d3e6798-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-dpkqg\" (UID: \"0986c120-0684-4916-b753-677c2d3e6798\") " pod="openstack/ssh-known-hosts-edpm-deployment-dpkqg" Oct 09 08:16:23 crc kubenswrapper[4715]: I1009 08:16:23.388341 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-vxcpn\" (UniqueName: \"kubernetes.io/projected/0986c120-0684-4916-b753-677c2d3e6798-kube-api-access-vxcpn\") pod \"ssh-known-hosts-edpm-deployment-dpkqg\" (UID: \"0986c120-0684-4916-b753-677c2d3e6798\") " pod="openstack/ssh-known-hosts-edpm-deployment-dpkqg" Oct 09 08:16:23 crc kubenswrapper[4715]: I1009 08:16:23.490594 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0986c120-0684-4916-b753-677c2d3e6798-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-dpkqg\" (UID: \"0986c120-0684-4916-b753-677c2d3e6798\") " pod="openstack/ssh-known-hosts-edpm-deployment-dpkqg" Oct 09 08:16:23 crc kubenswrapper[4715]: I1009 08:16:23.490675 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/0986c120-0684-4916-b753-677c2d3e6798-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-dpkqg\" (UID: \"0986c120-0684-4916-b753-677c2d3e6798\") " pod="openstack/ssh-known-hosts-edpm-deployment-dpkqg" Oct 09 08:16:23 crc kubenswrapper[4715]: I1009 08:16:23.490746 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxcpn\" (UniqueName: \"kubernetes.io/projected/0986c120-0684-4916-b753-677c2d3e6798-kube-api-access-vxcpn\") pod \"ssh-known-hosts-edpm-deployment-dpkqg\" (UID: \"0986c120-0684-4916-b753-677c2d3e6798\") " pod="openstack/ssh-known-hosts-edpm-deployment-dpkqg" Oct 09 08:16:23 crc kubenswrapper[4715]: I1009 08:16:23.494531 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/0986c120-0684-4916-b753-677c2d3e6798-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-dpkqg\" (UID: \"0986c120-0684-4916-b753-677c2d3e6798\") " pod="openstack/ssh-known-hosts-edpm-deployment-dpkqg" Oct 09 08:16:23 crc kubenswrapper[4715]: I1009 08:16:23.494965 4715 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0986c120-0684-4916-b753-677c2d3e6798-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-dpkqg\" (UID: \"0986c120-0684-4916-b753-677c2d3e6798\") " pod="openstack/ssh-known-hosts-edpm-deployment-dpkqg" Oct 09 08:16:23 crc kubenswrapper[4715]: I1009 08:16:23.506484 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxcpn\" (UniqueName: \"kubernetes.io/projected/0986c120-0684-4916-b753-677c2d3e6798-kube-api-access-vxcpn\") pod \"ssh-known-hosts-edpm-deployment-dpkqg\" (UID: \"0986c120-0684-4916-b753-677c2d3e6798\") " pod="openstack/ssh-known-hosts-edpm-deployment-dpkqg" Oct 09 08:16:23 crc kubenswrapper[4715]: I1009 08:16:23.599688 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-dpkqg" Oct 09 08:16:24 crc kubenswrapper[4715]: I1009 08:16:24.150000 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c64d743-936e-460c-87d8-d0aea119fc3c" path="/var/lib/kubelet/pods/8c64d743-936e-460c-87d8-d0aea119fc3c/volumes" Oct 09 08:16:24 crc kubenswrapper[4715]: I1009 08:16:24.161657 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-dpkqg"] Oct 09 08:16:24 crc kubenswrapper[4715]: I1009 08:16:24.202595 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-dpkqg" event={"ID":"0986c120-0684-4916-b753-677c2d3e6798","Type":"ContainerStarted","Data":"5fc8de39fa443d337fdf29bcab05b537fc905951dd84e13265cb37583493e0ca"} Oct 09 08:16:25 crc kubenswrapper[4715]: I1009 08:16:25.211700 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-dpkqg" 
event={"ID":"0986c120-0684-4916-b753-677c2d3e6798","Type":"ContainerStarted","Data":"d8a32e64673adbca3b4535edf3f4068c0451f345950101626e77e23b582f0fb5"} Oct 09 08:16:25 crc kubenswrapper[4715]: I1009 08:16:25.237004 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-dpkqg" podStartSLOduration=1.805147469 podStartE2EDuration="2.236980169s" podCreationTimestamp="2025-10-09 08:16:23 +0000 UTC" firstStartedPulling="2025-10-09 08:16:24.168309587 +0000 UTC m=+1814.861113605" lastFinishedPulling="2025-10-09 08:16:24.600142297 +0000 UTC m=+1815.292946305" observedRunningTime="2025-10-09 08:16:25.231923765 +0000 UTC m=+1815.924727773" watchObservedRunningTime="2025-10-09 08:16:25.236980169 +0000 UTC m=+1815.929784187" Oct 09 08:16:31 crc kubenswrapper[4715]: I1009 08:16:31.269614 4715 generic.go:334] "Generic (PLEG): container finished" podID="0986c120-0684-4916-b753-677c2d3e6798" containerID="d8a32e64673adbca3b4535edf3f4068c0451f345950101626e77e23b582f0fb5" exitCode=0 Oct 09 08:16:31 crc kubenswrapper[4715]: I1009 08:16:31.269787 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-dpkqg" event={"ID":"0986c120-0684-4916-b753-677c2d3e6798","Type":"ContainerDied","Data":"d8a32e64673adbca3b4535edf3f4068c0451f345950101626e77e23b582f0fb5"} Oct 09 08:16:32 crc kubenswrapper[4715]: I1009 08:16:32.640544 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-dpkqg" Oct 09 08:16:32 crc kubenswrapper[4715]: I1009 08:16:32.786228 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxcpn\" (UniqueName: \"kubernetes.io/projected/0986c120-0684-4916-b753-677c2d3e6798-kube-api-access-vxcpn\") pod \"0986c120-0684-4916-b753-677c2d3e6798\" (UID: \"0986c120-0684-4916-b753-677c2d3e6798\") " Oct 09 08:16:32 crc kubenswrapper[4715]: I1009 08:16:32.786398 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0986c120-0684-4916-b753-677c2d3e6798-ssh-key-openstack-edpm-ipam\") pod \"0986c120-0684-4916-b753-677c2d3e6798\" (UID: \"0986c120-0684-4916-b753-677c2d3e6798\") " Oct 09 08:16:32 crc kubenswrapper[4715]: I1009 08:16:32.786515 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/0986c120-0684-4916-b753-677c2d3e6798-inventory-0\") pod \"0986c120-0684-4916-b753-677c2d3e6798\" (UID: \"0986c120-0684-4916-b753-677c2d3e6798\") " Oct 09 08:16:32 crc kubenswrapper[4715]: I1009 08:16:32.793381 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0986c120-0684-4916-b753-677c2d3e6798-kube-api-access-vxcpn" (OuterVolumeSpecName: "kube-api-access-vxcpn") pod "0986c120-0684-4916-b753-677c2d3e6798" (UID: "0986c120-0684-4916-b753-677c2d3e6798"). InnerVolumeSpecName "kube-api-access-vxcpn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:16:32 crc kubenswrapper[4715]: I1009 08:16:32.816765 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0986c120-0684-4916-b753-677c2d3e6798-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0986c120-0684-4916-b753-677c2d3e6798" (UID: "0986c120-0684-4916-b753-677c2d3e6798"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:16:32 crc kubenswrapper[4715]: I1009 08:16:32.818686 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0986c120-0684-4916-b753-677c2d3e6798-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "0986c120-0684-4916-b753-677c2d3e6798" (UID: "0986c120-0684-4916-b753-677c2d3e6798"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:16:32 crc kubenswrapper[4715]: I1009 08:16:32.889172 4715 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0986c120-0684-4916-b753-677c2d3e6798-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 09 08:16:32 crc kubenswrapper[4715]: I1009 08:16:32.889210 4715 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/0986c120-0684-4916-b753-677c2d3e6798-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 09 08:16:32 crc kubenswrapper[4715]: I1009 08:16:32.889222 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxcpn\" (UniqueName: \"kubernetes.io/projected/0986c120-0684-4916-b753-677c2d3e6798-kube-api-access-vxcpn\") on node \"crc\" DevicePath \"\"" Oct 09 08:16:33 crc kubenswrapper[4715]: I1009 08:16:33.289154 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-dpkqg" 
event={"ID":"0986c120-0684-4916-b753-677c2d3e6798","Type":"ContainerDied","Data":"5fc8de39fa443d337fdf29bcab05b537fc905951dd84e13265cb37583493e0ca"} Oct 09 08:16:33 crc kubenswrapper[4715]: I1009 08:16:33.289211 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fc8de39fa443d337fdf29bcab05b537fc905951dd84e13265cb37583493e0ca" Oct 09 08:16:33 crc kubenswrapper[4715]: I1009 08:16:33.289220 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-dpkqg" Oct 09 08:16:33 crc kubenswrapper[4715]: I1009 08:16:33.362917 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-qhl65"] Oct 09 08:16:33 crc kubenswrapper[4715]: E1009 08:16:33.363302 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0986c120-0684-4916-b753-677c2d3e6798" containerName="ssh-known-hosts-edpm-deployment" Oct 09 08:16:33 crc kubenswrapper[4715]: I1009 08:16:33.363328 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="0986c120-0684-4916-b753-677c2d3e6798" containerName="ssh-known-hosts-edpm-deployment" Oct 09 08:16:33 crc kubenswrapper[4715]: I1009 08:16:33.363522 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="0986c120-0684-4916-b753-677c2d3e6798" containerName="ssh-known-hosts-edpm-deployment" Oct 09 08:16:33 crc kubenswrapper[4715]: I1009 08:16:33.364142 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qhl65" Oct 09 08:16:33 crc kubenswrapper[4715]: I1009 08:16:33.373135 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 08:16:33 crc kubenswrapper[4715]: I1009 08:16:33.373156 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 08:16:33 crc kubenswrapper[4715]: I1009 08:16:33.373250 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fjb" Oct 09 08:16:33 crc kubenswrapper[4715]: I1009 08:16:33.373497 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 08:16:33 crc kubenswrapper[4715]: I1009 08:16:33.373945 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-qhl65"] Oct 09 08:16:33 crc kubenswrapper[4715]: I1009 08:16:33.499685 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/806359bf-f132-4db6-9795-4e180be1895a-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qhl65\" (UID: \"806359bf-f132-4db6-9795-4e180be1895a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qhl65" Oct 09 08:16:33 crc kubenswrapper[4715]: I1009 08:16:33.500099 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/806359bf-f132-4db6-9795-4e180be1895a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qhl65\" (UID: \"806359bf-f132-4db6-9795-4e180be1895a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qhl65" Oct 09 08:16:33 crc kubenswrapper[4715]: I1009 08:16:33.500224 4715 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnbj9\" (UniqueName: \"kubernetes.io/projected/806359bf-f132-4db6-9795-4e180be1895a-kube-api-access-xnbj9\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qhl65\" (UID: \"806359bf-f132-4db6-9795-4e180be1895a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qhl65" Oct 09 08:16:33 crc kubenswrapper[4715]: I1009 08:16:33.602505 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/806359bf-f132-4db6-9795-4e180be1895a-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qhl65\" (UID: \"806359bf-f132-4db6-9795-4e180be1895a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qhl65" Oct 09 08:16:33 crc kubenswrapper[4715]: I1009 08:16:33.602655 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/806359bf-f132-4db6-9795-4e180be1895a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qhl65\" (UID: \"806359bf-f132-4db6-9795-4e180be1895a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qhl65" Oct 09 08:16:33 crc kubenswrapper[4715]: I1009 08:16:33.602689 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnbj9\" (UniqueName: \"kubernetes.io/projected/806359bf-f132-4db6-9795-4e180be1895a-kube-api-access-xnbj9\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qhl65\" (UID: \"806359bf-f132-4db6-9795-4e180be1895a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qhl65" Oct 09 08:16:33 crc kubenswrapper[4715]: I1009 08:16:33.607357 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/806359bf-f132-4db6-9795-4e180be1895a-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qhl65\" (UID: \"806359bf-f132-4db6-9795-4e180be1895a\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qhl65" Oct 09 08:16:33 crc kubenswrapper[4715]: I1009 08:16:33.607394 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/806359bf-f132-4db6-9795-4e180be1895a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qhl65\" (UID: \"806359bf-f132-4db6-9795-4e180be1895a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qhl65" Oct 09 08:16:33 crc kubenswrapper[4715]: I1009 08:16:33.620632 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnbj9\" (UniqueName: \"kubernetes.io/projected/806359bf-f132-4db6-9795-4e180be1895a-kube-api-access-xnbj9\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-qhl65\" (UID: \"806359bf-f132-4db6-9795-4e180be1895a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qhl65" Oct 09 08:16:33 crc kubenswrapper[4715]: I1009 08:16:33.682617 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qhl65" Oct 09 08:16:34 crc kubenswrapper[4715]: I1009 08:16:34.207981 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-qhl65"] Oct 09 08:16:34 crc kubenswrapper[4715]: I1009 08:16:34.297908 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qhl65" event={"ID":"806359bf-f132-4db6-9795-4e180be1895a","Type":"ContainerStarted","Data":"074344e6504baae635405aee71ee1732860d1b8e9981ec551715a5afa56f028b"} Oct 09 08:16:35 crc kubenswrapper[4715]: I1009 08:16:35.310997 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qhl65" event={"ID":"806359bf-f132-4db6-9795-4e180be1895a","Type":"ContainerStarted","Data":"8cba4d0da8fcae5a00141a3efcfd2c2af9cbb6d65095d3f96c6875ac4e18539e"} Oct 09 08:16:35 crc kubenswrapper[4715]: I1009 08:16:35.343692 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qhl65" podStartSLOduration=1.706433892 podStartE2EDuration="2.343661316s" podCreationTimestamp="2025-10-09 08:16:33 +0000 UTC" firstStartedPulling="2025-10-09 08:16:34.214504396 +0000 UTC m=+1824.907308404" lastFinishedPulling="2025-10-09 08:16:34.85173182 +0000 UTC m=+1825.544535828" observedRunningTime="2025-10-09 08:16:35.342841353 +0000 UTC m=+1826.035645391" watchObservedRunningTime="2025-10-09 08:16:35.343661316 +0000 UTC m=+1826.036465324" Oct 09 08:16:36 crc kubenswrapper[4715]: I1009 08:16:36.136986 4715 scope.go:117] "RemoveContainer" containerID="603ae8e76a989f73d5fee395ce2ebf8db256706e70cbcec215884dc9ad047a0c" Oct 09 08:16:36 crc kubenswrapper[4715]: E1009 08:16:36.137580 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:16:43 crc kubenswrapper[4715]: I1009 08:16:43.382199 4715 generic.go:334] "Generic (PLEG): container finished" podID="806359bf-f132-4db6-9795-4e180be1895a" containerID="8cba4d0da8fcae5a00141a3efcfd2c2af9cbb6d65095d3f96c6875ac4e18539e" exitCode=0 Oct 09 08:16:43 crc kubenswrapper[4715]: I1009 08:16:43.382334 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qhl65" event={"ID":"806359bf-f132-4db6-9795-4e180be1895a","Type":"ContainerDied","Data":"8cba4d0da8fcae5a00141a3efcfd2c2af9cbb6d65095d3f96c6875ac4e18539e"} Oct 09 08:16:44 crc kubenswrapper[4715]: I1009 08:16:44.792892 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qhl65" Oct 09 08:16:44 crc kubenswrapper[4715]: I1009 08:16:44.928787 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnbj9\" (UniqueName: \"kubernetes.io/projected/806359bf-f132-4db6-9795-4e180be1895a-kube-api-access-xnbj9\") pod \"806359bf-f132-4db6-9795-4e180be1895a\" (UID: \"806359bf-f132-4db6-9795-4e180be1895a\") " Oct 09 08:16:44 crc kubenswrapper[4715]: I1009 08:16:44.928851 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/806359bf-f132-4db6-9795-4e180be1895a-ssh-key\") pod \"806359bf-f132-4db6-9795-4e180be1895a\" (UID: \"806359bf-f132-4db6-9795-4e180be1895a\") " Oct 09 08:16:44 crc kubenswrapper[4715]: I1009 08:16:44.929025 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/806359bf-f132-4db6-9795-4e180be1895a-inventory\") pod \"806359bf-f132-4db6-9795-4e180be1895a\" (UID: \"806359bf-f132-4db6-9795-4e180be1895a\") " Oct 09 08:16:44 crc kubenswrapper[4715]: I1009 08:16:44.941664 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/806359bf-f132-4db6-9795-4e180be1895a-kube-api-access-xnbj9" (OuterVolumeSpecName: "kube-api-access-xnbj9") pod "806359bf-f132-4db6-9795-4e180be1895a" (UID: "806359bf-f132-4db6-9795-4e180be1895a"). InnerVolumeSpecName "kube-api-access-xnbj9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:16:44 crc kubenswrapper[4715]: I1009 08:16:44.960648 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/806359bf-f132-4db6-9795-4e180be1895a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "806359bf-f132-4db6-9795-4e180be1895a" (UID: "806359bf-f132-4db6-9795-4e180be1895a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:16:44 crc kubenswrapper[4715]: I1009 08:16:44.967218 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/806359bf-f132-4db6-9795-4e180be1895a-inventory" (OuterVolumeSpecName: "inventory") pod "806359bf-f132-4db6-9795-4e180be1895a" (UID: "806359bf-f132-4db6-9795-4e180be1895a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:16:45 crc kubenswrapper[4715]: I1009 08:16:45.031126 4715 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/806359bf-f132-4db6-9795-4e180be1895a-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 08:16:45 crc kubenswrapper[4715]: I1009 08:16:45.031172 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnbj9\" (UniqueName: \"kubernetes.io/projected/806359bf-f132-4db6-9795-4e180be1895a-kube-api-access-xnbj9\") on node \"crc\" DevicePath \"\"" Oct 09 08:16:45 crc kubenswrapper[4715]: I1009 08:16:45.031186 4715 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/806359bf-f132-4db6-9795-4e180be1895a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 08:16:45 crc kubenswrapper[4715]: I1009 08:16:45.404954 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qhl65" event={"ID":"806359bf-f132-4db6-9795-4e180be1895a","Type":"ContainerDied","Data":"074344e6504baae635405aee71ee1732860d1b8e9981ec551715a5afa56f028b"} Oct 09 08:16:45 crc kubenswrapper[4715]: I1009 08:16:45.405012 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="074344e6504baae635405aee71ee1732860d1b8e9981ec551715a5afa56f028b" Oct 09 08:16:45 crc kubenswrapper[4715]: I1009 08:16:45.405093 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-qhl65" Oct 09 08:16:45 crc kubenswrapper[4715]: I1009 08:16:45.532019 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2r9hc"] Oct 09 08:16:45 crc kubenswrapper[4715]: E1009 08:16:45.532703 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="806359bf-f132-4db6-9795-4e180be1895a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 09 08:16:45 crc kubenswrapper[4715]: I1009 08:16:45.532734 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="806359bf-f132-4db6-9795-4e180be1895a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 09 08:16:45 crc kubenswrapper[4715]: I1009 08:16:45.532970 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="806359bf-f132-4db6-9795-4e180be1895a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 09 08:16:45 crc kubenswrapper[4715]: I1009 08:16:45.533848 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2r9hc" Oct 09 08:16:45 crc kubenswrapper[4715]: I1009 08:16:45.536739 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fjb" Oct 09 08:16:45 crc kubenswrapper[4715]: I1009 08:16:45.537913 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 08:16:45 crc kubenswrapper[4715]: I1009 08:16:45.539089 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 08:16:45 crc kubenswrapper[4715]: I1009 08:16:45.540690 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f0ea0eb1-5091-4178-8ae8-a39cf494915d-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2r9hc\" (UID: \"f0ea0eb1-5091-4178-8ae8-a39cf494915d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2r9hc" Oct 09 08:16:45 crc kubenswrapper[4715]: I1009 08:16:45.540746 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b88jf\" (UniqueName: \"kubernetes.io/projected/f0ea0eb1-5091-4178-8ae8-a39cf494915d-kube-api-access-b88jf\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2r9hc\" (UID: \"f0ea0eb1-5091-4178-8ae8-a39cf494915d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2r9hc" Oct 09 08:16:45 crc kubenswrapper[4715]: I1009 08:16:45.540812 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f0ea0eb1-5091-4178-8ae8-a39cf494915d-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2r9hc\" (UID: \"f0ea0eb1-5091-4178-8ae8-a39cf494915d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2r9hc" Oct 09 
08:16:45 crc kubenswrapper[4715]: I1009 08:16:45.543641 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 08:16:45 crc kubenswrapper[4715]: I1009 08:16:45.561728 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2r9hc"] Oct 09 08:16:45 crc kubenswrapper[4715]: I1009 08:16:45.641985 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f0ea0eb1-5091-4178-8ae8-a39cf494915d-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2r9hc\" (UID: \"f0ea0eb1-5091-4178-8ae8-a39cf494915d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2r9hc" Oct 09 08:16:45 crc kubenswrapper[4715]: I1009 08:16:45.642046 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b88jf\" (UniqueName: \"kubernetes.io/projected/f0ea0eb1-5091-4178-8ae8-a39cf494915d-kube-api-access-b88jf\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2r9hc\" (UID: \"f0ea0eb1-5091-4178-8ae8-a39cf494915d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2r9hc" Oct 09 08:16:45 crc kubenswrapper[4715]: I1009 08:16:45.642123 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f0ea0eb1-5091-4178-8ae8-a39cf494915d-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2r9hc\" (UID: \"f0ea0eb1-5091-4178-8ae8-a39cf494915d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2r9hc" Oct 09 08:16:45 crc kubenswrapper[4715]: I1009 08:16:45.646846 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f0ea0eb1-5091-4178-8ae8-a39cf494915d-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2r9hc\" (UID: \"f0ea0eb1-5091-4178-8ae8-a39cf494915d\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2r9hc" Oct 09 08:16:45 crc kubenswrapper[4715]: E1009 08:16:45.652750 4715 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod806359bf_f132_4db6_9795_4e180be1895a.slice/crio-074344e6504baae635405aee71ee1732860d1b8e9981ec551715a5afa56f028b\": RecentStats: unable to find data in memory cache]" Oct 09 08:16:45 crc kubenswrapper[4715]: I1009 08:16:45.663584 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f0ea0eb1-5091-4178-8ae8-a39cf494915d-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2r9hc\" (UID: \"f0ea0eb1-5091-4178-8ae8-a39cf494915d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2r9hc" Oct 09 08:16:45 crc kubenswrapper[4715]: I1009 08:16:45.664074 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b88jf\" (UniqueName: \"kubernetes.io/projected/f0ea0eb1-5091-4178-8ae8-a39cf494915d-kube-api-access-b88jf\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2r9hc\" (UID: \"f0ea0eb1-5091-4178-8ae8-a39cf494915d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2r9hc" Oct 09 08:16:45 crc kubenswrapper[4715]: I1009 08:16:45.872704 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2r9hc" Oct 09 08:16:46 crc kubenswrapper[4715]: I1009 08:16:46.396236 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2r9hc"] Oct 09 08:16:46 crc kubenswrapper[4715]: I1009 08:16:46.414750 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2r9hc" event={"ID":"f0ea0eb1-5091-4178-8ae8-a39cf494915d","Type":"ContainerStarted","Data":"59d77713e6acf124cb7e2e030f61a374ecb10b82cb7520608a2f2b6ee565451f"} Oct 09 08:16:47 crc kubenswrapper[4715]: I1009 08:16:47.424547 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2r9hc" event={"ID":"f0ea0eb1-5091-4178-8ae8-a39cf494915d","Type":"ContainerStarted","Data":"0af32d570c80eb7423c8ab62a4591641530283ccec3bc597eec9abd52be8a221"} Oct 09 08:16:47 crc kubenswrapper[4715]: I1009 08:16:47.453934 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2r9hc" podStartSLOduration=1.9759269590000001 podStartE2EDuration="2.452401853s" podCreationTimestamp="2025-10-09 08:16:45 +0000 UTC" firstStartedPulling="2025-10-09 08:16:46.40331212 +0000 UTC m=+1837.096116138" lastFinishedPulling="2025-10-09 08:16:46.879787024 +0000 UTC m=+1837.572591032" observedRunningTime="2025-10-09 08:16:47.446758012 +0000 UTC m=+1838.139562020" watchObservedRunningTime="2025-10-09 08:16:47.452401853 +0000 UTC m=+1838.145205871" Oct 09 08:16:50 crc kubenswrapper[4715]: I1009 08:16:50.146067 4715 scope.go:117] "RemoveContainer" containerID="603ae8e76a989f73d5fee395ce2ebf8db256706e70cbcec215884dc9ad047a0c" Oct 09 08:16:50 crc kubenswrapper[4715]: E1009 08:16:50.147185 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:17:05 crc kubenswrapper[4715]: I1009 08:17:05.136563 4715 scope.go:117] "RemoveContainer" containerID="603ae8e76a989f73d5fee395ce2ebf8db256706e70cbcec215884dc9ad047a0c" Oct 09 08:17:05 crc kubenswrapper[4715]: E1009 08:17:05.137495 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:17:13 crc kubenswrapper[4715]: I1009 08:17:13.937247 4715 scope.go:117] "RemoveContainer" containerID="67d55c17eb96658a6efaaf2f7731c75bd85723bfee29bd352cdfe0b8a72ce9ff" Oct 09 08:17:17 crc kubenswrapper[4715]: I1009 08:17:17.137626 4715 scope.go:117] "RemoveContainer" containerID="603ae8e76a989f73d5fee395ce2ebf8db256706e70cbcec215884dc9ad047a0c" Oct 09 08:17:17 crc kubenswrapper[4715]: E1009 08:17:17.138836 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:17:31 crc kubenswrapper[4715]: I1009 08:17:31.137129 4715 scope.go:117] "RemoveContainer" containerID="603ae8e76a989f73d5fee395ce2ebf8db256706e70cbcec215884dc9ad047a0c" Oct 09 
08:17:31 crc kubenswrapper[4715]: E1009 08:17:31.138338 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:17:42 crc kubenswrapper[4715]: I1009 08:17:42.136632 4715 scope.go:117] "RemoveContainer" containerID="603ae8e76a989f73d5fee395ce2ebf8db256706e70cbcec215884dc9ad047a0c" Oct 09 08:17:42 crc kubenswrapper[4715]: E1009 08:17:42.138891 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:17:55 crc kubenswrapper[4715]: I1009 08:17:55.137539 4715 scope.go:117] "RemoveContainer" containerID="603ae8e76a989f73d5fee395ce2ebf8db256706e70cbcec215884dc9ad047a0c" Oct 09 08:17:55 crc kubenswrapper[4715]: E1009 08:17:55.139640 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:18:02 crc kubenswrapper[4715]: I1009 08:18:02.109023 4715 generic.go:334] "Generic (PLEG): container finished" podID="f0ea0eb1-5091-4178-8ae8-a39cf494915d" 
containerID="0af32d570c80eb7423c8ab62a4591641530283ccec3bc597eec9abd52be8a221" exitCode=0 Oct 09 08:18:02 crc kubenswrapper[4715]: I1009 08:18:02.109064 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2r9hc" event={"ID":"f0ea0eb1-5091-4178-8ae8-a39cf494915d","Type":"ContainerDied","Data":"0af32d570c80eb7423c8ab62a4591641530283ccec3bc597eec9abd52be8a221"} Oct 09 08:18:03 crc kubenswrapper[4715]: I1009 08:18:03.502614 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2r9hc" Oct 09 08:18:03 crc kubenswrapper[4715]: I1009 08:18:03.648855 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f0ea0eb1-5091-4178-8ae8-a39cf494915d-ssh-key\") pod \"f0ea0eb1-5091-4178-8ae8-a39cf494915d\" (UID: \"f0ea0eb1-5091-4178-8ae8-a39cf494915d\") " Oct 09 08:18:03 crc kubenswrapper[4715]: I1009 08:18:03.648942 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f0ea0eb1-5091-4178-8ae8-a39cf494915d-inventory\") pod \"f0ea0eb1-5091-4178-8ae8-a39cf494915d\" (UID: \"f0ea0eb1-5091-4178-8ae8-a39cf494915d\") " Oct 09 08:18:03 crc kubenswrapper[4715]: I1009 08:18:03.649072 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b88jf\" (UniqueName: \"kubernetes.io/projected/f0ea0eb1-5091-4178-8ae8-a39cf494915d-kube-api-access-b88jf\") pod \"f0ea0eb1-5091-4178-8ae8-a39cf494915d\" (UID: \"f0ea0eb1-5091-4178-8ae8-a39cf494915d\") " Oct 09 08:18:03 crc kubenswrapper[4715]: I1009 08:18:03.655869 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0ea0eb1-5091-4178-8ae8-a39cf494915d-kube-api-access-b88jf" (OuterVolumeSpecName: "kube-api-access-b88jf") pod "f0ea0eb1-5091-4178-8ae8-a39cf494915d" 
(UID: "f0ea0eb1-5091-4178-8ae8-a39cf494915d"). InnerVolumeSpecName "kube-api-access-b88jf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:18:03 crc kubenswrapper[4715]: I1009 08:18:03.680169 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0ea0eb1-5091-4178-8ae8-a39cf494915d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f0ea0eb1-5091-4178-8ae8-a39cf494915d" (UID: "f0ea0eb1-5091-4178-8ae8-a39cf494915d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:18:03 crc kubenswrapper[4715]: I1009 08:18:03.705889 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0ea0eb1-5091-4178-8ae8-a39cf494915d-inventory" (OuterVolumeSpecName: "inventory") pod "f0ea0eb1-5091-4178-8ae8-a39cf494915d" (UID: "f0ea0eb1-5091-4178-8ae8-a39cf494915d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:18:03 crc kubenswrapper[4715]: I1009 08:18:03.757177 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b88jf\" (UniqueName: \"kubernetes.io/projected/f0ea0eb1-5091-4178-8ae8-a39cf494915d-kube-api-access-b88jf\") on node \"crc\" DevicePath \"\"" Oct 09 08:18:03 crc kubenswrapper[4715]: I1009 08:18:03.757219 4715 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f0ea0eb1-5091-4178-8ae8-a39cf494915d-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 08:18:03 crc kubenswrapper[4715]: I1009 08:18:03.757228 4715 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f0ea0eb1-5091-4178-8ae8-a39cf494915d-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.131146 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2r9hc" 
event={"ID":"f0ea0eb1-5091-4178-8ae8-a39cf494915d","Type":"ContainerDied","Data":"59d77713e6acf124cb7e2e030f61a374ecb10b82cb7520608a2f2b6ee565451f"} Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.131224 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59d77713e6acf124cb7e2e030f61a374ecb10b82cb7520608a2f2b6ee565451f" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.131279 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2r9hc" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.259791 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz"] Oct 09 08:18:04 crc kubenswrapper[4715]: E1009 08:18:04.260145 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0ea0eb1-5091-4178-8ae8-a39cf494915d" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.260163 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0ea0eb1-5091-4178-8ae8-a39cf494915d" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.260381 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0ea0eb1-5091-4178-8ae8-a39cf494915d" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.261105 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.268707 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.268984 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.269164 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fjb" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.269307 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.269823 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.269952 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.270065 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.270169 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.272064 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz"] Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.368188 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.368254 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/34e4a21c-b0d6-448f-9fb9-42f65187fad8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.368368 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/34e4a21c-b0d6-448f-9fb9-42f65187fad8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.368391 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/34e4a21c-b0d6-448f-9fb9-42f65187fad8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.368440 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.368473 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.368538 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.368604 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.368688 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.368710 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.368732 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/34e4a21c-b0d6-448f-9fb9-42f65187fad8-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.368754 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.368773 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgglb\" (UniqueName: 
\"kubernetes.io/projected/34e4a21c-b0d6-448f-9fb9-42f65187fad8-kube-api-access-fgglb\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.368798 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.470590 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.470642 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/34e4a21c-b0d6-448f-9fb9-42f65187fad8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.470679 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/34e4a21c-b0d6-448f-9fb9-42f65187fad8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.470703 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/34e4a21c-b0d6-448f-9fb9-42f65187fad8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.470744 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.470784 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.470842 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.470875 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.470943 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.470972 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.470996 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/34e4a21c-b0d6-448f-9fb9-42f65187fad8-openstack-edpm-ipam-telemetry-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.471027 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.471052 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgglb\" (UniqueName: \"kubernetes.io/projected/34e4a21c-b0d6-448f-9fb9-42f65187fad8-kube-api-access-fgglb\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.471085 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.476251 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:04 crc kubenswrapper[4715]: 
I1009 08:18:04.476383 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.476558 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.476635 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/34e4a21c-b0d6-448f-9fb9-42f65187fad8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.477018 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/34e4a21c-b0d6-448f-9fb9-42f65187fad8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.477517 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.477735 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.478204 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.479474 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.479586 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/34e4a21c-b0d6-448f-9fb9-42f65187fad8-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.482239 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/34e4a21c-b0d6-448f-9fb9-42f65187fad8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.482238 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.492706 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.493255 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgglb\" (UniqueName: \"kubernetes.io/projected/34e4a21c-b0d6-448f-9fb9-42f65187fad8-kube-api-access-fgglb\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:04 crc kubenswrapper[4715]: I1009 08:18:04.589093 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:05 crc kubenswrapper[4715]: I1009 08:18:05.095250 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz"] Oct 09 08:18:05 crc kubenswrapper[4715]: I1009 08:18:05.158602 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" event={"ID":"34e4a21c-b0d6-448f-9fb9-42f65187fad8","Type":"ContainerStarted","Data":"7941b874cbae4e56bd5c2b12d7523521131591904f12e5f903173cb7aafbce77"} Oct 09 08:18:06 crc kubenswrapper[4715]: I1009 08:18:06.168795 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" event={"ID":"34e4a21c-b0d6-448f-9fb9-42f65187fad8","Type":"ContainerStarted","Data":"4531196f323c9cf34c763aab886cc66f8bde779bcbfd390de4c1743abf9916dc"} Oct 09 08:18:06 crc kubenswrapper[4715]: I1009 08:18:06.192450 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" podStartSLOduration=1.7619692200000001 podStartE2EDuration="2.192411337s" podCreationTimestamp="2025-10-09 08:18:04 +0000 UTC" firstStartedPulling="2025-10-09 08:18:05.100888266 +0000 UTC m=+1915.793692274" lastFinishedPulling="2025-10-09 08:18:05.531330383 +0000 UTC m=+1916.224134391" observedRunningTime="2025-10-09 08:18:06.190600905 +0000 UTC m=+1916.883404913" watchObservedRunningTime="2025-10-09 08:18:06.192411337 +0000 UTC m=+1916.885215355" Oct 09 08:18:08 crc kubenswrapper[4715]: I1009 08:18:08.136649 4715 
scope.go:117] "RemoveContainer" containerID="603ae8e76a989f73d5fee395ce2ebf8db256706e70cbcec215884dc9ad047a0c" Oct 09 08:18:08 crc kubenswrapper[4715]: E1009 08:18:08.137233 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:18:22 crc kubenswrapper[4715]: I1009 08:18:22.137354 4715 scope.go:117] "RemoveContainer" containerID="603ae8e76a989f73d5fee395ce2ebf8db256706e70cbcec215884dc9ad047a0c" Oct 09 08:18:22 crc kubenswrapper[4715]: E1009 08:18:22.138050 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:18:34 crc kubenswrapper[4715]: I1009 08:18:34.137564 4715 scope.go:117] "RemoveContainer" containerID="603ae8e76a989f73d5fee395ce2ebf8db256706e70cbcec215884dc9ad047a0c" Oct 09 08:18:34 crc kubenswrapper[4715]: E1009 08:18:34.140485 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:18:44 crc kubenswrapper[4715]: I1009 
08:18:44.498347 4715 generic.go:334] "Generic (PLEG): container finished" podID="34e4a21c-b0d6-448f-9fb9-42f65187fad8" containerID="4531196f323c9cf34c763aab886cc66f8bde779bcbfd390de4c1743abf9916dc" exitCode=0 Oct 09 08:18:44 crc kubenswrapper[4715]: I1009 08:18:44.498484 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" event={"ID":"34e4a21c-b0d6-448f-9fb9-42f65187fad8","Type":"ContainerDied","Data":"4531196f323c9cf34c763aab886cc66f8bde779bcbfd390de4c1743abf9916dc"} Oct 09 08:18:45 crc kubenswrapper[4715]: I1009 08:18:45.927991 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.118781 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/34e4a21c-b0d6-448f-9fb9-42f65187fad8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.118957 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-neutron-metadata-combined-ca-bundle\") pod \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.119045 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-nova-combined-ca-bundle\") pod \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 
08:18:46.119099 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-inventory\") pod \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.119179 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-telemetry-combined-ca-bundle\") pod \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.119304 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-libvirt-combined-ca-bundle\") pod \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.119410 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/34e4a21c-b0d6-448f-9fb9-42f65187fad8-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.119525 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-ssh-key\") pod \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.119577 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgglb\" (UniqueName: 
\"kubernetes.io/projected/34e4a21c-b0d6-448f-9fb9-42f65187fad8-kube-api-access-fgglb\") pod \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.119642 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/34e4a21c-b0d6-448f-9fb9-42f65187fad8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.119695 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-repo-setup-combined-ca-bundle\") pod \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.119744 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-bootstrap-combined-ca-bundle\") pod \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.119797 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/34e4a21c-b0d6-448f-9fb9-42f65187fad8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.119916 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-ovn-combined-ca-bundle\") pod \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\" (UID: \"34e4a21c-b0d6-448f-9fb9-42f65187fad8\") " Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.125333 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "34e4a21c-b0d6-448f-9fb9-42f65187fad8" (UID: "34e4a21c-b0d6-448f-9fb9-42f65187fad8"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.126413 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "34e4a21c-b0d6-448f-9fb9-42f65187fad8" (UID: "34e4a21c-b0d6-448f-9fb9-42f65187fad8"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.126574 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34e4a21c-b0d6-448f-9fb9-42f65187fad8-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "34e4a21c-b0d6-448f-9fb9-42f65187fad8" (UID: "34e4a21c-b0d6-448f-9fb9-42f65187fad8"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.127372 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "34e4a21c-b0d6-448f-9fb9-42f65187fad8" (UID: "34e4a21c-b0d6-448f-9fb9-42f65187fad8"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.128623 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "34e4a21c-b0d6-448f-9fb9-42f65187fad8" (UID: "34e4a21c-b0d6-448f-9fb9-42f65187fad8"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.129381 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "34e4a21c-b0d6-448f-9fb9-42f65187fad8" (UID: "34e4a21c-b0d6-448f-9fb9-42f65187fad8"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.130147 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "34e4a21c-b0d6-448f-9fb9-42f65187fad8" (UID: "34e4a21c-b0d6-448f-9fb9-42f65187fad8"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.130620 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34e4a21c-b0d6-448f-9fb9-42f65187fad8-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "34e4a21c-b0d6-448f-9fb9-42f65187fad8" (UID: "34e4a21c-b0d6-448f-9fb9-42f65187fad8"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.130829 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34e4a21c-b0d6-448f-9fb9-42f65187fad8-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "34e4a21c-b0d6-448f-9fb9-42f65187fad8" (UID: "34e4a21c-b0d6-448f-9fb9-42f65187fad8"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.130948 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34e4a21c-b0d6-448f-9fb9-42f65187fad8-kube-api-access-fgglb" (OuterVolumeSpecName: "kube-api-access-fgglb") pod "34e4a21c-b0d6-448f-9fb9-42f65187fad8" (UID: "34e4a21c-b0d6-448f-9fb9-42f65187fad8"). InnerVolumeSpecName "kube-api-access-fgglb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.131368 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34e4a21c-b0d6-448f-9fb9-42f65187fad8-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "34e4a21c-b0d6-448f-9fb9-42f65187fad8" (UID: "34e4a21c-b0d6-448f-9fb9-42f65187fad8"). 
InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.133955 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "34e4a21c-b0d6-448f-9fb9-42f65187fad8" (UID: "34e4a21c-b0d6-448f-9fb9-42f65187fad8"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.171915 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-inventory" (OuterVolumeSpecName: "inventory") pod "34e4a21c-b0d6-448f-9fb9-42f65187fad8" (UID: "34e4a21c-b0d6-448f-9fb9-42f65187fad8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.174839 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "34e4a21c-b0d6-448f-9fb9-42f65187fad8" (UID: "34e4a21c-b0d6-448f-9fb9-42f65187fad8"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.222028 4715 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.222072 4715 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/34e4a21c-b0d6-448f-9fb9-42f65187fad8-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.222090 4715 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.222100 4715 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.222112 4715 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.222123 4715 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.222134 4715 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.222144 4715 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/34e4a21c-b0d6-448f-9fb9-42f65187fad8-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.222157 4715 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.222168 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgglb\" (UniqueName: \"kubernetes.io/projected/34e4a21c-b0d6-448f-9fb9-42f65187fad8-kube-api-access-fgglb\") on node \"crc\" DevicePath \"\"" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.222181 4715 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/34e4a21c-b0d6-448f-9fb9-42f65187fad8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.222194 4715 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.222206 4715 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34e4a21c-b0d6-448f-9fb9-42f65187fad8-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.222218 4715 
reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/34e4a21c-b0d6-448f-9fb9-42f65187fad8-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.515880 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" event={"ID":"34e4a21c-b0d6-448f-9fb9-42f65187fad8","Type":"ContainerDied","Data":"7941b874cbae4e56bd5c2b12d7523521131591904f12e5f903173cb7aafbce77"} Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.516144 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7941b874cbae4e56bd5c2b12d7523521131591904f12e5f903173cb7aafbce77" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.516208 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.604776 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-czrvv"] Oct 09 08:18:46 crc kubenswrapper[4715]: E1009 08:18:46.605222 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34e4a21c-b0d6-448f-9fb9-42f65187fad8" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.605245 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="34e4a21c-b0d6-448f-9fb9-42f65187fad8" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.605519 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="34e4a21c-b0d6-448f-9fb9-42f65187fad8" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.606252 4715 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-czrvv" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.611621 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.611694 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.611850 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.611917 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fjb" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.612107 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.618770 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-czrvv"] Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.731325 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6388a30-80c6-412d-9c3d-9b555b215d76-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-czrvv\" (UID: \"a6388a30-80c6-412d-9c3d-9b555b215d76\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-czrvv" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.731380 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a6388a30-80c6-412d-9c3d-9b555b215d76-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-czrvv\" (UID: \"a6388a30-80c6-412d-9c3d-9b555b215d76\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-czrvv" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.731401 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a6388a30-80c6-412d-9c3d-9b555b215d76-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-czrvv\" (UID: \"a6388a30-80c6-412d-9c3d-9b555b215d76\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-czrvv" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.732163 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mdqd\" (UniqueName: \"kubernetes.io/projected/a6388a30-80c6-412d-9c3d-9b555b215d76-kube-api-access-4mdqd\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-czrvv\" (UID: \"a6388a30-80c6-412d-9c3d-9b555b215d76\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-czrvv" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.732269 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6388a30-80c6-412d-9c3d-9b555b215d76-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-czrvv\" (UID: \"a6388a30-80c6-412d-9c3d-9b555b215d76\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-czrvv" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.833691 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6388a30-80c6-412d-9c3d-9b555b215d76-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-czrvv\" (UID: \"a6388a30-80c6-412d-9c3d-9b555b215d76\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-czrvv" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.833818 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/a6388a30-80c6-412d-9c3d-9b555b215d76-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-czrvv\" (UID: \"a6388a30-80c6-412d-9c3d-9b555b215d76\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-czrvv" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.833850 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a6388a30-80c6-412d-9c3d-9b555b215d76-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-czrvv\" (UID: \"a6388a30-80c6-412d-9c3d-9b555b215d76\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-czrvv" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.833872 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a6388a30-80c6-412d-9c3d-9b555b215d76-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-czrvv\" (UID: \"a6388a30-80c6-412d-9c3d-9b555b215d76\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-czrvv" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.833968 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mdqd\" (UniqueName: \"kubernetes.io/projected/a6388a30-80c6-412d-9c3d-9b555b215d76-kube-api-access-4mdqd\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-czrvv\" (UID: \"a6388a30-80c6-412d-9c3d-9b555b215d76\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-czrvv" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.835446 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a6388a30-80c6-412d-9c3d-9b555b215d76-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-czrvv\" (UID: \"a6388a30-80c6-412d-9c3d-9b555b215d76\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-czrvv" Oct 09 08:18:46 crc 
kubenswrapper[4715]: I1009 08:18:46.840383 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6388a30-80c6-412d-9c3d-9b555b215d76-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-czrvv\" (UID: \"a6388a30-80c6-412d-9c3d-9b555b215d76\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-czrvv" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.840972 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6388a30-80c6-412d-9c3d-9b555b215d76-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-czrvv\" (UID: \"a6388a30-80c6-412d-9c3d-9b555b215d76\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-czrvv" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.844227 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a6388a30-80c6-412d-9c3d-9b555b215d76-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-czrvv\" (UID: \"a6388a30-80c6-412d-9c3d-9b555b215d76\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-czrvv" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.850268 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mdqd\" (UniqueName: \"kubernetes.io/projected/a6388a30-80c6-412d-9c3d-9b555b215d76-kube-api-access-4mdqd\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-czrvv\" (UID: \"a6388a30-80c6-412d-9c3d-9b555b215d76\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-czrvv" Oct 09 08:18:46 crc kubenswrapper[4715]: I1009 08:18:46.931367 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-czrvv" Oct 09 08:18:47 crc kubenswrapper[4715]: I1009 08:18:47.450284 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-czrvv"] Oct 09 08:18:47 crc kubenswrapper[4715]: W1009 08:18:47.453342 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6388a30_80c6_412d_9c3d_9b555b215d76.slice/crio-e1532ba4c7c61a7b4abd6d91e9dbf61c731a0d1a87ef6ae488165b339e778d07 WatchSource:0}: Error finding container e1532ba4c7c61a7b4abd6d91e9dbf61c731a0d1a87ef6ae488165b339e778d07: Status 404 returned error can't find the container with id e1532ba4c7c61a7b4abd6d91e9dbf61c731a0d1a87ef6ae488165b339e778d07 Oct 09 08:18:47 crc kubenswrapper[4715]: I1009 08:18:47.456607 4715 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 08:18:47 crc kubenswrapper[4715]: I1009 08:18:47.526535 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-czrvv" event={"ID":"a6388a30-80c6-412d-9c3d-9b555b215d76","Type":"ContainerStarted","Data":"e1532ba4c7c61a7b4abd6d91e9dbf61c731a0d1a87ef6ae488165b339e778d07"} Oct 09 08:18:48 crc kubenswrapper[4715]: I1009 08:18:48.552744 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-czrvv" event={"ID":"a6388a30-80c6-412d-9c3d-9b555b215d76","Type":"ContainerStarted","Data":"05f5ac44eb3bf7311a3454c59b2da8e7e11c3ca3fd1fe435235ab9dccf9006aa"} Oct 09 08:18:48 crc kubenswrapper[4715]: I1009 08:18:48.579804 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-czrvv" podStartSLOduration=2.032191463 podStartE2EDuration="2.5797856s" podCreationTimestamp="2025-10-09 08:18:46 +0000 UTC" firstStartedPulling="2025-10-09 
08:18:47.456338985 +0000 UTC m=+1958.149142993" lastFinishedPulling="2025-10-09 08:18:48.003933122 +0000 UTC m=+1958.696737130" observedRunningTime="2025-10-09 08:18:48.57315925 +0000 UTC m=+1959.265963268" watchObservedRunningTime="2025-10-09 08:18:48.5797856 +0000 UTC m=+1959.272589608" Oct 09 08:18:49 crc kubenswrapper[4715]: I1009 08:18:49.137719 4715 scope.go:117] "RemoveContainer" containerID="603ae8e76a989f73d5fee395ce2ebf8db256706e70cbcec215884dc9ad047a0c" Oct 09 08:18:49 crc kubenswrapper[4715]: I1009 08:18:49.564205 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" event={"ID":"acafd807-8875-4b4f-aba9-4f807ca336e7","Type":"ContainerStarted","Data":"015af87f013fa22d235ca7ed867a3852e276c3f820b05ec66959a6cef5662490"} Oct 09 08:19:51 crc kubenswrapper[4715]: I1009 08:19:51.112460 4715 generic.go:334] "Generic (PLEG): container finished" podID="a6388a30-80c6-412d-9c3d-9b555b215d76" containerID="05f5ac44eb3bf7311a3454c59b2da8e7e11c3ca3fd1fe435235ab9dccf9006aa" exitCode=0 Oct 09 08:19:51 crc kubenswrapper[4715]: I1009 08:19:51.112531 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-czrvv" event={"ID":"a6388a30-80c6-412d-9c3d-9b555b215d76","Type":"ContainerDied","Data":"05f5ac44eb3bf7311a3454c59b2da8e7e11c3ca3fd1fe435235ab9dccf9006aa"} Oct 09 08:19:52 crc kubenswrapper[4715]: I1009 08:19:52.536467 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-czrvv" Oct 09 08:19:52 crc kubenswrapper[4715]: I1009 08:19:52.575995 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6388a30-80c6-412d-9c3d-9b555b215d76-inventory\") pod \"a6388a30-80c6-412d-9c3d-9b555b215d76\" (UID: \"a6388a30-80c6-412d-9c3d-9b555b215d76\") " Oct 09 08:19:52 crc kubenswrapper[4715]: I1009 08:19:52.604571 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6388a30-80c6-412d-9c3d-9b555b215d76-inventory" (OuterVolumeSpecName: "inventory") pod "a6388a30-80c6-412d-9c3d-9b555b215d76" (UID: "a6388a30-80c6-412d-9c3d-9b555b215d76"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:19:52 crc kubenswrapper[4715]: I1009 08:19:52.677778 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a6388a30-80c6-412d-9c3d-9b555b215d76-ovncontroller-config-0\") pod \"a6388a30-80c6-412d-9c3d-9b555b215d76\" (UID: \"a6388a30-80c6-412d-9c3d-9b555b215d76\") " Oct 09 08:19:52 crc kubenswrapper[4715]: I1009 08:19:52.677817 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6388a30-80c6-412d-9c3d-9b555b215d76-ovn-combined-ca-bundle\") pod \"a6388a30-80c6-412d-9c3d-9b555b215d76\" (UID: \"a6388a30-80c6-412d-9c3d-9b555b215d76\") " Oct 09 08:19:52 crc kubenswrapper[4715]: I1009 08:19:52.677836 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a6388a30-80c6-412d-9c3d-9b555b215d76-ssh-key\") pod \"a6388a30-80c6-412d-9c3d-9b555b215d76\" (UID: \"a6388a30-80c6-412d-9c3d-9b555b215d76\") " Oct 09 08:19:52 crc kubenswrapper[4715]: I1009 08:19:52.677900 4715 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mdqd\" (UniqueName: \"kubernetes.io/projected/a6388a30-80c6-412d-9c3d-9b555b215d76-kube-api-access-4mdqd\") pod \"a6388a30-80c6-412d-9c3d-9b555b215d76\" (UID: \"a6388a30-80c6-412d-9c3d-9b555b215d76\") " Oct 09 08:19:52 crc kubenswrapper[4715]: I1009 08:19:52.678281 4715 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6388a30-80c6-412d-9c3d-9b555b215d76-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 08:19:52 crc kubenswrapper[4715]: I1009 08:19:52.681676 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6388a30-80c6-412d-9c3d-9b555b215d76-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "a6388a30-80c6-412d-9c3d-9b555b215d76" (UID: "a6388a30-80c6-412d-9c3d-9b555b215d76"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:19:52 crc kubenswrapper[4715]: I1009 08:19:52.682402 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6388a30-80c6-412d-9c3d-9b555b215d76-kube-api-access-4mdqd" (OuterVolumeSpecName: "kube-api-access-4mdqd") pod "a6388a30-80c6-412d-9c3d-9b555b215d76" (UID: "a6388a30-80c6-412d-9c3d-9b555b215d76"). InnerVolumeSpecName "kube-api-access-4mdqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:19:52 crc kubenswrapper[4715]: I1009 08:19:52.700272 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6388a30-80c6-412d-9c3d-9b555b215d76-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "a6388a30-80c6-412d-9c3d-9b555b215d76" (UID: "a6388a30-80c6-412d-9c3d-9b555b215d76"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:19:52 crc kubenswrapper[4715]: I1009 08:19:52.715147 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6388a30-80c6-412d-9c3d-9b555b215d76-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a6388a30-80c6-412d-9c3d-9b555b215d76" (UID: "a6388a30-80c6-412d-9c3d-9b555b215d76"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:19:52 crc kubenswrapper[4715]: I1009 08:19:52.780635 4715 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a6388a30-80c6-412d-9c3d-9b555b215d76-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 09 08:19:52 crc kubenswrapper[4715]: I1009 08:19:52.780702 4715 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6388a30-80c6-412d-9c3d-9b555b215d76-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:19:52 crc kubenswrapper[4715]: I1009 08:19:52.780729 4715 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a6388a30-80c6-412d-9c3d-9b555b215d76-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 08:19:52 crc kubenswrapper[4715]: I1009 08:19:52.780753 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mdqd\" (UniqueName: \"kubernetes.io/projected/a6388a30-80c6-412d-9c3d-9b555b215d76-kube-api-access-4mdqd\") on node \"crc\" DevicePath \"\"" Oct 09 08:19:53 crc kubenswrapper[4715]: I1009 08:19:53.135947 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-czrvv" Oct 09 08:19:53 crc kubenswrapper[4715]: I1009 08:19:53.135940 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-czrvv" event={"ID":"a6388a30-80c6-412d-9c3d-9b555b215d76","Type":"ContainerDied","Data":"e1532ba4c7c61a7b4abd6d91e9dbf61c731a0d1a87ef6ae488165b339e778d07"} Oct 09 08:19:53 crc kubenswrapper[4715]: I1009 08:19:53.136010 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1532ba4c7c61a7b4abd6d91e9dbf61c731a0d1a87ef6ae488165b339e778d07" Oct 09 08:19:53 crc kubenswrapper[4715]: I1009 08:19:53.222918 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq"] Oct 09 08:19:53 crc kubenswrapper[4715]: E1009 08:19:53.223281 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6388a30-80c6-412d-9c3d-9b555b215d76" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 09 08:19:53 crc kubenswrapper[4715]: I1009 08:19:53.223302 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6388a30-80c6-412d-9c3d-9b555b215d76" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 09 08:19:53 crc kubenswrapper[4715]: I1009 08:19:53.223579 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6388a30-80c6-412d-9c3d-9b555b215d76" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 09 08:19:53 crc kubenswrapper[4715]: I1009 08:19:53.224359 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq" Oct 09 08:19:53 crc kubenswrapper[4715]: I1009 08:19:53.232285 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Oct 09 08:19:53 crc kubenswrapper[4715]: I1009 08:19:53.232521 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 08:19:53 crc kubenswrapper[4715]: I1009 08:19:53.232625 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 08:19:53 crc kubenswrapper[4715]: I1009 08:19:53.232723 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 08:19:53 crc kubenswrapper[4715]: I1009 08:19:53.232829 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fjb" Oct 09 08:19:53 crc kubenswrapper[4715]: I1009 08:19:53.233111 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Oct 09 08:19:53 crc kubenswrapper[4715]: I1009 08:19:53.242028 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq"] Oct 09 08:19:53 crc kubenswrapper[4715]: I1009 08:19:53.289655 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b4c9589-4c0f-4f07-86d8-4573a0e80292-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq\" (UID: \"1b4c9589-4c0f-4f07-86d8-4573a0e80292\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq" Oct 09 08:19:53 crc kubenswrapper[4715]: I1009 08:19:53.289705 4715 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1b4c9589-4c0f-4f07-86d8-4573a0e80292-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq\" (UID: \"1b4c9589-4c0f-4f07-86d8-4573a0e80292\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq" Oct 09 08:19:53 crc kubenswrapper[4715]: I1009 08:19:53.289735 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1b4c9589-4c0f-4f07-86d8-4573a0e80292-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq\" (UID: \"1b4c9589-4c0f-4f07-86d8-4573a0e80292\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq" Oct 09 08:19:53 crc kubenswrapper[4715]: I1009 08:19:53.290007 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqtmh\" (UniqueName: \"kubernetes.io/projected/1b4c9589-4c0f-4f07-86d8-4573a0e80292-kube-api-access-sqtmh\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq\" (UID: \"1b4c9589-4c0f-4f07-86d8-4573a0e80292\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq" Oct 09 08:19:53 crc kubenswrapper[4715]: I1009 08:19:53.290162 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1b4c9589-4c0f-4f07-86d8-4573a0e80292-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq\" (UID: \"1b4c9589-4c0f-4f07-86d8-4573a0e80292\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq" Oct 09 08:19:53 crc kubenswrapper[4715]: I1009 08:19:53.290207 4715 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b4c9589-4c0f-4f07-86d8-4573a0e80292-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq\" (UID: \"1b4c9589-4c0f-4f07-86d8-4573a0e80292\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq" Oct 09 08:19:53 crc kubenswrapper[4715]: I1009 08:19:53.391067 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1b4c9589-4c0f-4f07-86d8-4573a0e80292-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq\" (UID: \"1b4c9589-4c0f-4f07-86d8-4573a0e80292\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq" Oct 09 08:19:53 crc kubenswrapper[4715]: I1009 08:19:53.391128 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b4c9589-4c0f-4f07-86d8-4573a0e80292-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq\" (UID: \"1b4c9589-4c0f-4f07-86d8-4573a0e80292\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq" Oct 09 08:19:53 crc kubenswrapper[4715]: I1009 08:19:53.391227 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b4c9589-4c0f-4f07-86d8-4573a0e80292-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq\" (UID: \"1b4c9589-4c0f-4f07-86d8-4573a0e80292\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq" Oct 09 08:19:53 crc kubenswrapper[4715]: I1009 08:19:53.391255 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/1b4c9589-4c0f-4f07-86d8-4573a0e80292-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq\" (UID: \"1b4c9589-4c0f-4f07-86d8-4573a0e80292\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq" Oct 09 08:19:53 crc kubenswrapper[4715]: I1009 08:19:53.391305 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1b4c9589-4c0f-4f07-86d8-4573a0e80292-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq\" (UID: \"1b4c9589-4c0f-4f07-86d8-4573a0e80292\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq" Oct 09 08:19:53 crc kubenswrapper[4715]: I1009 08:19:53.391389 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqtmh\" (UniqueName: \"kubernetes.io/projected/1b4c9589-4c0f-4f07-86d8-4573a0e80292-kube-api-access-sqtmh\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq\" (UID: \"1b4c9589-4c0f-4f07-86d8-4573a0e80292\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq" Oct 09 08:19:53 crc kubenswrapper[4715]: I1009 08:19:53.395700 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1b4c9589-4c0f-4f07-86d8-4573a0e80292-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq\" (UID: \"1b4c9589-4c0f-4f07-86d8-4573a0e80292\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq" Oct 09 08:19:53 crc kubenswrapper[4715]: I1009 08:19:53.395828 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b4c9589-4c0f-4f07-86d8-4573a0e80292-neutron-metadata-combined-ca-bundle\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq\" (UID: \"1b4c9589-4c0f-4f07-86d8-4573a0e80292\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq" Oct 09 08:19:53 crc kubenswrapper[4715]: I1009 08:19:53.396330 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1b4c9589-4c0f-4f07-86d8-4573a0e80292-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq\" (UID: \"1b4c9589-4c0f-4f07-86d8-4573a0e80292\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq" Oct 09 08:19:53 crc kubenswrapper[4715]: I1009 08:19:53.396446 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b4c9589-4c0f-4f07-86d8-4573a0e80292-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq\" (UID: \"1b4c9589-4c0f-4f07-86d8-4573a0e80292\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq" Oct 09 08:19:53 crc kubenswrapper[4715]: I1009 08:19:53.398702 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1b4c9589-4c0f-4f07-86d8-4573a0e80292-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq\" (UID: \"1b4c9589-4c0f-4f07-86d8-4573a0e80292\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq" Oct 09 08:19:53 crc kubenswrapper[4715]: I1009 08:19:53.416035 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqtmh\" (UniqueName: \"kubernetes.io/projected/1b4c9589-4c0f-4f07-86d8-4573a0e80292-kube-api-access-sqtmh\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq\" (UID: \"1b4c9589-4c0f-4f07-86d8-4573a0e80292\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq" Oct 09 
08:19:53 crc kubenswrapper[4715]: I1009 08:19:53.549738 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq" Oct 09 08:19:54 crc kubenswrapper[4715]: I1009 08:19:54.058031 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq"] Oct 09 08:19:54 crc kubenswrapper[4715]: I1009 08:19:54.148664 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq" event={"ID":"1b4c9589-4c0f-4f07-86d8-4573a0e80292","Type":"ContainerStarted","Data":"9fcc62bca87219599dbf6b1ee22ccb42309ec2d7ce35f72f0666923224636f39"} Oct 09 08:19:56 crc kubenswrapper[4715]: I1009 08:19:56.173607 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq" event={"ID":"1b4c9589-4c0f-4f07-86d8-4573a0e80292","Type":"ContainerStarted","Data":"4e29dc0c69186d3448bcf704af4276ee15b682776e8c4e4852316b384ebeb270"} Oct 09 08:19:56 crc kubenswrapper[4715]: I1009 08:19:56.196534 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq" podStartSLOduration=2.043108703 podStartE2EDuration="3.196514056s" podCreationTimestamp="2025-10-09 08:19:53 +0000 UTC" firstStartedPulling="2025-10-09 08:19:54.063867793 +0000 UTC m=+2024.756671801" lastFinishedPulling="2025-10-09 08:19:55.217273146 +0000 UTC m=+2025.910077154" observedRunningTime="2025-10-09 08:19:56.190874194 +0000 UTC m=+2026.883678202" watchObservedRunningTime="2025-10-09 08:19:56.196514056 +0000 UTC m=+2026.889318074" Oct 09 08:20:42 crc kubenswrapper[4715]: I1009 08:20:42.579000 4715 generic.go:334] "Generic (PLEG): container finished" podID="1b4c9589-4c0f-4f07-86d8-4573a0e80292" containerID="4e29dc0c69186d3448bcf704af4276ee15b682776e8c4e4852316b384ebeb270" 
exitCode=0 Oct 09 08:20:42 crc kubenswrapper[4715]: I1009 08:20:42.579115 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq" event={"ID":"1b4c9589-4c0f-4f07-86d8-4573a0e80292","Type":"ContainerDied","Data":"4e29dc0c69186d3448bcf704af4276ee15b682776e8c4e4852316b384ebeb270"} Oct 09 08:20:43 crc kubenswrapper[4715]: I1009 08:20:43.981660 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq" Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.054857 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqtmh\" (UniqueName: \"kubernetes.io/projected/1b4c9589-4c0f-4f07-86d8-4573a0e80292-kube-api-access-sqtmh\") pod \"1b4c9589-4c0f-4f07-86d8-4573a0e80292\" (UID: \"1b4c9589-4c0f-4f07-86d8-4573a0e80292\") " Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.055025 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1b4c9589-4c0f-4f07-86d8-4573a0e80292-ssh-key\") pod \"1b4c9589-4c0f-4f07-86d8-4573a0e80292\" (UID: \"1b4c9589-4c0f-4f07-86d8-4573a0e80292\") " Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.055187 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b4c9589-4c0f-4f07-86d8-4573a0e80292-inventory\") pod \"1b4c9589-4c0f-4f07-86d8-4573a0e80292\" (UID: \"1b4c9589-4c0f-4f07-86d8-4573a0e80292\") " Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.055217 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b4c9589-4c0f-4f07-86d8-4573a0e80292-neutron-metadata-combined-ca-bundle\") pod \"1b4c9589-4c0f-4f07-86d8-4573a0e80292\" (UID: 
\"1b4c9589-4c0f-4f07-86d8-4573a0e80292\") " Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.055243 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1b4c9589-4c0f-4f07-86d8-4573a0e80292-neutron-ovn-metadata-agent-neutron-config-0\") pod \"1b4c9589-4c0f-4f07-86d8-4573a0e80292\" (UID: \"1b4c9589-4c0f-4f07-86d8-4573a0e80292\") " Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.055285 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1b4c9589-4c0f-4f07-86d8-4573a0e80292-nova-metadata-neutron-config-0\") pod \"1b4c9589-4c0f-4f07-86d8-4573a0e80292\" (UID: \"1b4c9589-4c0f-4f07-86d8-4573a0e80292\") " Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.061579 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b4c9589-4c0f-4f07-86d8-4573a0e80292-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "1b4c9589-4c0f-4f07-86d8-4573a0e80292" (UID: "1b4c9589-4c0f-4f07-86d8-4573a0e80292"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.061625 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b4c9589-4c0f-4f07-86d8-4573a0e80292-kube-api-access-sqtmh" (OuterVolumeSpecName: "kube-api-access-sqtmh") pod "1b4c9589-4c0f-4f07-86d8-4573a0e80292" (UID: "1b4c9589-4c0f-4f07-86d8-4573a0e80292"). InnerVolumeSpecName "kube-api-access-sqtmh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.082082 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b4c9589-4c0f-4f07-86d8-4573a0e80292-inventory" (OuterVolumeSpecName: "inventory") pod "1b4c9589-4c0f-4f07-86d8-4573a0e80292" (UID: "1b4c9589-4c0f-4f07-86d8-4573a0e80292"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.084168 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b4c9589-4c0f-4f07-86d8-4573a0e80292-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "1b4c9589-4c0f-4f07-86d8-4573a0e80292" (UID: "1b4c9589-4c0f-4f07-86d8-4573a0e80292"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.089186 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b4c9589-4c0f-4f07-86d8-4573a0e80292-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "1b4c9589-4c0f-4f07-86d8-4573a0e80292" (UID: "1b4c9589-4c0f-4f07-86d8-4573a0e80292"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.090246 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b4c9589-4c0f-4f07-86d8-4573a0e80292-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1b4c9589-4c0f-4f07-86d8-4573a0e80292" (UID: "1b4c9589-4c0f-4f07-86d8-4573a0e80292"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.158856 4715 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1b4c9589-4c0f-4f07-86d8-4573a0e80292-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.158890 4715 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b4c9589-4c0f-4f07-86d8-4573a0e80292-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.158929 4715 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b4c9589-4c0f-4f07-86d8-4573a0e80292-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.158946 4715 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1b4c9589-4c0f-4f07-86d8-4573a0e80292-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.158961 4715 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1b4c9589-4c0f-4f07-86d8-4573a0e80292-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.158974 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqtmh\" (UniqueName: \"kubernetes.io/projected/1b4c9589-4c0f-4f07-86d8-4573a0e80292-kube-api-access-sqtmh\") on node \"crc\" DevicePath \"\"" Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.597687 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq" 
event={"ID":"1b4c9589-4c0f-4f07-86d8-4573a0e80292","Type":"ContainerDied","Data":"9fcc62bca87219599dbf6b1ee22ccb42309ec2d7ce35f72f0666923224636f39"} Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.597728 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fcc62bca87219599dbf6b1ee22ccb42309ec2d7ce35f72f0666923224636f39" Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.597975 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq" Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.742967 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-whz64"] Oct 09 08:20:44 crc kubenswrapper[4715]: E1009 08:20:44.743483 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b4c9589-4c0f-4f07-86d8-4573a0e80292" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.743509 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b4c9589-4c0f-4f07-86d8-4573a0e80292" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.743733 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b4c9589-4c0f-4f07-86d8-4573a0e80292" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.744484 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-whz64" Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.747324 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fjb" Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.748175 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.748202 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.748244 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.748586 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.755894 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-whz64"] Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.768991 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/767ac586-f48e-410c-a5bb-589eccbef2c8-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-whz64\" (UID: \"767ac586-f48e-410c-a5bb-589eccbef2c8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-whz64" Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.776805 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z68dt\" (UniqueName: \"kubernetes.io/projected/767ac586-f48e-410c-a5bb-589eccbef2c8-kube-api-access-z68dt\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-whz64\" (UID: 
\"767ac586-f48e-410c-a5bb-589eccbef2c8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-whz64" Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.779229 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/767ac586-f48e-410c-a5bb-589eccbef2c8-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-whz64\" (UID: \"767ac586-f48e-410c-a5bb-589eccbef2c8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-whz64" Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.779640 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/767ac586-f48e-410c-a5bb-589eccbef2c8-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-whz64\" (UID: \"767ac586-f48e-410c-a5bb-589eccbef2c8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-whz64" Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.780459 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/767ac586-f48e-410c-a5bb-589eccbef2c8-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-whz64\" (UID: \"767ac586-f48e-410c-a5bb-589eccbef2c8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-whz64" Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.881996 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/767ac586-f48e-410c-a5bb-589eccbef2c8-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-whz64\" (UID: \"767ac586-f48e-410c-a5bb-589eccbef2c8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-whz64" Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.882271 4715 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/767ac586-f48e-410c-a5bb-589eccbef2c8-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-whz64\" (UID: \"767ac586-f48e-410c-a5bb-589eccbef2c8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-whz64" Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.882823 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/767ac586-f48e-410c-a5bb-589eccbef2c8-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-whz64\" (UID: \"767ac586-f48e-410c-a5bb-589eccbef2c8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-whz64" Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.883018 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z68dt\" (UniqueName: \"kubernetes.io/projected/767ac586-f48e-410c-a5bb-589eccbef2c8-kube-api-access-z68dt\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-whz64\" (UID: \"767ac586-f48e-410c-a5bb-589eccbef2c8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-whz64" Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.883214 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/767ac586-f48e-410c-a5bb-589eccbef2c8-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-whz64\" (UID: \"767ac586-f48e-410c-a5bb-589eccbef2c8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-whz64" Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.886510 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/767ac586-f48e-410c-a5bb-589eccbef2c8-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-whz64\" (UID: \"767ac586-f48e-410c-a5bb-589eccbef2c8\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-whz64" Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.886890 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/767ac586-f48e-410c-a5bb-589eccbef2c8-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-whz64\" (UID: \"767ac586-f48e-410c-a5bb-589eccbef2c8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-whz64" Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.893049 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/767ac586-f48e-410c-a5bb-589eccbef2c8-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-whz64\" (UID: \"767ac586-f48e-410c-a5bb-589eccbef2c8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-whz64" Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.893383 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/767ac586-f48e-410c-a5bb-589eccbef2c8-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-whz64\" (UID: \"767ac586-f48e-410c-a5bb-589eccbef2c8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-whz64" Oct 09 08:20:44 crc kubenswrapper[4715]: I1009 08:20:44.906476 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z68dt\" (UniqueName: \"kubernetes.io/projected/767ac586-f48e-410c-a5bb-589eccbef2c8-kube-api-access-z68dt\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-whz64\" (UID: \"767ac586-f48e-410c-a5bb-589eccbef2c8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-whz64" Oct 09 08:20:45 crc kubenswrapper[4715]: I1009 08:20:45.066189 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-whz64" Oct 09 08:20:45 crc kubenswrapper[4715]: I1009 08:20:45.602289 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-whz64"] Oct 09 08:20:46 crc kubenswrapper[4715]: I1009 08:20:46.617787 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-whz64" event={"ID":"767ac586-f48e-410c-a5bb-589eccbef2c8","Type":"ContainerStarted","Data":"c69e342300596bd0764371bb74d9877df29bdfff90d5a0394d013ea5cdd58ce5"} Oct 09 08:20:46 crc kubenswrapper[4715]: I1009 08:20:46.619091 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-whz64" event={"ID":"767ac586-f48e-410c-a5bb-589eccbef2c8","Type":"ContainerStarted","Data":"a207504d761a2cfc989318d196d3489acac9c85dec6864c524cc6d57728bd916"} Oct 09 08:20:46 crc kubenswrapper[4715]: I1009 08:20:46.644983 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-whz64" podStartSLOduration=2.169462309 podStartE2EDuration="2.644949707s" podCreationTimestamp="2025-10-09 08:20:44 +0000 UTC" firstStartedPulling="2025-10-09 08:20:45.60924798 +0000 UTC m=+2076.302051988" lastFinishedPulling="2025-10-09 08:20:46.084735378 +0000 UTC m=+2076.777539386" observedRunningTime="2025-10-09 08:20:46.636157755 +0000 UTC m=+2077.328961763" watchObservedRunningTime="2025-10-09 08:20:46.644949707 +0000 UTC m=+2077.337753755" Oct 09 08:20:53 crc kubenswrapper[4715]: I1009 08:20:53.064122 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hv5qr"] Oct 09 08:20:53 crc kubenswrapper[4715]: I1009 08:20:53.066935 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hv5qr" Oct 09 08:20:53 crc kubenswrapper[4715]: I1009 08:20:53.074485 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hv5qr"] Oct 09 08:20:53 crc kubenswrapper[4715]: I1009 08:20:53.237261 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51529f49-d087-4e18-a94e-178be19b5214-utilities\") pod \"community-operators-hv5qr\" (UID: \"51529f49-d087-4e18-a94e-178be19b5214\") " pod="openshift-marketplace/community-operators-hv5qr" Oct 09 08:20:53 crc kubenswrapper[4715]: I1009 08:20:53.237301 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51529f49-d087-4e18-a94e-178be19b5214-catalog-content\") pod \"community-operators-hv5qr\" (UID: \"51529f49-d087-4e18-a94e-178be19b5214\") " pod="openshift-marketplace/community-operators-hv5qr" Oct 09 08:20:53 crc kubenswrapper[4715]: I1009 08:20:53.237368 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84n5l\" (UniqueName: \"kubernetes.io/projected/51529f49-d087-4e18-a94e-178be19b5214-kube-api-access-84n5l\") pod \"community-operators-hv5qr\" (UID: \"51529f49-d087-4e18-a94e-178be19b5214\") " pod="openshift-marketplace/community-operators-hv5qr" Oct 09 08:20:53 crc kubenswrapper[4715]: I1009 08:20:53.339467 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51529f49-d087-4e18-a94e-178be19b5214-utilities\") pod \"community-operators-hv5qr\" (UID: \"51529f49-d087-4e18-a94e-178be19b5214\") " pod="openshift-marketplace/community-operators-hv5qr" Oct 09 08:20:53 crc kubenswrapper[4715]: I1009 08:20:53.339515 4715 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51529f49-d087-4e18-a94e-178be19b5214-catalog-content\") pod \"community-operators-hv5qr\" (UID: \"51529f49-d087-4e18-a94e-178be19b5214\") " pod="openshift-marketplace/community-operators-hv5qr" Oct 09 08:20:53 crc kubenswrapper[4715]: I1009 08:20:53.339551 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84n5l\" (UniqueName: \"kubernetes.io/projected/51529f49-d087-4e18-a94e-178be19b5214-kube-api-access-84n5l\") pod \"community-operators-hv5qr\" (UID: \"51529f49-d087-4e18-a94e-178be19b5214\") " pod="openshift-marketplace/community-operators-hv5qr" Oct 09 08:20:53 crc kubenswrapper[4715]: I1009 08:20:53.340129 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51529f49-d087-4e18-a94e-178be19b5214-utilities\") pod \"community-operators-hv5qr\" (UID: \"51529f49-d087-4e18-a94e-178be19b5214\") " pod="openshift-marketplace/community-operators-hv5qr" Oct 09 08:20:53 crc kubenswrapper[4715]: I1009 08:20:53.340227 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51529f49-d087-4e18-a94e-178be19b5214-catalog-content\") pod \"community-operators-hv5qr\" (UID: \"51529f49-d087-4e18-a94e-178be19b5214\") " pod="openshift-marketplace/community-operators-hv5qr" Oct 09 08:20:53 crc kubenswrapper[4715]: I1009 08:20:53.370617 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84n5l\" (UniqueName: \"kubernetes.io/projected/51529f49-d087-4e18-a94e-178be19b5214-kube-api-access-84n5l\") pod \"community-operators-hv5qr\" (UID: \"51529f49-d087-4e18-a94e-178be19b5214\") " pod="openshift-marketplace/community-operators-hv5qr" Oct 09 08:20:53 crc kubenswrapper[4715]: I1009 08:20:53.409552 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hv5qr" Oct 09 08:20:53 crc kubenswrapper[4715]: I1009 08:20:53.971927 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hv5qr"] Oct 09 08:20:54 crc kubenswrapper[4715]: I1009 08:20:54.685227 4715 generic.go:334] "Generic (PLEG): container finished" podID="51529f49-d087-4e18-a94e-178be19b5214" containerID="20abc7394768aa2efb455399a344a816ede2c7a954251cb6f676bffdbbbca77c" exitCode=0 Oct 09 08:20:54 crc kubenswrapper[4715]: I1009 08:20:54.685279 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hv5qr" event={"ID":"51529f49-d087-4e18-a94e-178be19b5214","Type":"ContainerDied","Data":"20abc7394768aa2efb455399a344a816ede2c7a954251cb6f676bffdbbbca77c"} Oct 09 08:20:54 crc kubenswrapper[4715]: I1009 08:20:54.685590 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hv5qr" event={"ID":"51529f49-d087-4e18-a94e-178be19b5214","Type":"ContainerStarted","Data":"c56ce3601a857315274b4a014fd11f7ee1463e499b140ede8bfa7583f0b61de3"} Oct 09 08:20:55 crc kubenswrapper[4715]: I1009 08:20:55.700718 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hv5qr" event={"ID":"51529f49-d087-4e18-a94e-178be19b5214","Type":"ContainerStarted","Data":"e35ab8b6e339df41bbe1fbf1a85d025630952de9527115ef1dfae198eecfd49e"} Oct 09 08:20:56 crc kubenswrapper[4715]: I1009 08:20:56.711537 4715 generic.go:334] "Generic (PLEG): container finished" podID="51529f49-d087-4e18-a94e-178be19b5214" containerID="e35ab8b6e339df41bbe1fbf1a85d025630952de9527115ef1dfae198eecfd49e" exitCode=0 Oct 09 08:20:56 crc kubenswrapper[4715]: I1009 08:20:56.711584 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hv5qr" 
event={"ID":"51529f49-d087-4e18-a94e-178be19b5214","Type":"ContainerDied","Data":"e35ab8b6e339df41bbe1fbf1a85d025630952de9527115ef1dfae198eecfd49e"} Oct 09 08:20:57 crc kubenswrapper[4715]: I1009 08:20:57.722984 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hv5qr" event={"ID":"51529f49-d087-4e18-a94e-178be19b5214","Type":"ContainerStarted","Data":"96307f9ebf0561825a2c209f9669101cfea60760c602258f775f870eb69b2199"} Oct 09 08:20:57 crc kubenswrapper[4715]: I1009 08:20:57.738780 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hv5qr" podStartSLOduration=2.174732826 podStartE2EDuration="4.738760564s" podCreationTimestamp="2025-10-09 08:20:53 +0000 UTC" firstStartedPulling="2025-10-09 08:20:54.689176238 +0000 UTC m=+2085.381980246" lastFinishedPulling="2025-10-09 08:20:57.253203976 +0000 UTC m=+2087.946007984" observedRunningTime="2025-10-09 08:20:57.737364094 +0000 UTC m=+2088.430168122" watchObservedRunningTime="2025-10-09 08:20:57.738760564 +0000 UTC m=+2088.431564572" Oct 09 08:21:03 crc kubenswrapper[4715]: I1009 08:21:03.411199 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hv5qr" Oct 09 08:21:03 crc kubenswrapper[4715]: I1009 08:21:03.412712 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hv5qr" Oct 09 08:21:03 crc kubenswrapper[4715]: I1009 08:21:03.456793 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hv5qr" Oct 09 08:21:03 crc kubenswrapper[4715]: I1009 08:21:03.837342 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hv5qr" Oct 09 08:21:03 crc kubenswrapper[4715]: I1009 08:21:03.884505 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-hv5qr"] Oct 09 08:21:05 crc kubenswrapper[4715]: I1009 08:21:05.809724 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hv5qr" podUID="51529f49-d087-4e18-a94e-178be19b5214" containerName="registry-server" containerID="cri-o://96307f9ebf0561825a2c209f9669101cfea60760c602258f775f870eb69b2199" gracePeriod=2 Oct 09 08:21:06 crc kubenswrapper[4715]: I1009 08:21:06.216719 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hv5qr" Oct 09 08:21:06 crc kubenswrapper[4715]: I1009 08:21:06.276135 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51529f49-d087-4e18-a94e-178be19b5214-utilities\") pod \"51529f49-d087-4e18-a94e-178be19b5214\" (UID: \"51529f49-d087-4e18-a94e-178be19b5214\") " Oct 09 08:21:06 crc kubenswrapper[4715]: I1009 08:21:06.276189 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51529f49-d087-4e18-a94e-178be19b5214-catalog-content\") pod \"51529f49-d087-4e18-a94e-178be19b5214\" (UID: \"51529f49-d087-4e18-a94e-178be19b5214\") " Oct 09 08:21:06 crc kubenswrapper[4715]: I1009 08:21:06.276281 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84n5l\" (UniqueName: \"kubernetes.io/projected/51529f49-d087-4e18-a94e-178be19b5214-kube-api-access-84n5l\") pod \"51529f49-d087-4e18-a94e-178be19b5214\" (UID: \"51529f49-d087-4e18-a94e-178be19b5214\") " Oct 09 08:21:06 crc kubenswrapper[4715]: I1009 08:21:06.277540 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51529f49-d087-4e18-a94e-178be19b5214-utilities" (OuterVolumeSpecName: "utilities") pod "51529f49-d087-4e18-a94e-178be19b5214" (UID: 
"51529f49-d087-4e18-a94e-178be19b5214"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:21:06 crc kubenswrapper[4715]: I1009 08:21:06.282836 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51529f49-d087-4e18-a94e-178be19b5214-kube-api-access-84n5l" (OuterVolumeSpecName: "kube-api-access-84n5l") pod "51529f49-d087-4e18-a94e-178be19b5214" (UID: "51529f49-d087-4e18-a94e-178be19b5214"). InnerVolumeSpecName "kube-api-access-84n5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:21:06 crc kubenswrapper[4715]: I1009 08:21:06.323313 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51529f49-d087-4e18-a94e-178be19b5214-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "51529f49-d087-4e18-a94e-178be19b5214" (UID: "51529f49-d087-4e18-a94e-178be19b5214"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:21:06 crc kubenswrapper[4715]: I1009 08:21:06.378842 4715 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51529f49-d087-4e18-a94e-178be19b5214-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 08:21:06 crc kubenswrapper[4715]: I1009 08:21:06.378880 4715 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51529f49-d087-4e18-a94e-178be19b5214-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 08:21:06 crc kubenswrapper[4715]: I1009 08:21:06.378893 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84n5l\" (UniqueName: \"kubernetes.io/projected/51529f49-d087-4e18-a94e-178be19b5214-kube-api-access-84n5l\") on node \"crc\" DevicePath \"\"" Oct 09 08:21:06 crc kubenswrapper[4715]: I1009 08:21:06.823697 4715 generic.go:334] "Generic (PLEG): container finished" 
podID="51529f49-d087-4e18-a94e-178be19b5214" containerID="96307f9ebf0561825a2c209f9669101cfea60760c602258f775f870eb69b2199" exitCode=0 Oct 09 08:21:06 crc kubenswrapper[4715]: I1009 08:21:06.823771 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hv5qr" Oct 09 08:21:06 crc kubenswrapper[4715]: I1009 08:21:06.823787 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hv5qr" event={"ID":"51529f49-d087-4e18-a94e-178be19b5214","Type":"ContainerDied","Data":"96307f9ebf0561825a2c209f9669101cfea60760c602258f775f870eb69b2199"} Oct 09 08:21:06 crc kubenswrapper[4715]: I1009 08:21:06.824779 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hv5qr" event={"ID":"51529f49-d087-4e18-a94e-178be19b5214","Type":"ContainerDied","Data":"c56ce3601a857315274b4a014fd11f7ee1463e499b140ede8bfa7583f0b61de3"} Oct 09 08:21:06 crc kubenswrapper[4715]: I1009 08:21:06.824815 4715 scope.go:117] "RemoveContainer" containerID="96307f9ebf0561825a2c209f9669101cfea60760c602258f775f870eb69b2199" Oct 09 08:21:06 crc kubenswrapper[4715]: I1009 08:21:06.845301 4715 scope.go:117] "RemoveContainer" containerID="e35ab8b6e339df41bbe1fbf1a85d025630952de9527115ef1dfae198eecfd49e" Oct 09 08:21:06 crc kubenswrapper[4715]: I1009 08:21:06.858825 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hv5qr"] Oct 09 08:21:06 crc kubenswrapper[4715]: I1009 08:21:06.868134 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hv5qr"] Oct 09 08:21:06 crc kubenswrapper[4715]: I1009 08:21:06.880935 4715 scope.go:117] "RemoveContainer" containerID="20abc7394768aa2efb455399a344a816ede2c7a954251cb6f676bffdbbbca77c" Oct 09 08:21:06 crc kubenswrapper[4715]: I1009 08:21:06.911839 4715 scope.go:117] "RemoveContainer" 
containerID="96307f9ebf0561825a2c209f9669101cfea60760c602258f775f870eb69b2199" Oct 09 08:21:06 crc kubenswrapper[4715]: E1009 08:21:06.912329 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96307f9ebf0561825a2c209f9669101cfea60760c602258f775f870eb69b2199\": container with ID starting with 96307f9ebf0561825a2c209f9669101cfea60760c602258f775f870eb69b2199 not found: ID does not exist" containerID="96307f9ebf0561825a2c209f9669101cfea60760c602258f775f870eb69b2199" Oct 09 08:21:06 crc kubenswrapper[4715]: I1009 08:21:06.912479 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96307f9ebf0561825a2c209f9669101cfea60760c602258f775f870eb69b2199"} err="failed to get container status \"96307f9ebf0561825a2c209f9669101cfea60760c602258f775f870eb69b2199\": rpc error: code = NotFound desc = could not find container \"96307f9ebf0561825a2c209f9669101cfea60760c602258f775f870eb69b2199\": container with ID starting with 96307f9ebf0561825a2c209f9669101cfea60760c602258f775f870eb69b2199 not found: ID does not exist" Oct 09 08:21:06 crc kubenswrapper[4715]: I1009 08:21:06.912563 4715 scope.go:117] "RemoveContainer" containerID="e35ab8b6e339df41bbe1fbf1a85d025630952de9527115ef1dfae198eecfd49e" Oct 09 08:21:06 crc kubenswrapper[4715]: E1009 08:21:06.913041 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e35ab8b6e339df41bbe1fbf1a85d025630952de9527115ef1dfae198eecfd49e\": container with ID starting with e35ab8b6e339df41bbe1fbf1a85d025630952de9527115ef1dfae198eecfd49e not found: ID does not exist" containerID="e35ab8b6e339df41bbe1fbf1a85d025630952de9527115ef1dfae198eecfd49e" Oct 09 08:21:06 crc kubenswrapper[4715]: I1009 08:21:06.913080 4715 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e35ab8b6e339df41bbe1fbf1a85d025630952de9527115ef1dfae198eecfd49e"} err="failed to get container status \"e35ab8b6e339df41bbe1fbf1a85d025630952de9527115ef1dfae198eecfd49e\": rpc error: code = NotFound desc = could not find container \"e35ab8b6e339df41bbe1fbf1a85d025630952de9527115ef1dfae198eecfd49e\": container with ID starting with e35ab8b6e339df41bbe1fbf1a85d025630952de9527115ef1dfae198eecfd49e not found: ID does not exist" Oct 09 08:21:06 crc kubenswrapper[4715]: I1009 08:21:06.913108 4715 scope.go:117] "RemoveContainer" containerID="20abc7394768aa2efb455399a344a816ede2c7a954251cb6f676bffdbbbca77c" Oct 09 08:21:06 crc kubenswrapper[4715]: E1009 08:21:06.913510 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20abc7394768aa2efb455399a344a816ede2c7a954251cb6f676bffdbbbca77c\": container with ID starting with 20abc7394768aa2efb455399a344a816ede2c7a954251cb6f676bffdbbbca77c not found: ID does not exist" containerID="20abc7394768aa2efb455399a344a816ede2c7a954251cb6f676bffdbbbca77c" Oct 09 08:21:06 crc kubenswrapper[4715]: I1009 08:21:06.913620 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20abc7394768aa2efb455399a344a816ede2c7a954251cb6f676bffdbbbca77c"} err="failed to get container status \"20abc7394768aa2efb455399a344a816ede2c7a954251cb6f676bffdbbbca77c\": rpc error: code = NotFound desc = could not find container \"20abc7394768aa2efb455399a344a816ede2c7a954251cb6f676bffdbbbca77c\": container with ID starting with 20abc7394768aa2efb455399a344a816ede2c7a954251cb6f676bffdbbbca77c not found: ID does not exist" Oct 09 08:21:08 crc kubenswrapper[4715]: I1009 08:21:08.146569 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51529f49-d087-4e18-a94e-178be19b5214" path="/var/lib/kubelet/pods/51529f49-d087-4e18-a94e-178be19b5214/volumes" Oct 09 08:21:16 crc kubenswrapper[4715]: I1009 
08:21:16.753738 4715 patch_prober.go:28] interesting pod/machine-config-daemon-k7vwx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 08:21:16 crc kubenswrapper[4715]: I1009 08:21:16.754309 4715 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 08:21:18 crc kubenswrapper[4715]: I1009 08:21:18.876127 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xxccn"] Oct 09 08:21:18 crc kubenswrapper[4715]: E1009 08:21:18.877300 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51529f49-d087-4e18-a94e-178be19b5214" containerName="extract-utilities" Oct 09 08:21:18 crc kubenswrapper[4715]: I1009 08:21:18.877375 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="51529f49-d087-4e18-a94e-178be19b5214" containerName="extract-utilities" Oct 09 08:21:18 crc kubenswrapper[4715]: E1009 08:21:18.877501 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51529f49-d087-4e18-a94e-178be19b5214" containerName="extract-content" Oct 09 08:21:18 crc kubenswrapper[4715]: I1009 08:21:18.877564 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="51529f49-d087-4e18-a94e-178be19b5214" containerName="extract-content" Oct 09 08:21:18 crc kubenswrapper[4715]: E1009 08:21:18.877633 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51529f49-d087-4e18-a94e-178be19b5214" containerName="registry-server" Oct 09 08:21:18 crc kubenswrapper[4715]: I1009 08:21:18.877687 4715 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="51529f49-d087-4e18-a94e-178be19b5214" containerName="registry-server" Oct 09 08:21:18 crc kubenswrapper[4715]: I1009 08:21:18.877935 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="51529f49-d087-4e18-a94e-178be19b5214" containerName="registry-server" Oct 09 08:21:18 crc kubenswrapper[4715]: I1009 08:21:18.879253 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xxccn" Oct 09 08:21:18 crc kubenswrapper[4715]: I1009 08:21:18.903220 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xxccn"] Oct 09 08:21:18 crc kubenswrapper[4715]: I1009 08:21:18.911310 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9793744-5550-4b58-9ffd-4ab4d54076fc-utilities\") pod \"certified-operators-xxccn\" (UID: \"c9793744-5550-4b58-9ffd-4ab4d54076fc\") " pod="openshift-marketplace/certified-operators-xxccn" Oct 09 08:21:18 crc kubenswrapper[4715]: I1009 08:21:18.911881 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9793744-5550-4b58-9ffd-4ab4d54076fc-catalog-content\") pod \"certified-operators-xxccn\" (UID: \"c9793744-5550-4b58-9ffd-4ab4d54076fc\") " pod="openshift-marketplace/certified-operators-xxccn" Oct 09 08:21:18 crc kubenswrapper[4715]: I1009 08:21:18.911991 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6k78\" (UniqueName: \"kubernetes.io/projected/c9793744-5550-4b58-9ffd-4ab4d54076fc-kube-api-access-z6k78\") pod \"certified-operators-xxccn\" (UID: \"c9793744-5550-4b58-9ffd-4ab4d54076fc\") " pod="openshift-marketplace/certified-operators-xxccn" Oct 09 08:21:19 crc kubenswrapper[4715]: I1009 08:21:19.013773 4715 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-z6k78\" (UniqueName: \"kubernetes.io/projected/c9793744-5550-4b58-9ffd-4ab4d54076fc-kube-api-access-z6k78\") pod \"certified-operators-xxccn\" (UID: \"c9793744-5550-4b58-9ffd-4ab4d54076fc\") " pod="openshift-marketplace/certified-operators-xxccn" Oct 09 08:21:19 crc kubenswrapper[4715]: I1009 08:21:19.013876 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9793744-5550-4b58-9ffd-4ab4d54076fc-utilities\") pod \"certified-operators-xxccn\" (UID: \"c9793744-5550-4b58-9ffd-4ab4d54076fc\") " pod="openshift-marketplace/certified-operators-xxccn" Oct 09 08:21:19 crc kubenswrapper[4715]: I1009 08:21:19.013984 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9793744-5550-4b58-9ffd-4ab4d54076fc-catalog-content\") pod \"certified-operators-xxccn\" (UID: \"c9793744-5550-4b58-9ffd-4ab4d54076fc\") " pod="openshift-marketplace/certified-operators-xxccn" Oct 09 08:21:19 crc kubenswrapper[4715]: I1009 08:21:19.014523 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9793744-5550-4b58-9ffd-4ab4d54076fc-catalog-content\") pod \"certified-operators-xxccn\" (UID: \"c9793744-5550-4b58-9ffd-4ab4d54076fc\") " pod="openshift-marketplace/certified-operators-xxccn" Oct 09 08:21:19 crc kubenswrapper[4715]: I1009 08:21:19.014854 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9793744-5550-4b58-9ffd-4ab4d54076fc-utilities\") pod \"certified-operators-xxccn\" (UID: \"c9793744-5550-4b58-9ffd-4ab4d54076fc\") " pod="openshift-marketplace/certified-operators-xxccn" Oct 09 08:21:19 crc kubenswrapper[4715]: I1009 08:21:19.036131 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-z6k78\" (UniqueName: \"kubernetes.io/projected/c9793744-5550-4b58-9ffd-4ab4d54076fc-kube-api-access-z6k78\") pod \"certified-operators-xxccn\" (UID: \"c9793744-5550-4b58-9ffd-4ab4d54076fc\") " pod="openshift-marketplace/certified-operators-xxccn" Oct 09 08:21:19 crc kubenswrapper[4715]: I1009 08:21:19.249460 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xxccn" Oct 09 08:21:19 crc kubenswrapper[4715]: I1009 08:21:19.756944 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xxccn"] Oct 09 08:21:19 crc kubenswrapper[4715]: I1009 08:21:19.944076 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxccn" event={"ID":"c9793744-5550-4b58-9ffd-4ab4d54076fc","Type":"ContainerStarted","Data":"fa394dbf8771ffb0d963e3fc57345469f5f77ac79e3a68b93acfac86933b690e"} Oct 09 08:21:20 crc kubenswrapper[4715]: I1009 08:21:20.953381 4715 generic.go:334] "Generic (PLEG): container finished" podID="c9793744-5550-4b58-9ffd-4ab4d54076fc" containerID="680a93a7820a5291c3199eb2c28c398bd0c20a189d0cb1c0b88510e91a152690" exitCode=0 Oct 09 08:21:20 crc kubenswrapper[4715]: I1009 08:21:20.953579 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxccn" event={"ID":"c9793744-5550-4b58-9ffd-4ab4d54076fc","Type":"ContainerDied","Data":"680a93a7820a5291c3199eb2c28c398bd0c20a189d0cb1c0b88510e91a152690"} Oct 09 08:21:21 crc kubenswrapper[4715]: I1009 08:21:21.989905 4715 generic.go:334] "Generic (PLEG): container finished" podID="c9793744-5550-4b58-9ffd-4ab4d54076fc" containerID="eaea7e04651dd203eb1becb49aa4e6686a11240564e633e4bbdb364266665b59" exitCode=0 Oct 09 08:21:21 crc kubenswrapper[4715]: I1009 08:21:21.990001 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxccn" 
event={"ID":"c9793744-5550-4b58-9ffd-4ab4d54076fc","Type":"ContainerDied","Data":"eaea7e04651dd203eb1becb49aa4e6686a11240564e633e4bbdb364266665b59"} Oct 09 08:21:23 crc kubenswrapper[4715]: I1009 08:21:23.000383 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxccn" event={"ID":"c9793744-5550-4b58-9ffd-4ab4d54076fc","Type":"ContainerStarted","Data":"19187f44c04e920e41c56b8d6ea10e203ea3abb219a0168de5899e5ee41acff1"} Oct 09 08:21:23 crc kubenswrapper[4715]: I1009 08:21:23.022730 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xxccn" podStartSLOduration=3.436053013 podStartE2EDuration="5.022707223s" podCreationTimestamp="2025-10-09 08:21:18 +0000 UTC" firstStartedPulling="2025-10-09 08:21:20.955777495 +0000 UTC m=+2111.648581503" lastFinishedPulling="2025-10-09 08:21:22.542431705 +0000 UTC m=+2113.235235713" observedRunningTime="2025-10-09 08:21:23.017042261 +0000 UTC m=+2113.709846279" watchObservedRunningTime="2025-10-09 08:21:23.022707223 +0000 UTC m=+2113.715511231" Oct 09 08:21:29 crc kubenswrapper[4715]: I1009 08:21:29.250299 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xxccn" Oct 09 08:21:29 crc kubenswrapper[4715]: I1009 08:21:29.250868 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xxccn" Oct 09 08:21:29 crc kubenswrapper[4715]: I1009 08:21:29.330982 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xxccn" Oct 09 08:21:30 crc kubenswrapper[4715]: I1009 08:21:30.112166 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xxccn" Oct 09 08:21:30 crc kubenswrapper[4715]: I1009 08:21:30.160956 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-xxccn"] Oct 09 08:21:32 crc kubenswrapper[4715]: I1009 08:21:32.082633 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xxccn" podUID="c9793744-5550-4b58-9ffd-4ab4d54076fc" containerName="registry-server" containerID="cri-o://19187f44c04e920e41c56b8d6ea10e203ea3abb219a0168de5899e5ee41acff1" gracePeriod=2 Oct 09 08:21:33 crc kubenswrapper[4715]: I1009 08:21:33.043260 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xxccn" Oct 09 08:21:33 crc kubenswrapper[4715]: I1009 08:21:33.093210 4715 generic.go:334] "Generic (PLEG): container finished" podID="c9793744-5550-4b58-9ffd-4ab4d54076fc" containerID="19187f44c04e920e41c56b8d6ea10e203ea3abb219a0168de5899e5ee41acff1" exitCode=0 Oct 09 08:21:33 crc kubenswrapper[4715]: I1009 08:21:33.093266 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xxccn" Oct 09 08:21:33 crc kubenswrapper[4715]: I1009 08:21:33.093274 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxccn" event={"ID":"c9793744-5550-4b58-9ffd-4ab4d54076fc","Type":"ContainerDied","Data":"19187f44c04e920e41c56b8d6ea10e203ea3abb219a0168de5899e5ee41acff1"} Oct 09 08:21:33 crc kubenswrapper[4715]: I1009 08:21:33.093314 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxccn" event={"ID":"c9793744-5550-4b58-9ffd-4ab4d54076fc","Type":"ContainerDied","Data":"fa394dbf8771ffb0d963e3fc57345469f5f77ac79e3a68b93acfac86933b690e"} Oct 09 08:21:33 crc kubenswrapper[4715]: I1009 08:21:33.093337 4715 scope.go:117] "RemoveContainer" containerID="19187f44c04e920e41c56b8d6ea10e203ea3abb219a0168de5899e5ee41acff1" Oct 09 08:21:33 crc kubenswrapper[4715]: I1009 08:21:33.101900 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9793744-5550-4b58-9ffd-4ab4d54076fc-utilities\") pod \"c9793744-5550-4b58-9ffd-4ab4d54076fc\" (UID: \"c9793744-5550-4b58-9ffd-4ab4d54076fc\") " Oct 09 08:21:33 crc kubenswrapper[4715]: I1009 08:21:33.101998 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6k78\" (UniqueName: \"kubernetes.io/projected/c9793744-5550-4b58-9ffd-4ab4d54076fc-kube-api-access-z6k78\") pod \"c9793744-5550-4b58-9ffd-4ab4d54076fc\" (UID: \"c9793744-5550-4b58-9ffd-4ab4d54076fc\") " Oct 09 08:21:33 crc kubenswrapper[4715]: I1009 08:21:33.102442 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9793744-5550-4b58-9ffd-4ab4d54076fc-catalog-content\") pod \"c9793744-5550-4b58-9ffd-4ab4d54076fc\" (UID: \"c9793744-5550-4b58-9ffd-4ab4d54076fc\") " Oct 09 08:21:33 crc 
kubenswrapper[4715]: I1009 08:21:33.105646 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9793744-5550-4b58-9ffd-4ab4d54076fc-utilities" (OuterVolumeSpecName: "utilities") pod "c9793744-5550-4b58-9ffd-4ab4d54076fc" (UID: "c9793744-5550-4b58-9ffd-4ab4d54076fc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:21:33 crc kubenswrapper[4715]: I1009 08:21:33.110747 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9793744-5550-4b58-9ffd-4ab4d54076fc-kube-api-access-z6k78" (OuterVolumeSpecName: "kube-api-access-z6k78") pod "c9793744-5550-4b58-9ffd-4ab4d54076fc" (UID: "c9793744-5550-4b58-9ffd-4ab4d54076fc"). InnerVolumeSpecName "kube-api-access-z6k78". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:21:33 crc kubenswrapper[4715]: I1009 08:21:33.111222 4715 scope.go:117] "RemoveContainer" containerID="eaea7e04651dd203eb1becb49aa4e6686a11240564e633e4bbdb364266665b59" Oct 09 08:21:33 crc kubenswrapper[4715]: I1009 08:21:33.153147 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9793744-5550-4b58-9ffd-4ab4d54076fc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9793744-5550-4b58-9ffd-4ab4d54076fc" (UID: "c9793744-5550-4b58-9ffd-4ab4d54076fc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:21:33 crc kubenswrapper[4715]: I1009 08:21:33.179967 4715 scope.go:117] "RemoveContainer" containerID="680a93a7820a5291c3199eb2c28c398bd0c20a189d0cb1c0b88510e91a152690" Oct 09 08:21:33 crc kubenswrapper[4715]: I1009 08:21:33.205324 4715 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9793744-5550-4b58-9ffd-4ab4d54076fc-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 08:21:33 crc kubenswrapper[4715]: I1009 08:21:33.205363 4715 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9793744-5550-4b58-9ffd-4ab4d54076fc-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 08:21:33 crc kubenswrapper[4715]: I1009 08:21:33.205377 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6k78\" (UniqueName: \"kubernetes.io/projected/c9793744-5550-4b58-9ffd-4ab4d54076fc-kube-api-access-z6k78\") on node \"crc\" DevicePath \"\"" Oct 09 08:21:33 crc kubenswrapper[4715]: I1009 08:21:33.223920 4715 scope.go:117] "RemoveContainer" containerID="19187f44c04e920e41c56b8d6ea10e203ea3abb219a0168de5899e5ee41acff1" Oct 09 08:21:33 crc kubenswrapper[4715]: E1009 08:21:33.226704 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19187f44c04e920e41c56b8d6ea10e203ea3abb219a0168de5899e5ee41acff1\": container with ID starting with 19187f44c04e920e41c56b8d6ea10e203ea3abb219a0168de5899e5ee41acff1 not found: ID does not exist" containerID="19187f44c04e920e41c56b8d6ea10e203ea3abb219a0168de5899e5ee41acff1" Oct 09 08:21:33 crc kubenswrapper[4715]: I1009 08:21:33.226750 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19187f44c04e920e41c56b8d6ea10e203ea3abb219a0168de5899e5ee41acff1"} err="failed to get container status 
\"19187f44c04e920e41c56b8d6ea10e203ea3abb219a0168de5899e5ee41acff1\": rpc error: code = NotFound desc = could not find container \"19187f44c04e920e41c56b8d6ea10e203ea3abb219a0168de5899e5ee41acff1\": container with ID starting with 19187f44c04e920e41c56b8d6ea10e203ea3abb219a0168de5899e5ee41acff1 not found: ID does not exist" Oct 09 08:21:33 crc kubenswrapper[4715]: I1009 08:21:33.226778 4715 scope.go:117] "RemoveContainer" containerID="eaea7e04651dd203eb1becb49aa4e6686a11240564e633e4bbdb364266665b59" Oct 09 08:21:33 crc kubenswrapper[4715]: E1009 08:21:33.227122 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaea7e04651dd203eb1becb49aa4e6686a11240564e633e4bbdb364266665b59\": container with ID starting with eaea7e04651dd203eb1becb49aa4e6686a11240564e633e4bbdb364266665b59 not found: ID does not exist" containerID="eaea7e04651dd203eb1becb49aa4e6686a11240564e633e4bbdb364266665b59" Oct 09 08:21:33 crc kubenswrapper[4715]: I1009 08:21:33.227149 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaea7e04651dd203eb1becb49aa4e6686a11240564e633e4bbdb364266665b59"} err="failed to get container status \"eaea7e04651dd203eb1becb49aa4e6686a11240564e633e4bbdb364266665b59\": rpc error: code = NotFound desc = could not find container \"eaea7e04651dd203eb1becb49aa4e6686a11240564e633e4bbdb364266665b59\": container with ID starting with eaea7e04651dd203eb1becb49aa4e6686a11240564e633e4bbdb364266665b59 not found: ID does not exist" Oct 09 08:21:33 crc kubenswrapper[4715]: I1009 08:21:33.227166 4715 scope.go:117] "RemoveContainer" containerID="680a93a7820a5291c3199eb2c28c398bd0c20a189d0cb1c0b88510e91a152690" Oct 09 08:21:33 crc kubenswrapper[4715]: E1009 08:21:33.227378 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"680a93a7820a5291c3199eb2c28c398bd0c20a189d0cb1c0b88510e91a152690\": container with ID starting with 680a93a7820a5291c3199eb2c28c398bd0c20a189d0cb1c0b88510e91a152690 not found: ID does not exist" containerID="680a93a7820a5291c3199eb2c28c398bd0c20a189d0cb1c0b88510e91a152690" Oct 09 08:21:33 crc kubenswrapper[4715]: I1009 08:21:33.227398 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"680a93a7820a5291c3199eb2c28c398bd0c20a189d0cb1c0b88510e91a152690"} err="failed to get container status \"680a93a7820a5291c3199eb2c28c398bd0c20a189d0cb1c0b88510e91a152690\": rpc error: code = NotFound desc = could not find container \"680a93a7820a5291c3199eb2c28c398bd0c20a189d0cb1c0b88510e91a152690\": container with ID starting with 680a93a7820a5291c3199eb2c28c398bd0c20a189d0cb1c0b88510e91a152690 not found: ID does not exist" Oct 09 08:21:33 crc kubenswrapper[4715]: I1009 08:21:33.432472 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xxccn"] Oct 09 08:21:33 crc kubenswrapper[4715]: I1009 08:21:33.443191 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xxccn"] Oct 09 08:21:34 crc kubenswrapper[4715]: I1009 08:21:34.165188 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9793744-5550-4b58-9ffd-4ab4d54076fc" path="/var/lib/kubelet/pods/c9793744-5550-4b58-9ffd-4ab4d54076fc/volumes" Oct 09 08:21:42 crc kubenswrapper[4715]: I1009 08:21:42.538176 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-w95fb"] Oct 09 08:21:42 crc kubenswrapper[4715]: E1009 08:21:42.539015 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9793744-5550-4b58-9ffd-4ab4d54076fc" containerName="extract-utilities" Oct 09 08:21:42 crc kubenswrapper[4715]: I1009 08:21:42.539089 4715 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c9793744-5550-4b58-9ffd-4ab4d54076fc" containerName="extract-utilities" Oct 09 08:21:42 crc kubenswrapper[4715]: E1009 08:21:42.539104 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9793744-5550-4b58-9ffd-4ab4d54076fc" containerName="extract-content" Oct 09 08:21:42 crc kubenswrapper[4715]: I1009 08:21:42.539110 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9793744-5550-4b58-9ffd-4ab4d54076fc" containerName="extract-content" Oct 09 08:21:42 crc kubenswrapper[4715]: E1009 08:21:42.539129 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9793744-5550-4b58-9ffd-4ab4d54076fc" containerName="registry-server" Oct 09 08:21:42 crc kubenswrapper[4715]: I1009 08:21:42.539135 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9793744-5550-4b58-9ffd-4ab4d54076fc" containerName="registry-server" Oct 09 08:21:42 crc kubenswrapper[4715]: I1009 08:21:42.539306 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9793744-5550-4b58-9ffd-4ab4d54076fc" containerName="registry-server" Oct 09 08:21:42 crc kubenswrapper[4715]: I1009 08:21:42.540738 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w95fb" Oct 09 08:21:42 crc kubenswrapper[4715]: I1009 08:21:42.558522 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w95fb"] Oct 09 08:21:42 crc kubenswrapper[4715]: I1009 08:21:42.575884 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc4c8ed2-08c6-42f7-a19d-226fc98143f6-utilities\") pod \"redhat-marketplace-w95fb\" (UID: \"cc4c8ed2-08c6-42f7-a19d-226fc98143f6\") " pod="openshift-marketplace/redhat-marketplace-w95fb" Oct 09 08:21:42 crc kubenswrapper[4715]: I1009 08:21:42.576024 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc4c8ed2-08c6-42f7-a19d-226fc98143f6-catalog-content\") pod \"redhat-marketplace-w95fb\" (UID: \"cc4c8ed2-08c6-42f7-a19d-226fc98143f6\") " pod="openshift-marketplace/redhat-marketplace-w95fb" Oct 09 08:21:42 crc kubenswrapper[4715]: I1009 08:21:42.576074 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhz4v\" (UniqueName: \"kubernetes.io/projected/cc4c8ed2-08c6-42f7-a19d-226fc98143f6-kube-api-access-nhz4v\") pod \"redhat-marketplace-w95fb\" (UID: \"cc4c8ed2-08c6-42f7-a19d-226fc98143f6\") " pod="openshift-marketplace/redhat-marketplace-w95fb" Oct 09 08:21:42 crc kubenswrapper[4715]: I1009 08:21:42.677528 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc4c8ed2-08c6-42f7-a19d-226fc98143f6-utilities\") pod \"redhat-marketplace-w95fb\" (UID: \"cc4c8ed2-08c6-42f7-a19d-226fc98143f6\") " pod="openshift-marketplace/redhat-marketplace-w95fb" Oct 09 08:21:42 crc kubenswrapper[4715]: I1009 08:21:42.677599 4715 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc4c8ed2-08c6-42f7-a19d-226fc98143f6-catalog-content\") pod \"redhat-marketplace-w95fb\" (UID: \"cc4c8ed2-08c6-42f7-a19d-226fc98143f6\") " pod="openshift-marketplace/redhat-marketplace-w95fb" Oct 09 08:21:42 crc kubenswrapper[4715]: I1009 08:21:42.677624 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhz4v\" (UniqueName: \"kubernetes.io/projected/cc4c8ed2-08c6-42f7-a19d-226fc98143f6-kube-api-access-nhz4v\") pod \"redhat-marketplace-w95fb\" (UID: \"cc4c8ed2-08c6-42f7-a19d-226fc98143f6\") " pod="openshift-marketplace/redhat-marketplace-w95fb" Oct 09 08:21:42 crc kubenswrapper[4715]: I1009 08:21:42.678125 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc4c8ed2-08c6-42f7-a19d-226fc98143f6-catalog-content\") pod \"redhat-marketplace-w95fb\" (UID: \"cc4c8ed2-08c6-42f7-a19d-226fc98143f6\") " pod="openshift-marketplace/redhat-marketplace-w95fb" Oct 09 08:21:42 crc kubenswrapper[4715]: I1009 08:21:42.678132 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc4c8ed2-08c6-42f7-a19d-226fc98143f6-utilities\") pod \"redhat-marketplace-w95fb\" (UID: \"cc4c8ed2-08c6-42f7-a19d-226fc98143f6\") " pod="openshift-marketplace/redhat-marketplace-w95fb" Oct 09 08:21:42 crc kubenswrapper[4715]: I1009 08:21:42.707340 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhz4v\" (UniqueName: \"kubernetes.io/projected/cc4c8ed2-08c6-42f7-a19d-226fc98143f6-kube-api-access-nhz4v\") pod \"redhat-marketplace-w95fb\" (UID: \"cc4c8ed2-08c6-42f7-a19d-226fc98143f6\") " pod="openshift-marketplace/redhat-marketplace-w95fb" Oct 09 08:21:42 crc kubenswrapper[4715]: I1009 08:21:42.858296 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w95fb" Oct 09 08:21:43 crc kubenswrapper[4715]: I1009 08:21:43.336372 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w95fb"] Oct 09 08:21:44 crc kubenswrapper[4715]: I1009 08:21:44.198889 4715 generic.go:334] "Generic (PLEG): container finished" podID="cc4c8ed2-08c6-42f7-a19d-226fc98143f6" containerID="c5e819cf6ad802e0fa2f833be004fdd052a08af2733c1d3d9d1f821d2b9cb5fb" exitCode=0 Oct 09 08:21:44 crc kubenswrapper[4715]: I1009 08:21:44.198947 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w95fb" event={"ID":"cc4c8ed2-08c6-42f7-a19d-226fc98143f6","Type":"ContainerDied","Data":"c5e819cf6ad802e0fa2f833be004fdd052a08af2733c1d3d9d1f821d2b9cb5fb"} Oct 09 08:21:44 crc kubenswrapper[4715]: I1009 08:21:44.199195 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w95fb" event={"ID":"cc4c8ed2-08c6-42f7-a19d-226fc98143f6","Type":"ContainerStarted","Data":"656f563c5732cccc940a1167aeac944e2aa79052be16ba728c697182c48b2897"} Oct 09 08:21:46 crc kubenswrapper[4715]: I1009 08:21:46.239520 4715 generic.go:334] "Generic (PLEG): container finished" podID="cc4c8ed2-08c6-42f7-a19d-226fc98143f6" containerID="521133e16f5c05e61ab7ee2744ab641847f07a3bf938158d3ce533d080287d62" exitCode=0 Oct 09 08:21:46 crc kubenswrapper[4715]: I1009 08:21:46.239694 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w95fb" event={"ID":"cc4c8ed2-08c6-42f7-a19d-226fc98143f6","Type":"ContainerDied","Data":"521133e16f5c05e61ab7ee2744ab641847f07a3bf938158d3ce533d080287d62"} Oct 09 08:21:46 crc kubenswrapper[4715]: I1009 08:21:46.753778 4715 patch_prober.go:28] interesting pod/machine-config-daemon-k7vwx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 08:21:46 crc kubenswrapper[4715]: I1009 08:21:46.753859 4715 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 08:21:47 crc kubenswrapper[4715]: I1009 08:21:47.252167 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w95fb" event={"ID":"cc4c8ed2-08c6-42f7-a19d-226fc98143f6","Type":"ContainerStarted","Data":"d2edcad3a91ac525b9c001e52ba2a316123e053e27e90eb9953c2b42d2e69138"} Oct 09 08:21:47 crc kubenswrapper[4715]: I1009 08:21:47.274561 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-w95fb" podStartSLOduration=2.483658385 podStartE2EDuration="5.274540936s" podCreationTimestamp="2025-10-09 08:21:42 +0000 UTC" firstStartedPulling="2025-10-09 08:21:44.201452735 +0000 UTC m=+2134.894256753" lastFinishedPulling="2025-10-09 08:21:46.992335296 +0000 UTC m=+2137.685139304" observedRunningTime="2025-10-09 08:21:47.269164461 +0000 UTC m=+2137.961968469" watchObservedRunningTime="2025-10-09 08:21:47.274540936 +0000 UTC m=+2137.967344944" Oct 09 08:21:49 crc kubenswrapper[4715]: I1009 08:21:49.922895 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-crx79"] Oct 09 08:21:49 crc kubenswrapper[4715]: I1009 08:21:49.927940 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-crx79" Oct 09 08:21:49 crc kubenswrapper[4715]: I1009 08:21:49.946781 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-crx79"] Oct 09 08:21:50 crc kubenswrapper[4715]: I1009 08:21:50.030989 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40273519-8a0a-441d-a416-78e0c824eb04-catalog-content\") pod \"redhat-operators-crx79\" (UID: \"40273519-8a0a-441d-a416-78e0c824eb04\") " pod="openshift-marketplace/redhat-operators-crx79" Oct 09 08:21:50 crc kubenswrapper[4715]: I1009 08:21:50.031251 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrxl2\" (UniqueName: \"kubernetes.io/projected/40273519-8a0a-441d-a416-78e0c824eb04-kube-api-access-jrxl2\") pod \"redhat-operators-crx79\" (UID: \"40273519-8a0a-441d-a416-78e0c824eb04\") " pod="openshift-marketplace/redhat-operators-crx79" Oct 09 08:21:50 crc kubenswrapper[4715]: I1009 08:21:50.031406 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40273519-8a0a-441d-a416-78e0c824eb04-utilities\") pod \"redhat-operators-crx79\" (UID: \"40273519-8a0a-441d-a416-78e0c824eb04\") " pod="openshift-marketplace/redhat-operators-crx79" Oct 09 08:21:50 crc kubenswrapper[4715]: I1009 08:21:50.133168 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40273519-8a0a-441d-a416-78e0c824eb04-catalog-content\") pod \"redhat-operators-crx79\" (UID: \"40273519-8a0a-441d-a416-78e0c824eb04\") " pod="openshift-marketplace/redhat-operators-crx79" Oct 09 08:21:50 crc kubenswrapper[4715]: I1009 08:21:50.133339 4715 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-jrxl2\" (UniqueName: \"kubernetes.io/projected/40273519-8a0a-441d-a416-78e0c824eb04-kube-api-access-jrxl2\") pod \"redhat-operators-crx79\" (UID: \"40273519-8a0a-441d-a416-78e0c824eb04\") " pod="openshift-marketplace/redhat-operators-crx79" Oct 09 08:21:50 crc kubenswrapper[4715]: I1009 08:21:50.133456 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40273519-8a0a-441d-a416-78e0c824eb04-utilities\") pod \"redhat-operators-crx79\" (UID: \"40273519-8a0a-441d-a416-78e0c824eb04\") " pod="openshift-marketplace/redhat-operators-crx79" Oct 09 08:21:50 crc kubenswrapper[4715]: I1009 08:21:50.133950 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40273519-8a0a-441d-a416-78e0c824eb04-catalog-content\") pod \"redhat-operators-crx79\" (UID: \"40273519-8a0a-441d-a416-78e0c824eb04\") " pod="openshift-marketplace/redhat-operators-crx79" Oct 09 08:21:50 crc kubenswrapper[4715]: I1009 08:21:50.134031 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40273519-8a0a-441d-a416-78e0c824eb04-utilities\") pod \"redhat-operators-crx79\" (UID: \"40273519-8a0a-441d-a416-78e0c824eb04\") " pod="openshift-marketplace/redhat-operators-crx79" Oct 09 08:21:50 crc kubenswrapper[4715]: I1009 08:21:50.158787 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrxl2\" (UniqueName: \"kubernetes.io/projected/40273519-8a0a-441d-a416-78e0c824eb04-kube-api-access-jrxl2\") pod \"redhat-operators-crx79\" (UID: \"40273519-8a0a-441d-a416-78e0c824eb04\") " pod="openshift-marketplace/redhat-operators-crx79" Oct 09 08:21:50 crc kubenswrapper[4715]: I1009 08:21:50.254929 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-crx79" Oct 09 08:21:50 crc kubenswrapper[4715]: I1009 08:21:50.708794 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-crx79"] Oct 09 08:21:50 crc kubenswrapper[4715]: W1009 08:21:50.718608 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40273519_8a0a_441d_a416_78e0c824eb04.slice/crio-c7bd750d2484c5019b6e09385751e5062e062a19c9d09fbf00a8566565d0ab98 WatchSource:0}: Error finding container c7bd750d2484c5019b6e09385751e5062e062a19c9d09fbf00a8566565d0ab98: Status 404 returned error can't find the container with id c7bd750d2484c5019b6e09385751e5062e062a19c9d09fbf00a8566565d0ab98 Oct 09 08:21:51 crc kubenswrapper[4715]: I1009 08:21:51.290507 4715 generic.go:334] "Generic (PLEG): container finished" podID="40273519-8a0a-441d-a416-78e0c824eb04" containerID="231dc6e709138951badeef8772ecb98b8707c9beecfd6a863246144b07d1b3d5" exitCode=0 Oct 09 08:21:51 crc kubenswrapper[4715]: I1009 08:21:51.290790 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-crx79" event={"ID":"40273519-8a0a-441d-a416-78e0c824eb04","Type":"ContainerDied","Data":"231dc6e709138951badeef8772ecb98b8707c9beecfd6a863246144b07d1b3d5"} Oct 09 08:21:51 crc kubenswrapper[4715]: I1009 08:21:51.290822 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-crx79" event={"ID":"40273519-8a0a-441d-a416-78e0c824eb04","Type":"ContainerStarted","Data":"c7bd750d2484c5019b6e09385751e5062e062a19c9d09fbf00a8566565d0ab98"} Oct 09 08:21:52 crc kubenswrapper[4715]: I1009 08:21:52.302005 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-crx79" 
event={"ID":"40273519-8a0a-441d-a416-78e0c824eb04","Type":"ContainerStarted","Data":"273905187cb708c3908e382cc8ec5d821eb3ff3306ed4a046e6ce50eede5a62e"} Oct 09 08:21:52 crc kubenswrapper[4715]: I1009 08:21:52.859238 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-w95fb" Oct 09 08:21:52 crc kubenswrapper[4715]: I1009 08:21:52.859576 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-w95fb" Oct 09 08:21:52 crc kubenswrapper[4715]: I1009 08:21:52.907517 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-w95fb" Oct 09 08:21:53 crc kubenswrapper[4715]: I1009 08:21:53.317292 4715 generic.go:334] "Generic (PLEG): container finished" podID="40273519-8a0a-441d-a416-78e0c824eb04" containerID="273905187cb708c3908e382cc8ec5d821eb3ff3306ed4a046e6ce50eede5a62e" exitCode=0 Oct 09 08:21:53 crc kubenswrapper[4715]: I1009 08:21:53.317414 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-crx79" event={"ID":"40273519-8a0a-441d-a416-78e0c824eb04","Type":"ContainerDied","Data":"273905187cb708c3908e382cc8ec5d821eb3ff3306ed4a046e6ce50eede5a62e"} Oct 09 08:21:53 crc kubenswrapper[4715]: I1009 08:21:53.367202 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-w95fb" Oct 09 08:21:54 crc kubenswrapper[4715]: I1009 08:21:54.327506 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-crx79" event={"ID":"40273519-8a0a-441d-a416-78e0c824eb04","Type":"ContainerStarted","Data":"efdd424018b65e4caa9242f866df652d55c89ba65cfdfbbfd14dd4ff64231094"} Oct 09 08:21:54 crc kubenswrapper[4715]: I1009 08:21:54.350647 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-crx79" 
podStartSLOduration=2.76241382 podStartE2EDuration="5.350627753s" podCreationTimestamp="2025-10-09 08:21:49 +0000 UTC" firstStartedPulling="2025-10-09 08:21:51.293096608 +0000 UTC m=+2141.985900626" lastFinishedPulling="2025-10-09 08:21:53.881310541 +0000 UTC m=+2144.574114559" observedRunningTime="2025-10-09 08:21:54.34321044 +0000 UTC m=+2145.036014458" watchObservedRunningTime="2025-10-09 08:21:54.350627753 +0000 UTC m=+2145.043431761" Oct 09 08:21:55 crc kubenswrapper[4715]: I1009 08:21:55.321064 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w95fb"] Oct 09 08:21:56 crc kubenswrapper[4715]: I1009 08:21:56.347729 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-w95fb" podUID="cc4c8ed2-08c6-42f7-a19d-226fc98143f6" containerName="registry-server" containerID="cri-o://d2edcad3a91ac525b9c001e52ba2a316123e053e27e90eb9953c2b42d2e69138" gracePeriod=2 Oct 09 08:21:56 crc kubenswrapper[4715]: I1009 08:21:56.788747 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w95fb" Oct 09 08:21:56 crc kubenswrapper[4715]: I1009 08:21:56.867605 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc4c8ed2-08c6-42f7-a19d-226fc98143f6-utilities\") pod \"cc4c8ed2-08c6-42f7-a19d-226fc98143f6\" (UID: \"cc4c8ed2-08c6-42f7-a19d-226fc98143f6\") " Oct 09 08:21:56 crc kubenswrapper[4715]: I1009 08:21:56.867960 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhz4v\" (UniqueName: \"kubernetes.io/projected/cc4c8ed2-08c6-42f7-a19d-226fc98143f6-kube-api-access-nhz4v\") pod \"cc4c8ed2-08c6-42f7-a19d-226fc98143f6\" (UID: \"cc4c8ed2-08c6-42f7-a19d-226fc98143f6\") " Oct 09 08:21:56 crc kubenswrapper[4715]: I1009 08:21:56.868092 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc4c8ed2-08c6-42f7-a19d-226fc98143f6-catalog-content\") pod \"cc4c8ed2-08c6-42f7-a19d-226fc98143f6\" (UID: \"cc4c8ed2-08c6-42f7-a19d-226fc98143f6\") " Oct 09 08:21:56 crc kubenswrapper[4715]: I1009 08:21:56.868325 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc4c8ed2-08c6-42f7-a19d-226fc98143f6-utilities" (OuterVolumeSpecName: "utilities") pod "cc4c8ed2-08c6-42f7-a19d-226fc98143f6" (UID: "cc4c8ed2-08c6-42f7-a19d-226fc98143f6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:21:56 crc kubenswrapper[4715]: I1009 08:21:56.868560 4715 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc4c8ed2-08c6-42f7-a19d-226fc98143f6-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 08:21:56 crc kubenswrapper[4715]: I1009 08:21:56.875212 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc4c8ed2-08c6-42f7-a19d-226fc98143f6-kube-api-access-nhz4v" (OuterVolumeSpecName: "kube-api-access-nhz4v") pod "cc4c8ed2-08c6-42f7-a19d-226fc98143f6" (UID: "cc4c8ed2-08c6-42f7-a19d-226fc98143f6"). InnerVolumeSpecName "kube-api-access-nhz4v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:21:56 crc kubenswrapper[4715]: I1009 08:21:56.886729 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc4c8ed2-08c6-42f7-a19d-226fc98143f6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc4c8ed2-08c6-42f7-a19d-226fc98143f6" (UID: "cc4c8ed2-08c6-42f7-a19d-226fc98143f6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:21:56 crc kubenswrapper[4715]: I1009 08:21:56.970844 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhz4v\" (UniqueName: \"kubernetes.io/projected/cc4c8ed2-08c6-42f7-a19d-226fc98143f6-kube-api-access-nhz4v\") on node \"crc\" DevicePath \"\"" Oct 09 08:21:56 crc kubenswrapper[4715]: I1009 08:21:56.970905 4715 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc4c8ed2-08c6-42f7-a19d-226fc98143f6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 08:21:57 crc kubenswrapper[4715]: I1009 08:21:57.359081 4715 generic.go:334] "Generic (PLEG): container finished" podID="cc4c8ed2-08c6-42f7-a19d-226fc98143f6" containerID="d2edcad3a91ac525b9c001e52ba2a316123e053e27e90eb9953c2b42d2e69138" exitCode=0 Oct 09 08:21:57 crc kubenswrapper[4715]: I1009 08:21:57.359154 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w95fb" event={"ID":"cc4c8ed2-08c6-42f7-a19d-226fc98143f6","Type":"ContainerDied","Data":"d2edcad3a91ac525b9c001e52ba2a316123e053e27e90eb9953c2b42d2e69138"} Oct 09 08:21:57 crc kubenswrapper[4715]: I1009 08:21:57.359186 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w95fb" Oct 09 08:21:57 crc kubenswrapper[4715]: I1009 08:21:57.360241 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w95fb" event={"ID":"cc4c8ed2-08c6-42f7-a19d-226fc98143f6","Type":"ContainerDied","Data":"656f563c5732cccc940a1167aeac944e2aa79052be16ba728c697182c48b2897"} Oct 09 08:21:57 crc kubenswrapper[4715]: I1009 08:21:57.360322 4715 scope.go:117] "RemoveContainer" containerID="d2edcad3a91ac525b9c001e52ba2a316123e053e27e90eb9953c2b42d2e69138" Oct 09 08:21:57 crc kubenswrapper[4715]: I1009 08:21:57.381740 4715 scope.go:117] "RemoveContainer" containerID="521133e16f5c05e61ab7ee2744ab641847f07a3bf938158d3ce533d080287d62" Oct 09 08:21:57 crc kubenswrapper[4715]: I1009 08:21:57.398566 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w95fb"] Oct 09 08:21:57 crc kubenswrapper[4715]: I1009 08:21:57.406980 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-w95fb"] Oct 09 08:21:57 crc kubenswrapper[4715]: I1009 08:21:57.409770 4715 scope.go:117] "RemoveContainer" containerID="c5e819cf6ad802e0fa2f833be004fdd052a08af2733c1d3d9d1f821d2b9cb5fb" Oct 09 08:21:57 crc kubenswrapper[4715]: I1009 08:21:57.451346 4715 scope.go:117] "RemoveContainer" containerID="d2edcad3a91ac525b9c001e52ba2a316123e053e27e90eb9953c2b42d2e69138" Oct 09 08:21:57 crc kubenswrapper[4715]: E1009 08:21:57.451932 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2edcad3a91ac525b9c001e52ba2a316123e053e27e90eb9953c2b42d2e69138\": container with ID starting with d2edcad3a91ac525b9c001e52ba2a316123e053e27e90eb9953c2b42d2e69138 not found: ID does not exist" containerID="d2edcad3a91ac525b9c001e52ba2a316123e053e27e90eb9953c2b42d2e69138" Oct 09 08:21:57 crc kubenswrapper[4715]: I1009 08:21:57.451973 4715 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2edcad3a91ac525b9c001e52ba2a316123e053e27e90eb9953c2b42d2e69138"} err="failed to get container status \"d2edcad3a91ac525b9c001e52ba2a316123e053e27e90eb9953c2b42d2e69138\": rpc error: code = NotFound desc = could not find container \"d2edcad3a91ac525b9c001e52ba2a316123e053e27e90eb9953c2b42d2e69138\": container with ID starting with d2edcad3a91ac525b9c001e52ba2a316123e053e27e90eb9953c2b42d2e69138 not found: ID does not exist" Oct 09 08:21:57 crc kubenswrapper[4715]: I1009 08:21:57.452001 4715 scope.go:117] "RemoveContainer" containerID="521133e16f5c05e61ab7ee2744ab641847f07a3bf938158d3ce533d080287d62" Oct 09 08:21:57 crc kubenswrapper[4715]: E1009 08:21:57.452287 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"521133e16f5c05e61ab7ee2744ab641847f07a3bf938158d3ce533d080287d62\": container with ID starting with 521133e16f5c05e61ab7ee2744ab641847f07a3bf938158d3ce533d080287d62 not found: ID does not exist" containerID="521133e16f5c05e61ab7ee2744ab641847f07a3bf938158d3ce533d080287d62" Oct 09 08:21:57 crc kubenswrapper[4715]: I1009 08:21:57.452314 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"521133e16f5c05e61ab7ee2744ab641847f07a3bf938158d3ce533d080287d62"} err="failed to get container status \"521133e16f5c05e61ab7ee2744ab641847f07a3bf938158d3ce533d080287d62\": rpc error: code = NotFound desc = could not find container \"521133e16f5c05e61ab7ee2744ab641847f07a3bf938158d3ce533d080287d62\": container with ID starting with 521133e16f5c05e61ab7ee2744ab641847f07a3bf938158d3ce533d080287d62 not found: ID does not exist" Oct 09 08:21:57 crc kubenswrapper[4715]: I1009 08:21:57.452332 4715 scope.go:117] "RemoveContainer" containerID="c5e819cf6ad802e0fa2f833be004fdd052a08af2733c1d3d9d1f821d2b9cb5fb" Oct 09 08:21:57 crc kubenswrapper[4715]: E1009 
08:21:57.452614 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5e819cf6ad802e0fa2f833be004fdd052a08af2733c1d3d9d1f821d2b9cb5fb\": container with ID starting with c5e819cf6ad802e0fa2f833be004fdd052a08af2733c1d3d9d1f821d2b9cb5fb not found: ID does not exist" containerID="c5e819cf6ad802e0fa2f833be004fdd052a08af2733c1d3d9d1f821d2b9cb5fb" Oct 09 08:21:57 crc kubenswrapper[4715]: I1009 08:21:57.452657 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5e819cf6ad802e0fa2f833be004fdd052a08af2733c1d3d9d1f821d2b9cb5fb"} err="failed to get container status \"c5e819cf6ad802e0fa2f833be004fdd052a08af2733c1d3d9d1f821d2b9cb5fb\": rpc error: code = NotFound desc = could not find container \"c5e819cf6ad802e0fa2f833be004fdd052a08af2733c1d3d9d1f821d2b9cb5fb\": container with ID starting with c5e819cf6ad802e0fa2f833be004fdd052a08af2733c1d3d9d1f821d2b9cb5fb not found: ID does not exist" Oct 09 08:21:58 crc kubenswrapper[4715]: I1009 08:21:58.149736 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc4c8ed2-08c6-42f7-a19d-226fc98143f6" path="/var/lib/kubelet/pods/cc4c8ed2-08c6-42f7-a19d-226fc98143f6/volumes" Oct 09 08:22:00 crc kubenswrapper[4715]: I1009 08:22:00.256503 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-crx79" Oct 09 08:22:00 crc kubenswrapper[4715]: I1009 08:22:00.257722 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-crx79" Oct 09 08:22:00 crc kubenswrapper[4715]: I1009 08:22:00.316973 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-crx79" Oct 09 08:22:00 crc kubenswrapper[4715]: I1009 08:22:00.453320 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-crx79" Oct 09 08:22:01 crc kubenswrapper[4715]: I1009 08:22:01.319430 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-crx79"] Oct 09 08:22:02 crc kubenswrapper[4715]: I1009 08:22:02.405666 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-crx79" podUID="40273519-8a0a-441d-a416-78e0c824eb04" containerName="registry-server" containerID="cri-o://efdd424018b65e4caa9242f866df652d55c89ba65cfdfbbfd14dd4ff64231094" gracePeriod=2 Oct 09 08:22:02 crc kubenswrapper[4715]: I1009 08:22:02.828776 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-crx79" Oct 09 08:22:02 crc kubenswrapper[4715]: I1009 08:22:02.876764 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrxl2\" (UniqueName: \"kubernetes.io/projected/40273519-8a0a-441d-a416-78e0c824eb04-kube-api-access-jrxl2\") pod \"40273519-8a0a-441d-a416-78e0c824eb04\" (UID: \"40273519-8a0a-441d-a416-78e0c824eb04\") " Oct 09 08:22:02 crc kubenswrapper[4715]: I1009 08:22:02.876842 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40273519-8a0a-441d-a416-78e0c824eb04-catalog-content\") pod \"40273519-8a0a-441d-a416-78e0c824eb04\" (UID: \"40273519-8a0a-441d-a416-78e0c824eb04\") " Oct 09 08:22:02 crc kubenswrapper[4715]: I1009 08:22:02.876899 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40273519-8a0a-441d-a416-78e0c824eb04-utilities\") pod \"40273519-8a0a-441d-a416-78e0c824eb04\" (UID: \"40273519-8a0a-441d-a416-78e0c824eb04\") " Oct 09 08:22:02 crc kubenswrapper[4715]: I1009 08:22:02.878356 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/40273519-8a0a-441d-a416-78e0c824eb04-utilities" (OuterVolumeSpecName: "utilities") pod "40273519-8a0a-441d-a416-78e0c824eb04" (UID: "40273519-8a0a-441d-a416-78e0c824eb04"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:22:02 crc kubenswrapper[4715]: I1009 08:22:02.901974 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40273519-8a0a-441d-a416-78e0c824eb04-kube-api-access-jrxl2" (OuterVolumeSpecName: "kube-api-access-jrxl2") pod "40273519-8a0a-441d-a416-78e0c824eb04" (UID: "40273519-8a0a-441d-a416-78e0c824eb04"). InnerVolumeSpecName "kube-api-access-jrxl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:22:02 crc kubenswrapper[4715]: I1009 08:22:02.958097 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40273519-8a0a-441d-a416-78e0c824eb04-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "40273519-8a0a-441d-a416-78e0c824eb04" (UID: "40273519-8a0a-441d-a416-78e0c824eb04"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:22:02 crc kubenswrapper[4715]: I1009 08:22:02.978620 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrxl2\" (UniqueName: \"kubernetes.io/projected/40273519-8a0a-441d-a416-78e0c824eb04-kube-api-access-jrxl2\") on node \"crc\" DevicePath \"\"" Oct 09 08:22:02 crc kubenswrapper[4715]: I1009 08:22:02.978657 4715 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40273519-8a0a-441d-a416-78e0c824eb04-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 08:22:02 crc kubenswrapper[4715]: I1009 08:22:02.978667 4715 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40273519-8a0a-441d-a416-78e0c824eb04-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 08:22:03 crc kubenswrapper[4715]: I1009 08:22:03.423980 4715 generic.go:334] "Generic (PLEG): container finished" podID="40273519-8a0a-441d-a416-78e0c824eb04" containerID="efdd424018b65e4caa9242f866df652d55c89ba65cfdfbbfd14dd4ff64231094" exitCode=0 Oct 09 08:22:03 crc kubenswrapper[4715]: I1009 08:22:03.424031 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-crx79" event={"ID":"40273519-8a0a-441d-a416-78e0c824eb04","Type":"ContainerDied","Data":"efdd424018b65e4caa9242f866df652d55c89ba65cfdfbbfd14dd4ff64231094"} Oct 09 08:22:03 crc kubenswrapper[4715]: I1009 08:22:03.424066 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-crx79" event={"ID":"40273519-8a0a-441d-a416-78e0c824eb04","Type":"ContainerDied","Data":"c7bd750d2484c5019b6e09385751e5062e062a19c9d09fbf00a8566565d0ab98"} Oct 09 08:22:03 crc kubenswrapper[4715]: I1009 08:22:03.424089 4715 scope.go:117] "RemoveContainer" containerID="efdd424018b65e4caa9242f866df652d55c89ba65cfdfbbfd14dd4ff64231094" Oct 09 08:22:03 crc kubenswrapper[4715]: I1009 08:22:03.424246 
4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-crx79" Oct 09 08:22:03 crc kubenswrapper[4715]: I1009 08:22:03.459811 4715 scope.go:117] "RemoveContainer" containerID="273905187cb708c3908e382cc8ec5d821eb3ff3306ed4a046e6ce50eede5a62e" Oct 09 08:22:03 crc kubenswrapper[4715]: I1009 08:22:03.467828 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-crx79"] Oct 09 08:22:03 crc kubenswrapper[4715]: I1009 08:22:03.485617 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-crx79"] Oct 09 08:22:03 crc kubenswrapper[4715]: I1009 08:22:03.489299 4715 scope.go:117] "RemoveContainer" containerID="231dc6e709138951badeef8772ecb98b8707c9beecfd6a863246144b07d1b3d5" Oct 09 08:22:03 crc kubenswrapper[4715]: I1009 08:22:03.532332 4715 scope.go:117] "RemoveContainer" containerID="efdd424018b65e4caa9242f866df652d55c89ba65cfdfbbfd14dd4ff64231094" Oct 09 08:22:03 crc kubenswrapper[4715]: E1009 08:22:03.532861 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efdd424018b65e4caa9242f866df652d55c89ba65cfdfbbfd14dd4ff64231094\": container with ID starting with efdd424018b65e4caa9242f866df652d55c89ba65cfdfbbfd14dd4ff64231094 not found: ID does not exist" containerID="efdd424018b65e4caa9242f866df652d55c89ba65cfdfbbfd14dd4ff64231094" Oct 09 08:22:03 crc kubenswrapper[4715]: I1009 08:22:03.532889 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efdd424018b65e4caa9242f866df652d55c89ba65cfdfbbfd14dd4ff64231094"} err="failed to get container status \"efdd424018b65e4caa9242f866df652d55c89ba65cfdfbbfd14dd4ff64231094\": rpc error: code = NotFound desc = could not find container \"efdd424018b65e4caa9242f866df652d55c89ba65cfdfbbfd14dd4ff64231094\": container with ID starting with 
efdd424018b65e4caa9242f866df652d55c89ba65cfdfbbfd14dd4ff64231094 not found: ID does not exist" Oct 09 08:22:03 crc kubenswrapper[4715]: I1009 08:22:03.532930 4715 scope.go:117] "RemoveContainer" containerID="273905187cb708c3908e382cc8ec5d821eb3ff3306ed4a046e6ce50eede5a62e" Oct 09 08:22:03 crc kubenswrapper[4715]: E1009 08:22:03.533345 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"273905187cb708c3908e382cc8ec5d821eb3ff3306ed4a046e6ce50eede5a62e\": container with ID starting with 273905187cb708c3908e382cc8ec5d821eb3ff3306ed4a046e6ce50eede5a62e not found: ID does not exist" containerID="273905187cb708c3908e382cc8ec5d821eb3ff3306ed4a046e6ce50eede5a62e" Oct 09 08:22:03 crc kubenswrapper[4715]: I1009 08:22:03.533376 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"273905187cb708c3908e382cc8ec5d821eb3ff3306ed4a046e6ce50eede5a62e"} err="failed to get container status \"273905187cb708c3908e382cc8ec5d821eb3ff3306ed4a046e6ce50eede5a62e\": rpc error: code = NotFound desc = could not find container \"273905187cb708c3908e382cc8ec5d821eb3ff3306ed4a046e6ce50eede5a62e\": container with ID starting with 273905187cb708c3908e382cc8ec5d821eb3ff3306ed4a046e6ce50eede5a62e not found: ID does not exist" Oct 09 08:22:03 crc kubenswrapper[4715]: I1009 08:22:03.533392 4715 scope.go:117] "RemoveContainer" containerID="231dc6e709138951badeef8772ecb98b8707c9beecfd6a863246144b07d1b3d5" Oct 09 08:22:03 crc kubenswrapper[4715]: E1009 08:22:03.533721 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"231dc6e709138951badeef8772ecb98b8707c9beecfd6a863246144b07d1b3d5\": container with ID starting with 231dc6e709138951badeef8772ecb98b8707c9beecfd6a863246144b07d1b3d5 not found: ID does not exist" containerID="231dc6e709138951badeef8772ecb98b8707c9beecfd6a863246144b07d1b3d5" Oct 09 08:22:03 crc 
kubenswrapper[4715]: I1009 08:22:03.533743 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"231dc6e709138951badeef8772ecb98b8707c9beecfd6a863246144b07d1b3d5"} err="failed to get container status \"231dc6e709138951badeef8772ecb98b8707c9beecfd6a863246144b07d1b3d5\": rpc error: code = NotFound desc = could not find container \"231dc6e709138951badeef8772ecb98b8707c9beecfd6a863246144b07d1b3d5\": container with ID starting with 231dc6e709138951badeef8772ecb98b8707c9beecfd6a863246144b07d1b3d5 not found: ID does not exist" Oct 09 08:22:03 crc kubenswrapper[4715]: E1009 08:22:03.548162 4715 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40273519_8a0a_441d_a416_78e0c824eb04.slice\": RecentStats: unable to find data in memory cache]" Oct 09 08:22:04 crc kubenswrapper[4715]: I1009 08:22:04.157715 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40273519-8a0a-441d-a416-78e0c824eb04" path="/var/lib/kubelet/pods/40273519-8a0a-441d-a416-78e0c824eb04/volumes" Oct 09 08:22:16 crc kubenswrapper[4715]: I1009 08:22:16.754082 4715 patch_prober.go:28] interesting pod/machine-config-daemon-k7vwx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 08:22:16 crc kubenswrapper[4715]: I1009 08:22:16.755505 4715 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 08:22:16 crc kubenswrapper[4715]: I1009 08:22:16.755556 4715 kubelet.go:2542] 
"SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" Oct 09 08:22:16 crc kubenswrapper[4715]: I1009 08:22:16.756289 4715 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"015af87f013fa22d235ca7ed867a3852e276c3f820b05ec66959a6cef5662490"} pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 08:22:16 crc kubenswrapper[4715]: I1009 08:22:16.756349 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" containerID="cri-o://015af87f013fa22d235ca7ed867a3852e276c3f820b05ec66959a6cef5662490" gracePeriod=600 Oct 09 08:22:17 crc kubenswrapper[4715]: I1009 08:22:17.560073 4715 generic.go:334] "Generic (PLEG): container finished" podID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerID="015af87f013fa22d235ca7ed867a3852e276c3f820b05ec66959a6cef5662490" exitCode=0 Oct 09 08:22:17 crc kubenswrapper[4715]: I1009 08:22:17.560185 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" event={"ID":"acafd807-8875-4b4f-aba9-4f807ca336e7","Type":"ContainerDied","Data":"015af87f013fa22d235ca7ed867a3852e276c3f820b05ec66959a6cef5662490"} Oct 09 08:22:17 crc kubenswrapper[4715]: I1009 08:22:17.560621 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" event={"ID":"acafd807-8875-4b4f-aba9-4f807ca336e7","Type":"ContainerStarted","Data":"9e11f343d65eb6f97a170083c57eca61210ee13b07c89aa31cd70c6848304c28"} Oct 09 08:22:17 crc kubenswrapper[4715]: I1009 08:22:17.560643 4715 scope.go:117] "RemoveContainer" 
containerID="603ae8e76a989f73d5fee395ce2ebf8db256706e70cbcec215884dc9ad047a0c" Oct 09 08:24:46 crc kubenswrapper[4715]: I1009 08:24:46.754205 4715 patch_prober.go:28] interesting pod/machine-config-daemon-k7vwx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 08:24:46 crc kubenswrapper[4715]: I1009 08:24:46.755152 4715 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 08:24:56 crc kubenswrapper[4715]: I1009 08:24:56.030651 4715 generic.go:334] "Generic (PLEG): container finished" podID="767ac586-f48e-410c-a5bb-589eccbef2c8" containerID="c69e342300596bd0764371bb74d9877df29bdfff90d5a0394d013ea5cdd58ce5" exitCode=0 Oct 09 08:24:56 crc kubenswrapper[4715]: I1009 08:24:56.030740 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-whz64" event={"ID":"767ac586-f48e-410c-a5bb-589eccbef2c8","Type":"ContainerDied","Data":"c69e342300596bd0764371bb74d9877df29bdfff90d5a0394d013ea5cdd58ce5"} Oct 09 08:24:57 crc kubenswrapper[4715]: I1009 08:24:57.570525 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-whz64" Oct 09 08:24:57 crc kubenswrapper[4715]: I1009 08:24:57.643767 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/767ac586-f48e-410c-a5bb-589eccbef2c8-ssh-key\") pod \"767ac586-f48e-410c-a5bb-589eccbef2c8\" (UID: \"767ac586-f48e-410c-a5bb-589eccbef2c8\") " Oct 09 08:24:57 crc kubenswrapper[4715]: I1009 08:24:57.643825 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/767ac586-f48e-410c-a5bb-589eccbef2c8-libvirt-secret-0\") pod \"767ac586-f48e-410c-a5bb-589eccbef2c8\" (UID: \"767ac586-f48e-410c-a5bb-589eccbef2c8\") " Oct 09 08:24:57 crc kubenswrapper[4715]: I1009 08:24:57.643867 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/767ac586-f48e-410c-a5bb-589eccbef2c8-inventory\") pod \"767ac586-f48e-410c-a5bb-589eccbef2c8\" (UID: \"767ac586-f48e-410c-a5bb-589eccbef2c8\") " Oct 09 08:24:57 crc kubenswrapper[4715]: I1009 08:24:57.643952 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z68dt\" (UniqueName: \"kubernetes.io/projected/767ac586-f48e-410c-a5bb-589eccbef2c8-kube-api-access-z68dt\") pod \"767ac586-f48e-410c-a5bb-589eccbef2c8\" (UID: \"767ac586-f48e-410c-a5bb-589eccbef2c8\") " Oct 09 08:24:57 crc kubenswrapper[4715]: I1009 08:24:57.644033 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/767ac586-f48e-410c-a5bb-589eccbef2c8-libvirt-combined-ca-bundle\") pod \"767ac586-f48e-410c-a5bb-589eccbef2c8\" (UID: \"767ac586-f48e-410c-a5bb-589eccbef2c8\") " Oct 09 08:24:57 crc kubenswrapper[4715]: I1009 08:24:57.650634 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/767ac586-f48e-410c-a5bb-589eccbef2c8-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "767ac586-f48e-410c-a5bb-589eccbef2c8" (UID: "767ac586-f48e-410c-a5bb-589eccbef2c8"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:24:57 crc kubenswrapper[4715]: I1009 08:24:57.662999 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/767ac586-f48e-410c-a5bb-589eccbef2c8-kube-api-access-z68dt" (OuterVolumeSpecName: "kube-api-access-z68dt") pod "767ac586-f48e-410c-a5bb-589eccbef2c8" (UID: "767ac586-f48e-410c-a5bb-589eccbef2c8"). InnerVolumeSpecName "kube-api-access-z68dt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:24:57 crc kubenswrapper[4715]: I1009 08:24:57.673495 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/767ac586-f48e-410c-a5bb-589eccbef2c8-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "767ac586-f48e-410c-a5bb-589eccbef2c8" (UID: "767ac586-f48e-410c-a5bb-589eccbef2c8"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:24:57 crc kubenswrapper[4715]: I1009 08:24:57.680380 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/767ac586-f48e-410c-a5bb-589eccbef2c8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "767ac586-f48e-410c-a5bb-589eccbef2c8" (UID: "767ac586-f48e-410c-a5bb-589eccbef2c8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:24:57 crc kubenswrapper[4715]: I1009 08:24:57.691162 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/767ac586-f48e-410c-a5bb-589eccbef2c8-inventory" (OuterVolumeSpecName: "inventory") pod "767ac586-f48e-410c-a5bb-589eccbef2c8" (UID: "767ac586-f48e-410c-a5bb-589eccbef2c8"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:24:57 crc kubenswrapper[4715]: I1009 08:24:57.746369 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z68dt\" (UniqueName: \"kubernetes.io/projected/767ac586-f48e-410c-a5bb-589eccbef2c8-kube-api-access-z68dt\") on node \"crc\" DevicePath \"\"" Oct 09 08:24:57 crc kubenswrapper[4715]: I1009 08:24:57.746406 4715 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/767ac586-f48e-410c-a5bb-589eccbef2c8-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:24:57 crc kubenswrapper[4715]: I1009 08:24:57.746429 4715 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/767ac586-f48e-410c-a5bb-589eccbef2c8-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 08:24:57 crc kubenswrapper[4715]: I1009 08:24:57.746475 4715 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/767ac586-f48e-410c-a5bb-589eccbef2c8-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 09 08:24:57 crc kubenswrapper[4715]: I1009 08:24:57.746483 4715 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/767ac586-f48e-410c-a5bb-589eccbef2c8-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.052898 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-whz64" event={"ID":"767ac586-f48e-410c-a5bb-589eccbef2c8","Type":"ContainerDied","Data":"a207504d761a2cfc989318d196d3489acac9c85dec6864c524cc6d57728bd916"} Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.052941 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a207504d761a2cfc989318d196d3489acac9c85dec6864c524cc6d57728bd916" Oct 09 08:24:58 
crc kubenswrapper[4715]: I1009 08:24:58.052964 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-whz64" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.148943 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-sb8x7"] Oct 09 08:24:58 crc kubenswrapper[4715]: E1009 08:24:58.149280 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40273519-8a0a-441d-a416-78e0c824eb04" containerName="extract-utilities" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.149306 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="40273519-8a0a-441d-a416-78e0c824eb04" containerName="extract-utilities" Oct 09 08:24:58 crc kubenswrapper[4715]: E1009 08:24:58.149323 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc4c8ed2-08c6-42f7-a19d-226fc98143f6" containerName="extract-utilities" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.149330 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc4c8ed2-08c6-42f7-a19d-226fc98143f6" containerName="extract-utilities" Oct 09 08:24:58 crc kubenswrapper[4715]: E1009 08:24:58.149339 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40273519-8a0a-441d-a416-78e0c824eb04" containerName="extract-content" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.149345 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="40273519-8a0a-441d-a416-78e0c824eb04" containerName="extract-content" Oct 09 08:24:58 crc kubenswrapper[4715]: E1009 08:24:58.149353 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40273519-8a0a-441d-a416-78e0c824eb04" containerName="registry-server" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.149360 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="40273519-8a0a-441d-a416-78e0c824eb04" containerName="registry-server" Oct 09 08:24:58 crc 
kubenswrapper[4715]: E1009 08:24:58.149381 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc4c8ed2-08c6-42f7-a19d-226fc98143f6" containerName="registry-server" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.149387 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc4c8ed2-08c6-42f7-a19d-226fc98143f6" containerName="registry-server" Oct 09 08:24:58 crc kubenswrapper[4715]: E1009 08:24:58.149409 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="767ac586-f48e-410c-a5bb-589eccbef2c8" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.149456 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="767ac586-f48e-410c-a5bb-589eccbef2c8" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 09 08:24:58 crc kubenswrapper[4715]: E1009 08:24:58.149473 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc4c8ed2-08c6-42f7-a19d-226fc98143f6" containerName="extract-content" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.149478 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc4c8ed2-08c6-42f7-a19d-226fc98143f6" containerName="extract-content" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.149649 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc4c8ed2-08c6-42f7-a19d-226fc98143f6" containerName="registry-server" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.149658 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="40273519-8a0a-441d-a416-78e0c824eb04" containerName="registry-server" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.149671 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="767ac586-f48e-410c-a5bb-589eccbef2c8" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.150335 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sb8x7" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.153051 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.153150 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.153513 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fjb" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.154781 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.161549 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-sb8x7"] Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.168533 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.168769 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.172866 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.253983 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7de82685-bfc0-41e5-81db-cadda0dc8d65-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sb8x7\" (UID: \"7de82685-bfc0-41e5-81db-cadda0dc8d65\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sb8x7" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 
08:24:58.254036 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7de82685-bfc0-41e5-81db-cadda0dc8d65-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sb8x7\" (UID: \"7de82685-bfc0-41e5-81db-cadda0dc8d65\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sb8x7" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.254214 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7de82685-bfc0-41e5-81db-cadda0dc8d65-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sb8x7\" (UID: \"7de82685-bfc0-41e5-81db-cadda0dc8d65\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sb8x7" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.254445 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7de82685-bfc0-41e5-81db-cadda0dc8d65-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sb8x7\" (UID: \"7de82685-bfc0-41e5-81db-cadda0dc8d65\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sb8x7" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.254496 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dznzg\" (UniqueName: \"kubernetes.io/projected/7de82685-bfc0-41e5-81db-cadda0dc8d65-kube-api-access-dznzg\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sb8x7\" (UID: \"7de82685-bfc0-41e5-81db-cadda0dc8d65\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sb8x7" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.254638 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/7de82685-bfc0-41e5-81db-cadda0dc8d65-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sb8x7\" (UID: \"7de82685-bfc0-41e5-81db-cadda0dc8d65\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sb8x7" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.254716 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7de82685-bfc0-41e5-81db-cadda0dc8d65-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sb8x7\" (UID: \"7de82685-bfc0-41e5-81db-cadda0dc8d65\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sb8x7" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.254764 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7de82685-bfc0-41e5-81db-cadda0dc8d65-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sb8x7\" (UID: \"7de82685-bfc0-41e5-81db-cadda0dc8d65\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sb8x7" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.254835 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7de82685-bfc0-41e5-81db-cadda0dc8d65-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sb8x7\" (UID: \"7de82685-bfc0-41e5-81db-cadda0dc8d65\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sb8x7" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.357583 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7de82685-bfc0-41e5-81db-cadda0dc8d65-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sb8x7\" (UID: \"7de82685-bfc0-41e5-81db-cadda0dc8d65\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sb8x7" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.357717 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7de82685-bfc0-41e5-81db-cadda0dc8d65-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sb8x7\" (UID: \"7de82685-bfc0-41e5-81db-cadda0dc8d65\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sb8x7" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.357778 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7de82685-bfc0-41e5-81db-cadda0dc8d65-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sb8x7\" (UID: \"7de82685-bfc0-41e5-81db-cadda0dc8d65\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sb8x7" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.357881 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7de82685-bfc0-41e5-81db-cadda0dc8d65-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sb8x7\" (UID: \"7de82685-bfc0-41e5-81db-cadda0dc8d65\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sb8x7" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.357990 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7de82685-bfc0-41e5-81db-cadda0dc8d65-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sb8x7\" (UID: \"7de82685-bfc0-41e5-81db-cadda0dc8d65\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sb8x7" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.358039 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/7de82685-bfc0-41e5-81db-cadda0dc8d65-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sb8x7\" (UID: \"7de82685-bfc0-41e5-81db-cadda0dc8d65\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sb8x7" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.358082 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7de82685-bfc0-41e5-81db-cadda0dc8d65-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sb8x7\" (UID: \"7de82685-bfc0-41e5-81db-cadda0dc8d65\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sb8x7" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.358146 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7de82685-bfc0-41e5-81db-cadda0dc8d65-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sb8x7\" (UID: \"7de82685-bfc0-41e5-81db-cadda0dc8d65\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sb8x7" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.358178 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dznzg\" (UniqueName: \"kubernetes.io/projected/7de82685-bfc0-41e5-81db-cadda0dc8d65-kube-api-access-dznzg\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sb8x7\" (UID: \"7de82685-bfc0-41e5-81db-cadda0dc8d65\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sb8x7" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.359869 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7de82685-bfc0-41e5-81db-cadda0dc8d65-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sb8x7\" (UID: \"7de82685-bfc0-41e5-81db-cadda0dc8d65\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sb8x7" 
Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.364395 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7de82685-bfc0-41e5-81db-cadda0dc8d65-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sb8x7\" (UID: \"7de82685-bfc0-41e5-81db-cadda0dc8d65\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sb8x7" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.364580 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7de82685-bfc0-41e5-81db-cadda0dc8d65-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sb8x7\" (UID: \"7de82685-bfc0-41e5-81db-cadda0dc8d65\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sb8x7" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.364853 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7de82685-bfc0-41e5-81db-cadda0dc8d65-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sb8x7\" (UID: \"7de82685-bfc0-41e5-81db-cadda0dc8d65\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sb8x7" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.365462 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7de82685-bfc0-41e5-81db-cadda0dc8d65-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sb8x7\" (UID: \"7de82685-bfc0-41e5-81db-cadda0dc8d65\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sb8x7" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.367389 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/7de82685-bfc0-41e5-81db-cadda0dc8d65-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sb8x7\" (UID: \"7de82685-bfc0-41e5-81db-cadda0dc8d65\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sb8x7" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.367786 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7de82685-bfc0-41e5-81db-cadda0dc8d65-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sb8x7\" (UID: \"7de82685-bfc0-41e5-81db-cadda0dc8d65\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sb8x7" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.369394 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7de82685-bfc0-41e5-81db-cadda0dc8d65-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sb8x7\" (UID: \"7de82685-bfc0-41e5-81db-cadda0dc8d65\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sb8x7" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.385902 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dznzg\" (UniqueName: \"kubernetes.io/projected/7de82685-bfc0-41e5-81db-cadda0dc8d65-kube-api-access-dznzg\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sb8x7\" (UID: \"7de82685-bfc0-41e5-81db-cadda0dc8d65\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sb8x7" Oct 09 08:24:58 crc kubenswrapper[4715]: I1009 08:24:58.467002 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sb8x7" Oct 09 08:24:59 crc kubenswrapper[4715]: I1009 08:24:59.010733 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-sb8x7"] Oct 09 08:24:59 crc kubenswrapper[4715]: W1009 08:24:59.016528 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7de82685_bfc0_41e5_81db_cadda0dc8d65.slice/crio-0039c767d2b9bb98129597b10dea499e4bd44df371dedb310ec858f28cfae8f8 WatchSource:0}: Error finding container 0039c767d2b9bb98129597b10dea499e4bd44df371dedb310ec858f28cfae8f8: Status 404 returned error can't find the container with id 0039c767d2b9bb98129597b10dea499e4bd44df371dedb310ec858f28cfae8f8 Oct 09 08:24:59 crc kubenswrapper[4715]: I1009 08:24:59.019478 4715 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 08:24:59 crc kubenswrapper[4715]: I1009 08:24:59.062310 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sb8x7" event={"ID":"7de82685-bfc0-41e5-81db-cadda0dc8d65","Type":"ContainerStarted","Data":"0039c767d2b9bb98129597b10dea499e4bd44df371dedb310ec858f28cfae8f8"} Oct 09 08:25:00 crc kubenswrapper[4715]: I1009 08:25:00.078410 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sb8x7" event={"ID":"7de82685-bfc0-41e5-81db-cadda0dc8d65","Type":"ContainerStarted","Data":"7aa25e16d70122d165e641ea0805e26ced4a4ab2fd12b57b2f19057499bce748"} Oct 09 08:25:00 crc kubenswrapper[4715]: I1009 08:25:00.106609 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sb8x7" podStartSLOduration=1.701174021 podStartE2EDuration="2.106588473s" podCreationTimestamp="2025-10-09 08:24:58 +0000 UTC" firstStartedPulling="2025-10-09 
08:24:59.019169833 +0000 UTC m=+2329.711973841" lastFinishedPulling="2025-10-09 08:24:59.424584285 +0000 UTC m=+2330.117388293" observedRunningTime="2025-10-09 08:25:00.098891693 +0000 UTC m=+2330.791695711" watchObservedRunningTime="2025-10-09 08:25:00.106588473 +0000 UTC m=+2330.799392481" Oct 09 08:25:07 crc kubenswrapper[4715]: E1009 08:25:07.975088 4715 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/rpm-ostreed.service\": RecentStats: unable to find data in memory cache]" Oct 09 08:25:16 crc kubenswrapper[4715]: I1009 08:25:16.753867 4715 patch_prober.go:28] interesting pod/machine-config-daemon-k7vwx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 08:25:16 crc kubenswrapper[4715]: I1009 08:25:16.754328 4715 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 08:25:46 crc kubenswrapper[4715]: I1009 08:25:46.753935 4715 patch_prober.go:28] interesting pod/machine-config-daemon-k7vwx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 08:25:46 crc kubenswrapper[4715]: I1009 08:25:46.754539 4715 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 08:25:46 crc kubenswrapper[4715]: I1009 08:25:46.754589 4715 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" Oct 09 08:25:46 crc kubenswrapper[4715]: I1009 08:25:46.755559 4715 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9e11f343d65eb6f97a170083c57eca61210ee13b07c89aa31cd70c6848304c28"} pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 08:25:46 crc kubenswrapper[4715]: I1009 08:25:46.755639 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" containerID="cri-o://9e11f343d65eb6f97a170083c57eca61210ee13b07c89aa31cd70c6848304c28" gracePeriod=600 Oct 09 08:25:46 crc kubenswrapper[4715]: E1009 08:25:46.890144 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:25:47 crc kubenswrapper[4715]: I1009 08:25:47.526706 4715 generic.go:334] "Generic (PLEG): container finished" podID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerID="9e11f343d65eb6f97a170083c57eca61210ee13b07c89aa31cd70c6848304c28" exitCode=0 Oct 09 08:25:47 crc kubenswrapper[4715]: I1009 08:25:47.526790 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" event={"ID":"acafd807-8875-4b4f-aba9-4f807ca336e7","Type":"ContainerDied","Data":"9e11f343d65eb6f97a170083c57eca61210ee13b07c89aa31cd70c6848304c28"} Oct 09 08:25:47 crc kubenswrapper[4715]: I1009 08:25:47.527092 4715 scope.go:117] "RemoveContainer" containerID="015af87f013fa22d235ca7ed867a3852e276c3f820b05ec66959a6cef5662490" Oct 09 08:25:47 crc kubenswrapper[4715]: I1009 08:25:47.527787 4715 scope.go:117] "RemoveContainer" containerID="9e11f343d65eb6f97a170083c57eca61210ee13b07c89aa31cd70c6848304c28" Oct 09 08:25:47 crc kubenswrapper[4715]: E1009 08:25:47.528090 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:25:59 crc kubenswrapper[4715]: I1009 08:25:59.136893 4715 scope.go:117] "RemoveContainer" containerID="9e11f343d65eb6f97a170083c57eca61210ee13b07c89aa31cd70c6848304c28" Oct 09 08:25:59 crc kubenswrapper[4715]: E1009 08:25:59.137587 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:26:14 crc kubenswrapper[4715]: I1009 08:26:14.136816 4715 scope.go:117] "RemoveContainer" containerID="9e11f343d65eb6f97a170083c57eca61210ee13b07c89aa31cd70c6848304c28" Oct 09 08:26:14 crc kubenswrapper[4715]: E1009 08:26:14.137677 4715 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:26:28 crc kubenswrapper[4715]: I1009 08:26:28.137045 4715 scope.go:117] "RemoveContainer" containerID="9e11f343d65eb6f97a170083c57eca61210ee13b07c89aa31cd70c6848304c28" Oct 09 08:26:28 crc kubenswrapper[4715]: E1009 08:26:28.137861 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:26:43 crc kubenswrapper[4715]: I1009 08:26:43.137537 4715 scope.go:117] "RemoveContainer" containerID="9e11f343d65eb6f97a170083c57eca61210ee13b07c89aa31cd70c6848304c28" Oct 09 08:26:43 crc kubenswrapper[4715]: E1009 08:26:43.138391 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:26:56 crc kubenswrapper[4715]: I1009 08:26:56.137014 4715 scope.go:117] "RemoveContainer" containerID="9e11f343d65eb6f97a170083c57eca61210ee13b07c89aa31cd70c6848304c28" Oct 09 08:26:56 crc kubenswrapper[4715]: E1009 
08:26:56.138017 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:27:07 crc kubenswrapper[4715]: I1009 08:27:07.136841 4715 scope.go:117] "RemoveContainer" containerID="9e11f343d65eb6f97a170083c57eca61210ee13b07c89aa31cd70c6848304c28" Oct 09 08:27:07 crc kubenswrapper[4715]: E1009 08:27:07.137584 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:27:19 crc kubenswrapper[4715]: I1009 08:27:19.137462 4715 scope.go:117] "RemoveContainer" containerID="9e11f343d65eb6f97a170083c57eca61210ee13b07c89aa31cd70c6848304c28" Oct 09 08:27:19 crc kubenswrapper[4715]: E1009 08:27:19.138562 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:27:32 crc kubenswrapper[4715]: I1009 08:27:32.136877 4715 scope.go:117] "RemoveContainer" containerID="9e11f343d65eb6f97a170083c57eca61210ee13b07c89aa31cd70c6848304c28" Oct 09 08:27:32 crc 
kubenswrapper[4715]: E1009 08:27:32.137908 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:27:43 crc kubenswrapper[4715]: I1009 08:27:43.137891 4715 scope.go:117] "RemoveContainer" containerID="9e11f343d65eb6f97a170083c57eca61210ee13b07c89aa31cd70c6848304c28" Oct 09 08:27:43 crc kubenswrapper[4715]: E1009 08:27:43.138623 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:27:56 crc kubenswrapper[4715]: I1009 08:27:56.137185 4715 scope.go:117] "RemoveContainer" containerID="9e11f343d65eb6f97a170083c57eca61210ee13b07c89aa31cd70c6848304c28" Oct 09 08:27:56 crc kubenswrapper[4715]: E1009 08:27:56.139762 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:28:10 crc kubenswrapper[4715]: I1009 08:28:10.938715 4715 generic.go:334] "Generic (PLEG): container finished" podID="7de82685-bfc0-41e5-81db-cadda0dc8d65" 
containerID="7aa25e16d70122d165e641ea0805e26ced4a4ab2fd12b57b2f19057499bce748" exitCode=0 Oct 09 08:28:10 crc kubenswrapper[4715]: I1009 08:28:10.938837 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sb8x7" event={"ID":"7de82685-bfc0-41e5-81db-cadda0dc8d65","Type":"ContainerDied","Data":"7aa25e16d70122d165e641ea0805e26ced4a4ab2fd12b57b2f19057499bce748"} Oct 09 08:28:11 crc kubenswrapper[4715]: I1009 08:28:11.136643 4715 scope.go:117] "RemoveContainer" containerID="9e11f343d65eb6f97a170083c57eca61210ee13b07c89aa31cd70c6848304c28" Oct 09 08:28:11 crc kubenswrapper[4715]: E1009 08:28:11.137304 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:28:12 crc kubenswrapper[4715]: I1009 08:28:12.356067 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sb8x7" Oct 09 08:28:12 crc kubenswrapper[4715]: I1009 08:28:12.441792 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dznzg\" (UniqueName: \"kubernetes.io/projected/7de82685-bfc0-41e5-81db-cadda0dc8d65-kube-api-access-dznzg\") pod \"7de82685-bfc0-41e5-81db-cadda0dc8d65\" (UID: \"7de82685-bfc0-41e5-81db-cadda0dc8d65\") " Oct 09 08:28:12 crc kubenswrapper[4715]: I1009 08:28:12.441889 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7de82685-bfc0-41e5-81db-cadda0dc8d65-nova-migration-ssh-key-0\") pod \"7de82685-bfc0-41e5-81db-cadda0dc8d65\" (UID: \"7de82685-bfc0-41e5-81db-cadda0dc8d65\") " Oct 09 08:28:12 crc kubenswrapper[4715]: I1009 08:28:12.442025 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7de82685-bfc0-41e5-81db-cadda0dc8d65-inventory\") pod \"7de82685-bfc0-41e5-81db-cadda0dc8d65\" (UID: \"7de82685-bfc0-41e5-81db-cadda0dc8d65\") " Oct 09 08:28:12 crc kubenswrapper[4715]: I1009 08:28:12.442083 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7de82685-bfc0-41e5-81db-cadda0dc8d65-nova-cell1-compute-config-1\") pod \"7de82685-bfc0-41e5-81db-cadda0dc8d65\" (UID: \"7de82685-bfc0-41e5-81db-cadda0dc8d65\") " Oct 09 08:28:12 crc kubenswrapper[4715]: I1009 08:28:12.442126 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7de82685-bfc0-41e5-81db-cadda0dc8d65-nova-cell1-compute-config-0\") pod \"7de82685-bfc0-41e5-81db-cadda0dc8d65\" (UID: \"7de82685-bfc0-41e5-81db-cadda0dc8d65\") " Oct 09 08:28:12 crc kubenswrapper[4715]: I1009 08:28:12.442166 4715 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7de82685-bfc0-41e5-81db-cadda0dc8d65-nova-migration-ssh-key-1\") pod \"7de82685-bfc0-41e5-81db-cadda0dc8d65\" (UID: \"7de82685-bfc0-41e5-81db-cadda0dc8d65\") " Oct 09 08:28:12 crc kubenswrapper[4715]: I1009 08:28:12.442749 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7de82685-bfc0-41e5-81db-cadda0dc8d65-nova-extra-config-0\") pod \"7de82685-bfc0-41e5-81db-cadda0dc8d65\" (UID: \"7de82685-bfc0-41e5-81db-cadda0dc8d65\") " Oct 09 08:28:12 crc kubenswrapper[4715]: I1009 08:28:12.442809 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7de82685-bfc0-41e5-81db-cadda0dc8d65-ssh-key\") pod \"7de82685-bfc0-41e5-81db-cadda0dc8d65\" (UID: \"7de82685-bfc0-41e5-81db-cadda0dc8d65\") " Oct 09 08:28:12 crc kubenswrapper[4715]: I1009 08:28:12.442834 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7de82685-bfc0-41e5-81db-cadda0dc8d65-nova-combined-ca-bundle\") pod \"7de82685-bfc0-41e5-81db-cadda0dc8d65\" (UID: \"7de82685-bfc0-41e5-81db-cadda0dc8d65\") " Oct 09 08:28:12 crc kubenswrapper[4715]: I1009 08:28:12.448268 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7de82685-bfc0-41e5-81db-cadda0dc8d65-kube-api-access-dznzg" (OuterVolumeSpecName: "kube-api-access-dznzg") pod "7de82685-bfc0-41e5-81db-cadda0dc8d65" (UID: "7de82685-bfc0-41e5-81db-cadda0dc8d65"). InnerVolumeSpecName "kube-api-access-dznzg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:28:12 crc kubenswrapper[4715]: I1009 08:28:12.450934 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7de82685-bfc0-41e5-81db-cadda0dc8d65-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "7de82685-bfc0-41e5-81db-cadda0dc8d65" (UID: "7de82685-bfc0-41e5-81db-cadda0dc8d65"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:28:12 crc kubenswrapper[4715]: I1009 08:28:12.469142 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7de82685-bfc0-41e5-81db-cadda0dc8d65-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "7de82685-bfc0-41e5-81db-cadda0dc8d65" (UID: "7de82685-bfc0-41e5-81db-cadda0dc8d65"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:28:12 crc kubenswrapper[4715]: I1009 08:28:12.471229 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7de82685-bfc0-41e5-81db-cadda0dc8d65-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "7de82685-bfc0-41e5-81db-cadda0dc8d65" (UID: "7de82685-bfc0-41e5-81db-cadda0dc8d65"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:28:12 crc kubenswrapper[4715]: I1009 08:28:12.472298 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7de82685-bfc0-41e5-81db-cadda0dc8d65-inventory" (OuterVolumeSpecName: "inventory") pod "7de82685-bfc0-41e5-81db-cadda0dc8d65" (UID: "7de82685-bfc0-41e5-81db-cadda0dc8d65"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:28:12 crc kubenswrapper[4715]: I1009 08:28:12.473561 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7de82685-bfc0-41e5-81db-cadda0dc8d65-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "7de82685-bfc0-41e5-81db-cadda0dc8d65" (UID: "7de82685-bfc0-41e5-81db-cadda0dc8d65"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:28:12 crc kubenswrapper[4715]: I1009 08:28:12.478376 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7de82685-bfc0-41e5-81db-cadda0dc8d65-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "7de82685-bfc0-41e5-81db-cadda0dc8d65" (UID: "7de82685-bfc0-41e5-81db-cadda0dc8d65"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:28:12 crc kubenswrapper[4715]: I1009 08:28:12.479890 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7de82685-bfc0-41e5-81db-cadda0dc8d65-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7de82685-bfc0-41e5-81db-cadda0dc8d65" (UID: "7de82685-bfc0-41e5-81db-cadda0dc8d65"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:28:12 crc kubenswrapper[4715]: I1009 08:28:12.487134 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7de82685-bfc0-41e5-81db-cadda0dc8d65-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "7de82685-bfc0-41e5-81db-cadda0dc8d65" (UID: "7de82685-bfc0-41e5-81db-cadda0dc8d65"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:28:12 crc kubenswrapper[4715]: I1009 08:28:12.544675 4715 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7de82685-bfc0-41e5-81db-cadda0dc8d65-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 09 08:28:12 crc kubenswrapper[4715]: I1009 08:28:12.544716 4715 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7de82685-bfc0-41e5-81db-cadda0dc8d65-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 09 08:28:12 crc kubenswrapper[4715]: I1009 08:28:12.544727 4715 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7de82685-bfc0-41e5-81db-cadda0dc8d65-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Oct 09 08:28:12 crc kubenswrapper[4715]: I1009 08:28:12.544736 4715 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7de82685-bfc0-41e5-81db-cadda0dc8d65-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 08:28:12 crc kubenswrapper[4715]: I1009 08:28:12.544745 4715 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7de82685-bfc0-41e5-81db-cadda0dc8d65-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:28:12 crc kubenswrapper[4715]: I1009 08:28:12.544754 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dznzg\" (UniqueName: \"kubernetes.io/projected/7de82685-bfc0-41e5-81db-cadda0dc8d65-kube-api-access-dznzg\") on node \"crc\" DevicePath \"\"" Oct 09 08:28:12 crc kubenswrapper[4715]: I1009 08:28:12.544762 4715 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7de82685-bfc0-41e5-81db-cadda0dc8d65-nova-migration-ssh-key-0\") on node \"crc\" 
DevicePath \"\"" Oct 09 08:28:12 crc kubenswrapper[4715]: I1009 08:28:12.544771 4715 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7de82685-bfc0-41e5-81db-cadda0dc8d65-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 08:28:12 crc kubenswrapper[4715]: I1009 08:28:12.544780 4715 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7de82685-bfc0-41e5-81db-cadda0dc8d65-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 09 08:28:12 crc kubenswrapper[4715]: I1009 08:28:12.956572 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sb8x7" event={"ID":"7de82685-bfc0-41e5-81db-cadda0dc8d65","Type":"ContainerDied","Data":"0039c767d2b9bb98129597b10dea499e4bd44df371dedb310ec858f28cfae8f8"} Oct 09 08:28:12 crc kubenswrapper[4715]: I1009 08:28:12.956878 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0039c767d2b9bb98129597b10dea499e4bd44df371dedb310ec858f28cfae8f8" Oct 09 08:28:12 crc kubenswrapper[4715]: I1009 08:28:12.956667 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sb8x7" Oct 09 08:28:13 crc kubenswrapper[4715]: I1009 08:28:13.073130 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4"] Oct 09 08:28:13 crc kubenswrapper[4715]: E1009 08:28:13.073580 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7de82685-bfc0-41e5-81db-cadda0dc8d65" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 09 08:28:13 crc kubenswrapper[4715]: I1009 08:28:13.073602 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="7de82685-bfc0-41e5-81db-cadda0dc8d65" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 09 08:28:13 crc kubenswrapper[4715]: I1009 08:28:13.073861 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="7de82685-bfc0-41e5-81db-cadda0dc8d65" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 09 08:28:13 crc kubenswrapper[4715]: I1009 08:28:13.074611 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4" Oct 09 08:28:13 crc kubenswrapper[4715]: I1009 08:28:13.076982 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 09 08:28:13 crc kubenswrapper[4715]: I1009 08:28:13.077041 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-47fjb" Oct 09 08:28:13 crc kubenswrapper[4715]: I1009 08:28:13.078364 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 09 08:28:13 crc kubenswrapper[4715]: I1009 08:28:13.078836 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 09 08:28:13 crc kubenswrapper[4715]: I1009 08:28:13.078876 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Oct 09 08:28:13 crc kubenswrapper[4715]: I1009 08:28:13.091048 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4"] Oct 09 08:28:13 crc kubenswrapper[4715]: I1009 08:28:13.154191 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/943912d2-23f1-4cc8-92ab-42288a195416-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4\" (UID: \"943912d2-23f1-4cc8-92ab-42288a195416\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4" Oct 09 08:28:13 crc kubenswrapper[4715]: I1009 08:28:13.154242 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/943912d2-23f1-4cc8-92ab-42288a195416-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4\" (UID: 
\"943912d2-23f1-4cc8-92ab-42288a195416\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4" Oct 09 08:28:13 crc kubenswrapper[4715]: I1009 08:28:13.154332 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg4cj\" (UniqueName: \"kubernetes.io/projected/943912d2-23f1-4cc8-92ab-42288a195416-kube-api-access-qg4cj\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4\" (UID: \"943912d2-23f1-4cc8-92ab-42288a195416\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4" Oct 09 08:28:13 crc kubenswrapper[4715]: I1009 08:28:13.154538 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/943912d2-23f1-4cc8-92ab-42288a195416-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4\" (UID: \"943912d2-23f1-4cc8-92ab-42288a195416\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4" Oct 09 08:28:13 crc kubenswrapper[4715]: I1009 08:28:13.154683 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/943912d2-23f1-4cc8-92ab-42288a195416-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4\" (UID: \"943912d2-23f1-4cc8-92ab-42288a195416\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4" Oct 09 08:28:13 crc kubenswrapper[4715]: I1009 08:28:13.154762 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/943912d2-23f1-4cc8-92ab-42288a195416-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4\" (UID: \"943912d2-23f1-4cc8-92ab-42288a195416\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4" Oct 09 
08:28:13 crc kubenswrapper[4715]: I1009 08:28:13.154851 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/943912d2-23f1-4cc8-92ab-42288a195416-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4\" (UID: \"943912d2-23f1-4cc8-92ab-42288a195416\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4" Oct 09 08:28:13 crc kubenswrapper[4715]: I1009 08:28:13.256043 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/943912d2-23f1-4cc8-92ab-42288a195416-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4\" (UID: \"943912d2-23f1-4cc8-92ab-42288a195416\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4" Oct 09 08:28:13 crc kubenswrapper[4715]: I1009 08:28:13.256134 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/943912d2-23f1-4cc8-92ab-42288a195416-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4\" (UID: \"943912d2-23f1-4cc8-92ab-42288a195416\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4" Oct 09 08:28:13 crc kubenswrapper[4715]: I1009 08:28:13.256156 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/943912d2-23f1-4cc8-92ab-42288a195416-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4\" (UID: \"943912d2-23f1-4cc8-92ab-42288a195416\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4" Oct 09 08:28:13 crc kubenswrapper[4715]: I1009 08:28:13.256203 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg4cj\" 
(UniqueName: \"kubernetes.io/projected/943912d2-23f1-4cc8-92ab-42288a195416-kube-api-access-qg4cj\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4\" (UID: \"943912d2-23f1-4cc8-92ab-42288a195416\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4" Oct 09 08:28:13 crc kubenswrapper[4715]: I1009 08:28:13.256236 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/943912d2-23f1-4cc8-92ab-42288a195416-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4\" (UID: \"943912d2-23f1-4cc8-92ab-42288a195416\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4" Oct 09 08:28:13 crc kubenswrapper[4715]: I1009 08:28:13.256255 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/943912d2-23f1-4cc8-92ab-42288a195416-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4\" (UID: \"943912d2-23f1-4cc8-92ab-42288a195416\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4" Oct 09 08:28:13 crc kubenswrapper[4715]: I1009 08:28:13.256276 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/943912d2-23f1-4cc8-92ab-42288a195416-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4\" (UID: \"943912d2-23f1-4cc8-92ab-42288a195416\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4" Oct 09 08:28:13 crc kubenswrapper[4715]: I1009 08:28:13.260761 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/943912d2-23f1-4cc8-92ab-42288a195416-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4\" (UID: \"943912d2-23f1-4cc8-92ab-42288a195416\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4" Oct 09 08:28:13 crc kubenswrapper[4715]: I1009 08:28:13.261145 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/943912d2-23f1-4cc8-92ab-42288a195416-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4\" (UID: \"943912d2-23f1-4cc8-92ab-42288a195416\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4" Oct 09 08:28:13 crc kubenswrapper[4715]: I1009 08:28:13.261911 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/943912d2-23f1-4cc8-92ab-42288a195416-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4\" (UID: \"943912d2-23f1-4cc8-92ab-42288a195416\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4" Oct 09 08:28:13 crc kubenswrapper[4715]: I1009 08:28:13.262035 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/943912d2-23f1-4cc8-92ab-42288a195416-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4\" (UID: \"943912d2-23f1-4cc8-92ab-42288a195416\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4" Oct 09 08:28:13 crc kubenswrapper[4715]: I1009 08:28:13.263263 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/943912d2-23f1-4cc8-92ab-42288a195416-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4\" (UID: \"943912d2-23f1-4cc8-92ab-42288a195416\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4" Oct 09 08:28:13 crc kubenswrapper[4715]: I1009 08:28:13.265012 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/943912d2-23f1-4cc8-92ab-42288a195416-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4\" (UID: \"943912d2-23f1-4cc8-92ab-42288a195416\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4" Oct 09 08:28:13 crc kubenswrapper[4715]: I1009 08:28:13.278860 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg4cj\" (UniqueName: \"kubernetes.io/projected/943912d2-23f1-4cc8-92ab-42288a195416-kube-api-access-qg4cj\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4\" (UID: \"943912d2-23f1-4cc8-92ab-42288a195416\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4" Oct 09 08:28:13 crc kubenswrapper[4715]: I1009 08:28:13.397535 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4" Oct 09 08:28:13 crc kubenswrapper[4715]: I1009 08:28:13.922522 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4"] Oct 09 08:28:13 crc kubenswrapper[4715]: I1009 08:28:13.966855 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4" event={"ID":"943912d2-23f1-4cc8-92ab-42288a195416","Type":"ContainerStarted","Data":"91057c4a192b9f16f2d0f41a16dcda4f0e40f82d60366cf33dd601da86031c68"} Oct 09 08:28:14 crc kubenswrapper[4715]: I1009 08:28:14.986405 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4" event={"ID":"943912d2-23f1-4cc8-92ab-42288a195416","Type":"ContainerStarted","Data":"9271cbc646b91acf380551455400d179ae6387eea944f843aea1790b19e49529"} Oct 09 08:28:15 crc kubenswrapper[4715]: I1009 08:28:15.015033 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4" podStartSLOduration=1.4790385879999999 podStartE2EDuration="2.015009632s" podCreationTimestamp="2025-10-09 08:28:13 +0000 UTC" firstStartedPulling="2025-10-09 08:28:13.918672545 +0000 UTC m=+2524.611476563" lastFinishedPulling="2025-10-09 08:28:14.454643599 +0000 UTC m=+2525.147447607" observedRunningTime="2025-10-09 08:28:15.003527953 +0000 UTC m=+2525.696331971" watchObservedRunningTime="2025-10-09 08:28:15.015009632 +0000 UTC m=+2525.707813650" Oct 09 08:28:24 crc kubenswrapper[4715]: I1009 08:28:24.137947 4715 scope.go:117] "RemoveContainer" containerID="9e11f343d65eb6f97a170083c57eca61210ee13b07c89aa31cd70c6848304c28" Oct 09 08:28:24 crc kubenswrapper[4715]: E1009 08:28:24.138720 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:28:38 crc kubenswrapper[4715]: I1009 08:28:38.137140 4715 scope.go:117] "RemoveContainer" containerID="9e11f343d65eb6f97a170083c57eca61210ee13b07c89aa31cd70c6848304c28" Oct 09 08:28:38 crc kubenswrapper[4715]: E1009 08:28:38.137900 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:28:50 crc kubenswrapper[4715]: I1009 08:28:50.148302 4715 scope.go:117] "RemoveContainer" 
containerID="9e11f343d65eb6f97a170083c57eca61210ee13b07c89aa31cd70c6848304c28" Oct 09 08:28:50 crc kubenswrapper[4715]: E1009 08:28:50.149067 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:29:02 crc kubenswrapper[4715]: I1009 08:29:02.137559 4715 scope.go:117] "RemoveContainer" containerID="9e11f343d65eb6f97a170083c57eca61210ee13b07c89aa31cd70c6848304c28" Oct 09 08:29:02 crc kubenswrapper[4715]: E1009 08:29:02.138741 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:29:17 crc kubenswrapper[4715]: I1009 08:29:17.137515 4715 scope.go:117] "RemoveContainer" containerID="9e11f343d65eb6f97a170083c57eca61210ee13b07c89aa31cd70c6848304c28" Oct 09 08:29:17 crc kubenswrapper[4715]: E1009 08:29:17.138395 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:29:30 crc kubenswrapper[4715]: I1009 08:29:30.146368 4715 scope.go:117] 
"RemoveContainer" containerID="9e11f343d65eb6f97a170083c57eca61210ee13b07c89aa31cd70c6848304c28" Oct 09 08:29:30 crc kubenswrapper[4715]: E1009 08:29:30.147222 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:29:45 crc kubenswrapper[4715]: I1009 08:29:45.137112 4715 scope.go:117] "RemoveContainer" containerID="9e11f343d65eb6f97a170083c57eca61210ee13b07c89aa31cd70c6848304c28" Oct 09 08:29:45 crc kubenswrapper[4715]: E1009 08:29:45.138550 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:29:56 crc kubenswrapper[4715]: I1009 08:29:56.139346 4715 scope.go:117] "RemoveContainer" containerID="9e11f343d65eb6f97a170083c57eca61210ee13b07c89aa31cd70c6848304c28" Oct 09 08:29:56 crc kubenswrapper[4715]: E1009 08:29:56.140258 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:30:00 crc kubenswrapper[4715]: I1009 08:30:00.153521 
4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333310-n44wv"] Oct 09 08:30:00 crc kubenswrapper[4715]: I1009 08:30:00.155694 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333310-n44wv" Oct 09 08:30:00 crc kubenswrapper[4715]: I1009 08:30:00.159202 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 09 08:30:00 crc kubenswrapper[4715]: I1009 08:30:00.159459 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 09 08:30:00 crc kubenswrapper[4715]: I1009 08:30:00.170200 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333310-n44wv"] Oct 09 08:30:00 crc kubenswrapper[4715]: I1009 08:30:00.324821 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pgcr\" (UniqueName: \"kubernetes.io/projected/cdeed42c-b6fb-439b-92f5-f51b811caa38-kube-api-access-6pgcr\") pod \"collect-profiles-29333310-n44wv\" (UID: \"cdeed42c-b6fb-439b-92f5-f51b811caa38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333310-n44wv" Oct 09 08:30:00 crc kubenswrapper[4715]: I1009 08:30:00.324879 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cdeed42c-b6fb-439b-92f5-f51b811caa38-secret-volume\") pod \"collect-profiles-29333310-n44wv\" (UID: \"cdeed42c-b6fb-439b-92f5-f51b811caa38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333310-n44wv" Oct 09 08:30:00 crc kubenswrapper[4715]: I1009 08:30:00.325109 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/cdeed42c-b6fb-439b-92f5-f51b811caa38-config-volume\") pod \"collect-profiles-29333310-n44wv\" (UID: \"cdeed42c-b6fb-439b-92f5-f51b811caa38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333310-n44wv" Oct 09 08:30:00 crc kubenswrapper[4715]: I1009 08:30:00.427118 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cdeed42c-b6fb-439b-92f5-f51b811caa38-config-volume\") pod \"collect-profiles-29333310-n44wv\" (UID: \"cdeed42c-b6fb-439b-92f5-f51b811caa38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333310-n44wv" Oct 09 08:30:00 crc kubenswrapper[4715]: I1009 08:30:00.427208 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pgcr\" (UniqueName: \"kubernetes.io/projected/cdeed42c-b6fb-439b-92f5-f51b811caa38-kube-api-access-6pgcr\") pod \"collect-profiles-29333310-n44wv\" (UID: \"cdeed42c-b6fb-439b-92f5-f51b811caa38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333310-n44wv" Oct 09 08:30:00 crc kubenswrapper[4715]: I1009 08:30:00.427243 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cdeed42c-b6fb-439b-92f5-f51b811caa38-secret-volume\") pod \"collect-profiles-29333310-n44wv\" (UID: \"cdeed42c-b6fb-439b-92f5-f51b811caa38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333310-n44wv" Oct 09 08:30:00 crc kubenswrapper[4715]: I1009 08:30:00.428060 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cdeed42c-b6fb-439b-92f5-f51b811caa38-config-volume\") pod \"collect-profiles-29333310-n44wv\" (UID: \"cdeed42c-b6fb-439b-92f5-f51b811caa38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333310-n44wv" Oct 09 08:30:00 crc 
kubenswrapper[4715]: I1009 08:30:00.434078 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cdeed42c-b6fb-439b-92f5-f51b811caa38-secret-volume\") pod \"collect-profiles-29333310-n44wv\" (UID: \"cdeed42c-b6fb-439b-92f5-f51b811caa38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333310-n44wv" Oct 09 08:30:00 crc kubenswrapper[4715]: I1009 08:30:00.446190 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pgcr\" (UniqueName: \"kubernetes.io/projected/cdeed42c-b6fb-439b-92f5-f51b811caa38-kube-api-access-6pgcr\") pod \"collect-profiles-29333310-n44wv\" (UID: \"cdeed42c-b6fb-439b-92f5-f51b811caa38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333310-n44wv" Oct 09 08:30:00 crc kubenswrapper[4715]: I1009 08:30:00.483753 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333310-n44wv" Oct 09 08:30:00 crc kubenswrapper[4715]: I1009 08:30:00.943572 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333310-n44wv"] Oct 09 08:30:01 crc kubenswrapper[4715]: I1009 08:30:01.036799 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333310-n44wv" event={"ID":"cdeed42c-b6fb-439b-92f5-f51b811caa38","Type":"ContainerStarted","Data":"cfe7a8dce399146e41191136273915d4573d4b0bdfc2251d08e9ba03d0638a0d"} Oct 09 08:30:02 crc kubenswrapper[4715]: I1009 08:30:02.050516 4715 generic.go:334] "Generic (PLEG): container finished" podID="cdeed42c-b6fb-439b-92f5-f51b811caa38" containerID="a604c15ba374b2179b73b94a6f0c17c0bc0628714c19078746f65ea5a475c161" exitCode=0 Oct 09 08:30:02 crc kubenswrapper[4715]: I1009 08:30:02.050612 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29333310-n44wv" event={"ID":"cdeed42c-b6fb-439b-92f5-f51b811caa38","Type":"ContainerDied","Data":"a604c15ba374b2179b73b94a6f0c17c0bc0628714c19078746f65ea5a475c161"} Oct 09 08:30:03 crc kubenswrapper[4715]: I1009 08:30:03.399346 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333310-n44wv" Oct 09 08:30:03 crc kubenswrapper[4715]: I1009 08:30:03.486559 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cdeed42c-b6fb-439b-92f5-f51b811caa38-secret-volume\") pod \"cdeed42c-b6fb-439b-92f5-f51b811caa38\" (UID: \"cdeed42c-b6fb-439b-92f5-f51b811caa38\") " Oct 09 08:30:03 crc kubenswrapper[4715]: I1009 08:30:03.486656 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cdeed42c-b6fb-439b-92f5-f51b811caa38-config-volume\") pod \"cdeed42c-b6fb-439b-92f5-f51b811caa38\" (UID: \"cdeed42c-b6fb-439b-92f5-f51b811caa38\") " Oct 09 08:30:03 crc kubenswrapper[4715]: I1009 08:30:03.486793 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pgcr\" (UniqueName: \"kubernetes.io/projected/cdeed42c-b6fb-439b-92f5-f51b811caa38-kube-api-access-6pgcr\") pod \"cdeed42c-b6fb-439b-92f5-f51b811caa38\" (UID: \"cdeed42c-b6fb-439b-92f5-f51b811caa38\") " Oct 09 08:30:03 crc kubenswrapper[4715]: I1009 08:30:03.488005 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdeed42c-b6fb-439b-92f5-f51b811caa38-config-volume" (OuterVolumeSpecName: "config-volume") pod "cdeed42c-b6fb-439b-92f5-f51b811caa38" (UID: "cdeed42c-b6fb-439b-92f5-f51b811caa38"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:30:03 crc kubenswrapper[4715]: I1009 08:30:03.493285 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdeed42c-b6fb-439b-92f5-f51b811caa38-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "cdeed42c-b6fb-439b-92f5-f51b811caa38" (UID: "cdeed42c-b6fb-439b-92f5-f51b811caa38"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:30:03 crc kubenswrapper[4715]: I1009 08:30:03.493544 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdeed42c-b6fb-439b-92f5-f51b811caa38-kube-api-access-6pgcr" (OuterVolumeSpecName: "kube-api-access-6pgcr") pod "cdeed42c-b6fb-439b-92f5-f51b811caa38" (UID: "cdeed42c-b6fb-439b-92f5-f51b811caa38"). InnerVolumeSpecName "kube-api-access-6pgcr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:30:03 crc kubenswrapper[4715]: I1009 08:30:03.590011 4715 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cdeed42c-b6fb-439b-92f5-f51b811caa38-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 09 08:30:03 crc kubenswrapper[4715]: I1009 08:30:03.590368 4715 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cdeed42c-b6fb-439b-92f5-f51b811caa38-config-volume\") on node \"crc\" DevicePath \"\"" Oct 09 08:30:03 crc kubenswrapper[4715]: I1009 08:30:03.590466 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pgcr\" (UniqueName: \"kubernetes.io/projected/cdeed42c-b6fb-439b-92f5-f51b811caa38-kube-api-access-6pgcr\") on node \"crc\" DevicePath \"\"" Oct 09 08:30:04 crc kubenswrapper[4715]: I1009 08:30:04.068930 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333310-n44wv" 
event={"ID":"cdeed42c-b6fb-439b-92f5-f51b811caa38","Type":"ContainerDied","Data":"cfe7a8dce399146e41191136273915d4573d4b0bdfc2251d08e9ba03d0638a0d"} Oct 09 08:30:04 crc kubenswrapper[4715]: I1009 08:30:04.068966 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cfe7a8dce399146e41191136273915d4573d4b0bdfc2251d08e9ba03d0638a0d" Oct 09 08:30:04 crc kubenswrapper[4715]: I1009 08:30:04.069003 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333310-n44wv" Oct 09 08:30:04 crc kubenswrapper[4715]: I1009 08:30:04.470712 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333265-qxg85"] Oct 09 08:30:04 crc kubenswrapper[4715]: I1009 08:30:04.479077 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333265-qxg85"] Oct 09 08:30:06 crc kubenswrapper[4715]: I1009 08:30:06.147699 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5488b64-893e-49f7-9de1-99905faf0d3b" path="/var/lib/kubelet/pods/b5488b64-893e-49f7-9de1-99905faf0d3b/volumes" Oct 09 08:30:11 crc kubenswrapper[4715]: I1009 08:30:11.137073 4715 scope.go:117] "RemoveContainer" containerID="9e11f343d65eb6f97a170083c57eca61210ee13b07c89aa31cd70c6848304c28" Oct 09 08:30:11 crc kubenswrapper[4715]: E1009 08:30:11.137961 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:30:14 crc kubenswrapper[4715]: I1009 08:30:14.269063 4715 scope.go:117] "RemoveContainer" 
containerID="dbdebe7fb6b1868bd19733b1a993714d1e80e69364e9cbbf110eadb09b967c0a" Oct 09 08:30:22 crc kubenswrapper[4715]: I1009 08:30:22.137787 4715 scope.go:117] "RemoveContainer" containerID="9e11f343d65eb6f97a170083c57eca61210ee13b07c89aa31cd70c6848304c28" Oct 09 08:30:22 crc kubenswrapper[4715]: E1009 08:30:22.138484 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:30:33 crc kubenswrapper[4715]: I1009 08:30:33.137016 4715 scope.go:117] "RemoveContainer" containerID="9e11f343d65eb6f97a170083c57eca61210ee13b07c89aa31cd70c6848304c28" Oct 09 08:30:33 crc kubenswrapper[4715]: E1009 08:30:33.137992 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:30:37 crc kubenswrapper[4715]: I1009 08:30:37.358307 4715 generic.go:334] "Generic (PLEG): container finished" podID="943912d2-23f1-4cc8-92ab-42288a195416" containerID="9271cbc646b91acf380551455400d179ae6387eea944f843aea1790b19e49529" exitCode=0 Oct 09 08:30:37 crc kubenswrapper[4715]: I1009 08:30:37.358780 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4" 
event={"ID":"943912d2-23f1-4cc8-92ab-42288a195416","Type":"ContainerDied","Data":"9271cbc646b91acf380551455400d179ae6387eea944f843aea1790b19e49529"} Oct 09 08:30:38 crc kubenswrapper[4715]: I1009 08:30:38.818051 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4" Oct 09 08:30:38 crc kubenswrapper[4715]: I1009 08:30:38.959803 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/943912d2-23f1-4cc8-92ab-42288a195416-telemetry-combined-ca-bundle\") pod \"943912d2-23f1-4cc8-92ab-42288a195416\" (UID: \"943912d2-23f1-4cc8-92ab-42288a195416\") " Oct 09 08:30:38 crc kubenswrapper[4715]: I1009 08:30:38.959891 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg4cj\" (UniqueName: \"kubernetes.io/projected/943912d2-23f1-4cc8-92ab-42288a195416-kube-api-access-qg4cj\") pod \"943912d2-23f1-4cc8-92ab-42288a195416\" (UID: \"943912d2-23f1-4cc8-92ab-42288a195416\") " Oct 09 08:30:38 crc kubenswrapper[4715]: I1009 08:30:38.960027 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/943912d2-23f1-4cc8-92ab-42288a195416-ceilometer-compute-config-data-2\") pod \"943912d2-23f1-4cc8-92ab-42288a195416\" (UID: \"943912d2-23f1-4cc8-92ab-42288a195416\") " Oct 09 08:30:38 crc kubenswrapper[4715]: I1009 08:30:38.960075 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/943912d2-23f1-4cc8-92ab-42288a195416-ssh-key\") pod \"943912d2-23f1-4cc8-92ab-42288a195416\" (UID: \"943912d2-23f1-4cc8-92ab-42288a195416\") " Oct 09 08:30:38 crc kubenswrapper[4715]: I1009 08:30:38.960219 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/943912d2-23f1-4cc8-92ab-42288a195416-inventory\") pod \"943912d2-23f1-4cc8-92ab-42288a195416\" (UID: \"943912d2-23f1-4cc8-92ab-42288a195416\") " Oct 09 08:30:38 crc kubenswrapper[4715]: I1009 08:30:38.960330 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/943912d2-23f1-4cc8-92ab-42288a195416-ceilometer-compute-config-data-0\") pod \"943912d2-23f1-4cc8-92ab-42288a195416\" (UID: \"943912d2-23f1-4cc8-92ab-42288a195416\") " Oct 09 08:30:38 crc kubenswrapper[4715]: I1009 08:30:38.960364 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/943912d2-23f1-4cc8-92ab-42288a195416-ceilometer-compute-config-data-1\") pod \"943912d2-23f1-4cc8-92ab-42288a195416\" (UID: \"943912d2-23f1-4cc8-92ab-42288a195416\") " Oct 09 08:30:38 crc kubenswrapper[4715]: I1009 08:30:38.966636 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/943912d2-23f1-4cc8-92ab-42288a195416-kube-api-access-qg4cj" (OuterVolumeSpecName: "kube-api-access-qg4cj") pod "943912d2-23f1-4cc8-92ab-42288a195416" (UID: "943912d2-23f1-4cc8-92ab-42288a195416"). InnerVolumeSpecName "kube-api-access-qg4cj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:30:38 crc kubenswrapper[4715]: I1009 08:30:38.967403 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/943912d2-23f1-4cc8-92ab-42288a195416-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "943912d2-23f1-4cc8-92ab-42288a195416" (UID: "943912d2-23f1-4cc8-92ab-42288a195416"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:30:38 crc kubenswrapper[4715]: I1009 08:30:38.998718 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/943912d2-23f1-4cc8-92ab-42288a195416-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "943912d2-23f1-4cc8-92ab-42288a195416" (UID: "943912d2-23f1-4cc8-92ab-42288a195416"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:30:39 crc kubenswrapper[4715]: I1009 08:30:39.002240 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/943912d2-23f1-4cc8-92ab-42288a195416-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "943912d2-23f1-4cc8-92ab-42288a195416" (UID: "943912d2-23f1-4cc8-92ab-42288a195416"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:30:39 crc kubenswrapper[4715]: I1009 08:30:39.010967 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/943912d2-23f1-4cc8-92ab-42288a195416-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "943912d2-23f1-4cc8-92ab-42288a195416" (UID: "943912d2-23f1-4cc8-92ab-42288a195416"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:30:39 crc kubenswrapper[4715]: I1009 08:30:39.015313 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/943912d2-23f1-4cc8-92ab-42288a195416-inventory" (OuterVolumeSpecName: "inventory") pod "943912d2-23f1-4cc8-92ab-42288a195416" (UID: "943912d2-23f1-4cc8-92ab-42288a195416"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:30:39 crc kubenswrapper[4715]: I1009 08:30:39.022135 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/943912d2-23f1-4cc8-92ab-42288a195416-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "943912d2-23f1-4cc8-92ab-42288a195416" (UID: "943912d2-23f1-4cc8-92ab-42288a195416"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:30:39 crc kubenswrapper[4715]: I1009 08:30:39.063180 4715 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/943912d2-23f1-4cc8-92ab-42288a195416-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Oct 09 08:30:39 crc kubenswrapper[4715]: I1009 08:30:39.063230 4715 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/943912d2-23f1-4cc8-92ab-42288a195416-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Oct 09 08:30:39 crc kubenswrapper[4715]: I1009 08:30:39.063245 4715 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/943912d2-23f1-4cc8-92ab-42288a195416-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 09 08:30:39 crc kubenswrapper[4715]: I1009 08:30:39.063259 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg4cj\" (UniqueName: \"kubernetes.io/projected/943912d2-23f1-4cc8-92ab-42288a195416-kube-api-access-qg4cj\") on node \"crc\" DevicePath \"\"" Oct 09 08:30:39 crc kubenswrapper[4715]: I1009 08:30:39.063271 4715 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/943912d2-23f1-4cc8-92ab-42288a195416-ceilometer-compute-config-data-2\") on node 
\"crc\" DevicePath \"\"" Oct 09 08:30:39 crc kubenswrapper[4715]: I1009 08:30:39.063284 4715 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/943912d2-23f1-4cc8-92ab-42288a195416-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 08:30:39 crc kubenswrapper[4715]: I1009 08:30:39.063296 4715 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/943912d2-23f1-4cc8-92ab-42288a195416-inventory\") on node \"crc\" DevicePath \"\"" Oct 09 08:30:39 crc kubenswrapper[4715]: I1009 08:30:39.380014 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4" event={"ID":"943912d2-23f1-4cc8-92ab-42288a195416","Type":"ContainerDied","Data":"91057c4a192b9f16f2d0f41a16dcda4f0e40f82d60366cf33dd601da86031c68"} Oct 09 08:30:39 crc kubenswrapper[4715]: I1009 08:30:39.380780 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91057c4a192b9f16f2d0f41a16dcda4f0e40f82d60366cf33dd601da86031c68" Oct 09 08:30:39 crc kubenswrapper[4715]: I1009 08:30:39.380539 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4" Oct 09 08:30:47 crc kubenswrapper[4715]: I1009 08:30:47.137719 4715 scope.go:117] "RemoveContainer" containerID="9e11f343d65eb6f97a170083c57eca61210ee13b07c89aa31cd70c6848304c28" Oct 09 08:30:47 crc kubenswrapper[4715]: I1009 08:30:47.461299 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" event={"ID":"acafd807-8875-4b4f-aba9-4f807ca336e7","Type":"ContainerStarted","Data":"0d3a521bcd0930bef52f3187725336b9565b7b96aa8ea310dc91dc5198d53fb8"} Oct 09 08:31:25 crc kubenswrapper[4715]: I1009 08:31:25.552826 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-chtln"] Oct 09 08:31:25 crc kubenswrapper[4715]: E1009 08:31:25.553862 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="943912d2-23f1-4cc8-92ab-42288a195416" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 09 08:31:25 crc kubenswrapper[4715]: I1009 08:31:25.553879 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="943912d2-23f1-4cc8-92ab-42288a195416" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 09 08:31:25 crc kubenswrapper[4715]: E1009 08:31:25.553896 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdeed42c-b6fb-439b-92f5-f51b811caa38" containerName="collect-profiles" Oct 09 08:31:25 crc kubenswrapper[4715]: I1009 08:31:25.553903 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdeed42c-b6fb-439b-92f5-f51b811caa38" containerName="collect-profiles" Oct 09 08:31:25 crc kubenswrapper[4715]: I1009 08:31:25.554113 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="943912d2-23f1-4cc8-92ab-42288a195416" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 09 08:31:25 crc kubenswrapper[4715]: I1009 08:31:25.554142 4715 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="cdeed42c-b6fb-439b-92f5-f51b811caa38" containerName="collect-profiles" Oct 09 08:31:25 crc kubenswrapper[4715]: I1009 08:31:25.555857 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-chtln" Oct 09 08:31:25 crc kubenswrapper[4715]: I1009 08:31:25.589069 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-chtln"] Oct 09 08:31:25 crc kubenswrapper[4715]: I1009 08:31:25.676314 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af6becdc-9c8d-49f2-ae9c-a58edc359d1f-utilities\") pod \"certified-operators-chtln\" (UID: \"af6becdc-9c8d-49f2-ae9c-a58edc359d1f\") " pod="openshift-marketplace/certified-operators-chtln" Oct 09 08:31:25 crc kubenswrapper[4715]: I1009 08:31:25.676639 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af6becdc-9c8d-49f2-ae9c-a58edc359d1f-catalog-content\") pod \"certified-operators-chtln\" (UID: \"af6becdc-9c8d-49f2-ae9c-a58edc359d1f\") " pod="openshift-marketplace/certified-operators-chtln" Oct 09 08:31:25 crc kubenswrapper[4715]: I1009 08:31:25.676750 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zmwp\" (UniqueName: \"kubernetes.io/projected/af6becdc-9c8d-49f2-ae9c-a58edc359d1f-kube-api-access-2zmwp\") pod \"certified-operators-chtln\" (UID: \"af6becdc-9c8d-49f2-ae9c-a58edc359d1f\") " pod="openshift-marketplace/certified-operators-chtln" Oct 09 08:31:25 crc kubenswrapper[4715]: I1009 08:31:25.777831 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af6becdc-9c8d-49f2-ae9c-a58edc359d1f-utilities\") pod \"certified-operators-chtln\" (UID: 
\"af6becdc-9c8d-49f2-ae9c-a58edc359d1f\") " pod="openshift-marketplace/certified-operators-chtln" Oct 09 08:31:25 crc kubenswrapper[4715]: I1009 08:31:25.778276 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af6becdc-9c8d-49f2-ae9c-a58edc359d1f-utilities\") pod \"certified-operators-chtln\" (UID: \"af6becdc-9c8d-49f2-ae9c-a58edc359d1f\") " pod="openshift-marketplace/certified-operators-chtln" Oct 09 08:31:25 crc kubenswrapper[4715]: I1009 08:31:25.778361 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af6becdc-9c8d-49f2-ae9c-a58edc359d1f-catalog-content\") pod \"certified-operators-chtln\" (UID: \"af6becdc-9c8d-49f2-ae9c-a58edc359d1f\") " pod="openshift-marketplace/certified-operators-chtln" Oct 09 08:31:25 crc kubenswrapper[4715]: I1009 08:31:25.778518 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zmwp\" (UniqueName: \"kubernetes.io/projected/af6becdc-9c8d-49f2-ae9c-a58edc359d1f-kube-api-access-2zmwp\") pod \"certified-operators-chtln\" (UID: \"af6becdc-9c8d-49f2-ae9c-a58edc359d1f\") " pod="openshift-marketplace/certified-operators-chtln" Oct 09 08:31:25 crc kubenswrapper[4715]: I1009 08:31:25.778651 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af6becdc-9c8d-49f2-ae9c-a58edc359d1f-catalog-content\") pod \"certified-operators-chtln\" (UID: \"af6becdc-9c8d-49f2-ae9c-a58edc359d1f\") " pod="openshift-marketplace/certified-operators-chtln" Oct 09 08:31:25 crc kubenswrapper[4715]: I1009 08:31:25.799274 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zmwp\" (UniqueName: \"kubernetes.io/projected/af6becdc-9c8d-49f2-ae9c-a58edc359d1f-kube-api-access-2zmwp\") pod \"certified-operators-chtln\" (UID: 
\"af6becdc-9c8d-49f2-ae9c-a58edc359d1f\") " pod="openshift-marketplace/certified-operators-chtln" Oct 09 08:31:25 crc kubenswrapper[4715]: I1009 08:31:25.874951 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-chtln" Oct 09 08:31:26 crc kubenswrapper[4715]: I1009 08:31:26.331025 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-chtln"] Oct 09 08:31:26 crc kubenswrapper[4715]: I1009 08:31:26.817536 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Oct 09 08:31:26 crc kubenswrapper[4715]: I1009 08:31:26.818969 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 09 08:31:26 crc kubenswrapper[4715]: I1009 08:31:26.821799 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 09 08:31:26 crc kubenswrapper[4715]: I1009 08:31:26.822111 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Oct 09 08:31:26 crc kubenswrapper[4715]: I1009 08:31:26.822118 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-z66mg" Oct 09 08:31:26 crc kubenswrapper[4715]: I1009 08:31:26.822237 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Oct 09 08:31:26 crc kubenswrapper[4715]: I1009 08:31:26.832001 4715 generic.go:334] "Generic (PLEG): container finished" podID="af6becdc-9c8d-49f2-ae9c-a58edc359d1f" containerID="b93857b9714bbaa3a9dd4b23b302d4414ce116bb50f50dbdbf78f4d146fec58e" exitCode=0 Oct 09 08:31:26 crc kubenswrapper[4715]: I1009 08:31:26.832055 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-chtln" 
event={"ID":"af6becdc-9c8d-49f2-ae9c-a58edc359d1f","Type":"ContainerDied","Data":"b93857b9714bbaa3a9dd4b23b302d4414ce116bb50f50dbdbf78f4d146fec58e"} Oct 09 08:31:26 crc kubenswrapper[4715]: I1009 08:31:26.832088 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-chtln" event={"ID":"af6becdc-9c8d-49f2-ae9c-a58edc359d1f","Type":"ContainerStarted","Data":"16267e55ae429aec959eb92b7656975b4976102767fadce27d159adc0a772b1f"} Oct 09 08:31:26 crc kubenswrapper[4715]: I1009 08:31:26.833945 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 09 08:31:26 crc kubenswrapper[4715]: I1009 08:31:26.838384 4715 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 08:31:26 crc kubenswrapper[4715]: I1009 08:31:26.898723 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c272fa72-6434-4af1-8e2b-433cc9f619ea-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"c272fa72-6434-4af1-8e2b-433cc9f619ea\") " pod="openstack/tempest-tests-tempest" Oct 09 08:31:26 crc kubenswrapper[4715]: I1009 08:31:26.899131 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c272fa72-6434-4af1-8e2b-433cc9f619ea-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"c272fa72-6434-4af1-8e2b-433cc9f619ea\") " pod="openstack/tempest-tests-tempest" Oct 09 08:31:26 crc kubenswrapper[4715]: I1009 08:31:26.899187 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c272fa72-6434-4af1-8e2b-433cc9f619ea-config-data\") pod \"tempest-tests-tempest\" (UID: \"c272fa72-6434-4af1-8e2b-433cc9f619ea\") " pod="openstack/tempest-tests-tempest" Oct 09 
08:31:27 crc kubenswrapper[4715]: I1009 08:31:27.000380 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c272fa72-6434-4af1-8e2b-433cc9f619ea-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"c272fa72-6434-4af1-8e2b-433cc9f619ea\") " pod="openstack/tempest-tests-tempest" Oct 09 08:31:27 crc kubenswrapper[4715]: I1009 08:31:27.000468 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"c272fa72-6434-4af1-8e2b-433cc9f619ea\") " pod="openstack/tempest-tests-tempest" Oct 09 08:31:27 crc kubenswrapper[4715]: I1009 08:31:27.000497 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/c272fa72-6434-4af1-8e2b-433cc9f619ea-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"c272fa72-6434-4af1-8e2b-433cc9f619ea\") " pod="openstack/tempest-tests-tempest" Oct 09 08:31:27 crc kubenswrapper[4715]: I1009 08:31:27.000547 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c272fa72-6434-4af1-8e2b-433cc9f619ea-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"c272fa72-6434-4af1-8e2b-433cc9f619ea\") " pod="openstack/tempest-tests-tempest" Oct 09 08:31:27 crc kubenswrapper[4715]: I1009 08:31:27.000652 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v8g8\" (UniqueName: \"kubernetes.io/projected/c272fa72-6434-4af1-8e2b-433cc9f619ea-kube-api-access-4v8g8\") pod \"tempest-tests-tempest\" (UID: \"c272fa72-6434-4af1-8e2b-433cc9f619ea\") " pod="openstack/tempest-tests-tempest" Oct 09 08:31:27 crc 
kubenswrapper[4715]: I1009 08:31:27.000702 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c272fa72-6434-4af1-8e2b-433cc9f619ea-config-data\") pod \"tempest-tests-tempest\" (UID: \"c272fa72-6434-4af1-8e2b-433cc9f619ea\") " pod="openstack/tempest-tests-tempest" Oct 09 08:31:27 crc kubenswrapper[4715]: I1009 08:31:27.000755 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/c272fa72-6434-4af1-8e2b-433cc9f619ea-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"c272fa72-6434-4af1-8e2b-433cc9f619ea\") " pod="openstack/tempest-tests-tempest" Oct 09 08:31:27 crc kubenswrapper[4715]: I1009 08:31:27.000788 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/c272fa72-6434-4af1-8e2b-433cc9f619ea-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"c272fa72-6434-4af1-8e2b-433cc9f619ea\") " pod="openstack/tempest-tests-tempest" Oct 09 08:31:27 crc kubenswrapper[4715]: I1009 08:31:27.000855 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c272fa72-6434-4af1-8e2b-433cc9f619ea-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"c272fa72-6434-4af1-8e2b-433cc9f619ea\") " pod="openstack/tempest-tests-tempest" Oct 09 08:31:27 crc kubenswrapper[4715]: I1009 08:31:27.002305 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c272fa72-6434-4af1-8e2b-433cc9f619ea-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"c272fa72-6434-4af1-8e2b-433cc9f619ea\") " pod="openstack/tempest-tests-tempest" Oct 09 08:31:27 crc kubenswrapper[4715]: I1009 08:31:27.002451 4715 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c272fa72-6434-4af1-8e2b-433cc9f619ea-config-data\") pod \"tempest-tests-tempest\" (UID: \"c272fa72-6434-4af1-8e2b-433cc9f619ea\") " pod="openstack/tempest-tests-tempest" Oct 09 08:31:27 crc kubenswrapper[4715]: I1009 08:31:27.008385 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c272fa72-6434-4af1-8e2b-433cc9f619ea-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"c272fa72-6434-4af1-8e2b-433cc9f619ea\") " pod="openstack/tempest-tests-tempest" Oct 09 08:31:27 crc kubenswrapper[4715]: I1009 08:31:27.102428 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4v8g8\" (UniqueName: \"kubernetes.io/projected/c272fa72-6434-4af1-8e2b-433cc9f619ea-kube-api-access-4v8g8\") pod \"tempest-tests-tempest\" (UID: \"c272fa72-6434-4af1-8e2b-433cc9f619ea\") " pod="openstack/tempest-tests-tempest" Oct 09 08:31:27 crc kubenswrapper[4715]: I1009 08:31:27.102517 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/c272fa72-6434-4af1-8e2b-433cc9f619ea-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"c272fa72-6434-4af1-8e2b-433cc9f619ea\") " pod="openstack/tempest-tests-tempest" Oct 09 08:31:27 crc kubenswrapper[4715]: I1009 08:31:27.102544 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/c272fa72-6434-4af1-8e2b-433cc9f619ea-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"c272fa72-6434-4af1-8e2b-433cc9f619ea\") " pod="openstack/tempest-tests-tempest" Oct 09 08:31:27 crc kubenswrapper[4715]: I1009 08:31:27.102588 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/c272fa72-6434-4af1-8e2b-433cc9f619ea-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"c272fa72-6434-4af1-8e2b-433cc9f619ea\") " pod="openstack/tempest-tests-tempest" Oct 09 08:31:27 crc kubenswrapper[4715]: I1009 08:31:27.102639 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"c272fa72-6434-4af1-8e2b-433cc9f619ea\") " pod="openstack/tempest-tests-tempest" Oct 09 08:31:27 crc kubenswrapper[4715]: I1009 08:31:27.102658 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/c272fa72-6434-4af1-8e2b-433cc9f619ea-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"c272fa72-6434-4af1-8e2b-433cc9f619ea\") " pod="openstack/tempest-tests-tempest" Oct 09 08:31:27 crc kubenswrapper[4715]: I1009 08:31:27.103065 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/c272fa72-6434-4af1-8e2b-433cc9f619ea-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"c272fa72-6434-4af1-8e2b-433cc9f619ea\") " pod="openstack/tempest-tests-tempest" Oct 09 08:31:27 crc kubenswrapper[4715]: I1009 08:31:27.103120 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/c272fa72-6434-4af1-8e2b-433cc9f619ea-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"c272fa72-6434-4af1-8e2b-433cc9f619ea\") " pod="openstack/tempest-tests-tempest" Oct 09 08:31:27 crc kubenswrapper[4715]: I1009 08:31:27.103186 4715 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod 
\"tempest-tests-tempest\" (UID: \"c272fa72-6434-4af1-8e2b-433cc9f619ea\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/tempest-tests-tempest" Oct 09 08:31:27 crc kubenswrapper[4715]: I1009 08:31:27.106717 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/c272fa72-6434-4af1-8e2b-433cc9f619ea-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"c272fa72-6434-4af1-8e2b-433cc9f619ea\") " pod="openstack/tempest-tests-tempest" Oct 09 08:31:27 crc kubenswrapper[4715]: I1009 08:31:27.107684 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c272fa72-6434-4af1-8e2b-433cc9f619ea-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"c272fa72-6434-4af1-8e2b-433cc9f619ea\") " pod="openstack/tempest-tests-tempest" Oct 09 08:31:27 crc kubenswrapper[4715]: I1009 08:31:27.121029 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v8g8\" (UniqueName: \"kubernetes.io/projected/c272fa72-6434-4af1-8e2b-433cc9f619ea-kube-api-access-4v8g8\") pod \"tempest-tests-tempest\" (UID: \"c272fa72-6434-4af1-8e2b-433cc9f619ea\") " pod="openstack/tempest-tests-tempest" Oct 09 08:31:27 crc kubenswrapper[4715]: I1009 08:31:27.146763 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"c272fa72-6434-4af1-8e2b-433cc9f619ea\") " pod="openstack/tempest-tests-tempest" Oct 09 08:31:27 crc kubenswrapper[4715]: I1009 08:31:27.440479 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 09 08:31:27 crc kubenswrapper[4715]: I1009 08:31:27.871143 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 09 08:31:27 crc kubenswrapper[4715]: W1009 08:31:27.885187 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc272fa72_6434_4af1_8e2b_433cc9f619ea.slice/crio-3ee2dd16e78180b189ad7487103cae4c4000cb0d818a33b6877994fc50811dbc WatchSource:0}: Error finding container 3ee2dd16e78180b189ad7487103cae4c4000cb0d818a33b6877994fc50811dbc: Status 404 returned error can't find the container with id 3ee2dd16e78180b189ad7487103cae4c4000cb0d818a33b6877994fc50811dbc Oct 09 08:31:28 crc kubenswrapper[4715]: I1009 08:31:28.856556 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"c272fa72-6434-4af1-8e2b-433cc9f619ea","Type":"ContainerStarted","Data":"3ee2dd16e78180b189ad7487103cae4c4000cb0d818a33b6877994fc50811dbc"} Oct 09 08:31:40 crc kubenswrapper[4715]: I1009 08:31:40.462885 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lkj8c"] Oct 09 08:31:40 crc kubenswrapper[4715]: I1009 08:31:40.468538 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lkj8c" Oct 09 08:31:40 crc kubenswrapper[4715]: I1009 08:31:40.490300 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lkj8c"] Oct 09 08:31:40 crc kubenswrapper[4715]: I1009 08:31:40.558623 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59-utilities\") pod \"community-operators-lkj8c\" (UID: \"0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59\") " pod="openshift-marketplace/community-operators-lkj8c" Oct 09 08:31:40 crc kubenswrapper[4715]: I1009 08:31:40.558676 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fs8d\" (UniqueName: \"kubernetes.io/projected/0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59-kube-api-access-7fs8d\") pod \"community-operators-lkj8c\" (UID: \"0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59\") " pod="openshift-marketplace/community-operators-lkj8c" Oct 09 08:31:40 crc kubenswrapper[4715]: I1009 08:31:40.558724 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59-catalog-content\") pod \"community-operators-lkj8c\" (UID: \"0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59\") " pod="openshift-marketplace/community-operators-lkj8c" Oct 09 08:31:40 crc kubenswrapper[4715]: I1009 08:31:40.660094 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59-catalog-content\") pod \"community-operators-lkj8c\" (UID: \"0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59\") " pod="openshift-marketplace/community-operators-lkj8c" Oct 09 08:31:40 crc kubenswrapper[4715]: I1009 08:31:40.660260 4715 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59-utilities\") pod \"community-operators-lkj8c\" (UID: \"0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59\") " pod="openshift-marketplace/community-operators-lkj8c" Oct 09 08:31:40 crc kubenswrapper[4715]: I1009 08:31:40.660301 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fs8d\" (UniqueName: \"kubernetes.io/projected/0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59-kube-api-access-7fs8d\") pod \"community-operators-lkj8c\" (UID: \"0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59\") " pod="openshift-marketplace/community-operators-lkj8c" Oct 09 08:31:40 crc kubenswrapper[4715]: I1009 08:31:40.661083 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59-catalog-content\") pod \"community-operators-lkj8c\" (UID: \"0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59\") " pod="openshift-marketplace/community-operators-lkj8c" Oct 09 08:31:40 crc kubenswrapper[4715]: I1009 08:31:40.661324 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59-utilities\") pod \"community-operators-lkj8c\" (UID: \"0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59\") " pod="openshift-marketplace/community-operators-lkj8c" Oct 09 08:31:40 crc kubenswrapper[4715]: I1009 08:31:40.686840 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fs8d\" (UniqueName: \"kubernetes.io/projected/0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59-kube-api-access-7fs8d\") pod \"community-operators-lkj8c\" (UID: \"0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59\") " pod="openshift-marketplace/community-operators-lkj8c" Oct 09 08:31:40 crc kubenswrapper[4715]: I1009 08:31:40.820659 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lkj8c" Oct 09 08:31:50 crc kubenswrapper[4715]: I1009 08:31:50.456203 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lk5dw"] Oct 09 08:31:50 crc kubenswrapper[4715]: I1009 08:31:50.460691 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lk5dw" Oct 09 08:31:50 crc kubenswrapper[4715]: I1009 08:31:50.483344 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lk5dw"] Oct 09 08:31:50 crc kubenswrapper[4715]: I1009 08:31:50.573198 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b-utilities\") pod \"redhat-marketplace-lk5dw\" (UID: \"5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b\") " pod="openshift-marketplace/redhat-marketplace-lk5dw" Oct 09 08:31:50 crc kubenswrapper[4715]: I1009 08:31:50.573258 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b-catalog-content\") pod \"redhat-marketplace-lk5dw\" (UID: \"5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b\") " pod="openshift-marketplace/redhat-marketplace-lk5dw" Oct 09 08:31:50 crc kubenswrapper[4715]: I1009 08:31:50.573340 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7xr6\" (UniqueName: \"kubernetes.io/projected/5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b-kube-api-access-g7xr6\") pod \"redhat-marketplace-lk5dw\" (UID: \"5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b\") " pod="openshift-marketplace/redhat-marketplace-lk5dw" Oct 09 08:31:50 crc kubenswrapper[4715]: I1009 08:31:50.674731 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b-utilities\") pod \"redhat-marketplace-lk5dw\" (UID: \"5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b\") " pod="openshift-marketplace/redhat-marketplace-lk5dw" Oct 09 08:31:50 crc kubenswrapper[4715]: I1009 08:31:50.674989 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b-catalog-content\") pod \"redhat-marketplace-lk5dw\" (UID: \"5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b\") " pod="openshift-marketplace/redhat-marketplace-lk5dw" Oct 09 08:31:50 crc kubenswrapper[4715]: I1009 08:31:50.675087 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7xr6\" (UniqueName: \"kubernetes.io/projected/5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b-kube-api-access-g7xr6\") pod \"redhat-marketplace-lk5dw\" (UID: \"5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b\") " pod="openshift-marketplace/redhat-marketplace-lk5dw" Oct 09 08:31:50 crc kubenswrapper[4715]: I1009 08:31:50.688491 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b-utilities\") pod \"redhat-marketplace-lk5dw\" (UID: \"5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b\") " pod="openshift-marketplace/redhat-marketplace-lk5dw" Oct 09 08:31:50 crc kubenswrapper[4715]: I1009 08:31:50.688542 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b-catalog-content\") pod \"redhat-marketplace-lk5dw\" (UID: \"5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b\") " pod="openshift-marketplace/redhat-marketplace-lk5dw" Oct 09 08:31:50 crc kubenswrapper[4715]: I1009 08:31:50.704755 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7xr6\" (UniqueName: 
\"kubernetes.io/projected/5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b-kube-api-access-g7xr6\") pod \"redhat-marketplace-lk5dw\" (UID: \"5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b\") " pod="openshift-marketplace/redhat-marketplace-lk5dw" Oct 09 08:31:50 crc kubenswrapper[4715]: I1009 08:31:50.793684 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lk5dw" Oct 09 08:31:52 crc kubenswrapper[4715]: I1009 08:31:52.263659 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-86kfg"] Oct 09 08:31:52 crc kubenswrapper[4715]: I1009 08:31:52.269842 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-86kfg" Oct 09 08:31:52 crc kubenswrapper[4715]: I1009 08:31:52.273541 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-86kfg"] Oct 09 08:31:52 crc kubenswrapper[4715]: I1009 08:31:52.405993 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1d62d9a-da53-47f0-80e9-8cb838bb39cd-utilities\") pod \"redhat-operators-86kfg\" (UID: \"c1d62d9a-da53-47f0-80e9-8cb838bb39cd\") " pod="openshift-marketplace/redhat-operators-86kfg" Oct 09 08:31:52 crc kubenswrapper[4715]: I1009 08:31:52.406350 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngtsv\" (UniqueName: \"kubernetes.io/projected/c1d62d9a-da53-47f0-80e9-8cb838bb39cd-kube-api-access-ngtsv\") pod \"redhat-operators-86kfg\" (UID: \"c1d62d9a-da53-47f0-80e9-8cb838bb39cd\") " pod="openshift-marketplace/redhat-operators-86kfg" Oct 09 08:31:52 crc kubenswrapper[4715]: I1009 08:31:52.406500 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/c1d62d9a-da53-47f0-80e9-8cb838bb39cd-catalog-content\") pod \"redhat-operators-86kfg\" (UID: \"c1d62d9a-da53-47f0-80e9-8cb838bb39cd\") " pod="openshift-marketplace/redhat-operators-86kfg" Oct 09 08:31:52 crc kubenswrapper[4715]: I1009 08:31:52.508681 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngtsv\" (UniqueName: \"kubernetes.io/projected/c1d62d9a-da53-47f0-80e9-8cb838bb39cd-kube-api-access-ngtsv\") pod \"redhat-operators-86kfg\" (UID: \"c1d62d9a-da53-47f0-80e9-8cb838bb39cd\") " pod="openshift-marketplace/redhat-operators-86kfg" Oct 09 08:31:52 crc kubenswrapper[4715]: I1009 08:31:52.508753 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1d62d9a-da53-47f0-80e9-8cb838bb39cd-catalog-content\") pod \"redhat-operators-86kfg\" (UID: \"c1d62d9a-da53-47f0-80e9-8cb838bb39cd\") " pod="openshift-marketplace/redhat-operators-86kfg" Oct 09 08:31:52 crc kubenswrapper[4715]: I1009 08:31:52.508827 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1d62d9a-da53-47f0-80e9-8cb838bb39cd-utilities\") pod \"redhat-operators-86kfg\" (UID: \"c1d62d9a-da53-47f0-80e9-8cb838bb39cd\") " pod="openshift-marketplace/redhat-operators-86kfg" Oct 09 08:31:52 crc kubenswrapper[4715]: I1009 08:31:52.509342 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1d62d9a-da53-47f0-80e9-8cb838bb39cd-utilities\") pod \"redhat-operators-86kfg\" (UID: \"c1d62d9a-da53-47f0-80e9-8cb838bb39cd\") " pod="openshift-marketplace/redhat-operators-86kfg" Oct 09 08:31:52 crc kubenswrapper[4715]: I1009 08:31:52.540905 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/c1d62d9a-da53-47f0-80e9-8cb838bb39cd-catalog-content\") pod \"redhat-operators-86kfg\" (UID: \"c1d62d9a-da53-47f0-80e9-8cb838bb39cd\") " pod="openshift-marketplace/redhat-operators-86kfg" Oct 09 08:31:52 crc kubenswrapper[4715]: I1009 08:31:52.548440 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngtsv\" (UniqueName: \"kubernetes.io/projected/c1d62d9a-da53-47f0-80e9-8cb838bb39cd-kube-api-access-ngtsv\") pod \"redhat-operators-86kfg\" (UID: \"c1d62d9a-da53-47f0-80e9-8cb838bb39cd\") " pod="openshift-marketplace/redhat-operators-86kfg" Oct 09 08:31:52 crc kubenswrapper[4715]: I1009 08:31:52.602306 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-86kfg" Oct 09 08:31:58 crc kubenswrapper[4715]: E1009 08:31:58.287219 4715 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Oct 09 08:31:58 crc kubenswrapper[4715]: E1009 08:31:58.287633 4715 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4v8g8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(c272fa72-6434-4af1-8e2b-433cc9f619ea): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 09 08:31:58 crc kubenswrapper[4715]: E1009 08:31:58.288874 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="c272fa72-6434-4af1-8e2b-433cc9f619ea" Oct 09 08:31:58 crc kubenswrapper[4715]: I1009 08:31:58.663150 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-86kfg"] Oct 09 08:31:58 crc kubenswrapper[4715]: W1009 08:31:58.672578 4715 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1d62d9a_da53_47f0_80e9_8cb838bb39cd.slice/crio-5510dcf68b97756fe42c937c712ee9190f94724e99febe4567d96b72f0a610d4 WatchSource:0}: Error finding container 5510dcf68b97756fe42c937c712ee9190f94724e99febe4567d96b72f0a610d4: Status 404 returned error can't find the container with id 5510dcf68b97756fe42c937c712ee9190f94724e99febe4567d96b72f0a610d4 Oct 09 08:31:58 crc kubenswrapper[4715]: I1009 08:31:58.720787 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lk5dw"] Oct 09 08:31:58 crc kubenswrapper[4715]: W1009 08:31:58.722925 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dd78bc5_ee35_4e68_9190_d0bf4c3b2a9b.slice/crio-3e1f5b2c61f5dad1201d9b53124f28dca31f6643ac0e7c4dc3bf63845f8ac7d6 WatchSource:0}: Error finding container 3e1f5b2c61f5dad1201d9b53124f28dca31f6643ac0e7c4dc3bf63845f8ac7d6: Status 404 returned error can't find the container with id 3e1f5b2c61f5dad1201d9b53124f28dca31f6643ac0e7c4dc3bf63845f8ac7d6 Oct 09 08:31:58 crc kubenswrapper[4715]: W1009 08:31:58.825601 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bfb47ed_6dce_4b9b_ad71_7f02b31fcf59.slice/crio-6203d4bf04933415c0690aa2e9bac7207c0ebe67bd4f13ff6be827571ba06b8b WatchSource:0}: Error finding container 6203d4bf04933415c0690aa2e9bac7207c0ebe67bd4f13ff6be827571ba06b8b: Status 404 returned error can't find the container with id 6203d4bf04933415c0690aa2e9bac7207c0ebe67bd4f13ff6be827571ba06b8b Oct 09 08:31:58 crc kubenswrapper[4715]: I1009 08:31:58.828014 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lkj8c"] Oct 09 08:31:59 crc kubenswrapper[4715]: I1009 08:31:59.139035 4715 generic.go:334] "Generic (PLEG): container finished" podID="5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b" 
containerID="ecbc7b15d41f510ce22b20d16303a63f67348ea81481c3c07f3ae57b3f07994a" exitCode=0 Oct 09 08:31:59 crc kubenswrapper[4715]: I1009 08:31:59.139133 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lk5dw" event={"ID":"5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b","Type":"ContainerDied","Data":"ecbc7b15d41f510ce22b20d16303a63f67348ea81481c3c07f3ae57b3f07994a"} Oct 09 08:31:59 crc kubenswrapper[4715]: I1009 08:31:59.139658 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lk5dw" event={"ID":"5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b","Type":"ContainerStarted","Data":"3e1f5b2c61f5dad1201d9b53124f28dca31f6643ac0e7c4dc3bf63845f8ac7d6"} Oct 09 08:31:59 crc kubenswrapper[4715]: I1009 08:31:59.142404 4715 generic.go:334] "Generic (PLEG): container finished" podID="0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59" containerID="d25258f4dbc5e3babe3f15f4b82d468cef824d299c77e0a4f0994d66485f367a" exitCode=0 Oct 09 08:31:59 crc kubenswrapper[4715]: I1009 08:31:59.142454 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lkj8c" event={"ID":"0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59","Type":"ContainerDied","Data":"d25258f4dbc5e3babe3f15f4b82d468cef824d299c77e0a4f0994d66485f367a"} Oct 09 08:31:59 crc kubenswrapper[4715]: I1009 08:31:59.142470 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lkj8c" event={"ID":"0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59","Type":"ContainerStarted","Data":"6203d4bf04933415c0690aa2e9bac7207c0ebe67bd4f13ff6be827571ba06b8b"} Oct 09 08:31:59 crc kubenswrapper[4715]: I1009 08:31:59.147822 4715 generic.go:334] "Generic (PLEG): container finished" podID="c1d62d9a-da53-47f0-80e9-8cb838bb39cd" containerID="43784b611869201a47034d2b66ab75a3792105b502455fe215518f2b3bc3488c" exitCode=0 Oct 09 08:31:59 crc kubenswrapper[4715]: I1009 08:31:59.147906 4715 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-operators-86kfg" event={"ID":"c1d62d9a-da53-47f0-80e9-8cb838bb39cd","Type":"ContainerDied","Data":"43784b611869201a47034d2b66ab75a3792105b502455fe215518f2b3bc3488c"} Oct 09 08:31:59 crc kubenswrapper[4715]: I1009 08:31:59.147948 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86kfg" event={"ID":"c1d62d9a-da53-47f0-80e9-8cb838bb39cd","Type":"ContainerStarted","Data":"5510dcf68b97756fe42c937c712ee9190f94724e99febe4567d96b72f0a610d4"} Oct 09 08:31:59 crc kubenswrapper[4715]: I1009 08:31:59.151616 4715 generic.go:334] "Generic (PLEG): container finished" podID="af6becdc-9c8d-49f2-ae9c-a58edc359d1f" containerID="9630df3f30da0595cf6f39623f4fed0ba1d462f664cdd8acf0e83ff6f15cc5ad" exitCode=0 Oct 09 08:31:59 crc kubenswrapper[4715]: I1009 08:31:59.152874 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-chtln" event={"ID":"af6becdc-9c8d-49f2-ae9c-a58edc359d1f","Type":"ContainerDied","Data":"9630df3f30da0595cf6f39623f4fed0ba1d462f664cdd8acf0e83ff6f15cc5ad"} Oct 09 08:31:59 crc kubenswrapper[4715]: E1009 08:31:59.153508 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="c272fa72-6434-4af1-8e2b-433cc9f619ea" Oct 09 08:32:00 crc kubenswrapper[4715]: I1009 08:32:00.162309 4715 generic.go:334] "Generic (PLEG): container finished" podID="5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b" containerID="0f3e422f0dcff8600bfeda1f3ce1a35a18de33e3093cd075ec54b76ae3c9d87f" exitCode=0 Oct 09 08:32:00 crc kubenswrapper[4715]: I1009 08:32:00.162397 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lk5dw" 
event={"ID":"5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b","Type":"ContainerDied","Data":"0f3e422f0dcff8600bfeda1f3ce1a35a18de33e3093cd075ec54b76ae3c9d87f"} Oct 09 08:32:00 crc kubenswrapper[4715]: I1009 08:32:00.166881 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-chtln" event={"ID":"af6becdc-9c8d-49f2-ae9c-a58edc359d1f","Type":"ContainerStarted","Data":"1d35cf0bf7fbb5342b8dbd49a2257bd33c84ea4173f6e179837c957d573bfacc"} Oct 09 08:32:00 crc kubenswrapper[4715]: I1009 08:32:00.201844 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-chtln" podStartSLOduration=2.38117427 podStartE2EDuration="35.201825334s" podCreationTimestamp="2025-10-09 08:31:25 +0000 UTC" firstStartedPulling="2025-10-09 08:31:26.837954628 +0000 UTC m=+2717.530758666" lastFinishedPulling="2025-10-09 08:31:59.658605712 +0000 UTC m=+2750.351409730" observedRunningTime="2025-10-09 08:32:00.19926505 +0000 UTC m=+2750.892069098" watchObservedRunningTime="2025-10-09 08:32:00.201825334 +0000 UTC m=+2750.894629342" Oct 09 08:32:01 crc kubenswrapper[4715]: I1009 08:32:01.179679 4715 generic.go:334] "Generic (PLEG): container finished" podID="0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59" containerID="a18dfe2c614159e1d29be39eb864cf37d5025e070738f49da2a75153bdd4abdd" exitCode=0 Oct 09 08:32:01 crc kubenswrapper[4715]: I1009 08:32:01.179780 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lkj8c" event={"ID":"0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59","Type":"ContainerDied","Data":"a18dfe2c614159e1d29be39eb864cf37d5025e070738f49da2a75153bdd4abdd"} Oct 09 08:32:01 crc kubenswrapper[4715]: I1009 08:32:01.187496 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86kfg" 
event={"ID":"c1d62d9a-da53-47f0-80e9-8cb838bb39cd","Type":"ContainerStarted","Data":"2118a6f2befc6d613af7f26172e8dc6d18c4af49fd5909da557ecb4c6d6c8499"} Oct 09 08:32:03 crc kubenswrapper[4715]: I1009 08:32:03.210940 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lk5dw" event={"ID":"5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b","Type":"ContainerStarted","Data":"cfb2ebbafcd5ccb355d156f09b5c73381631cc3342cde9a1d6fbf56cc7f6b3c2"} Oct 09 08:32:03 crc kubenswrapper[4715]: I1009 08:32:03.215141 4715 generic.go:334] "Generic (PLEG): container finished" podID="c1d62d9a-da53-47f0-80e9-8cb838bb39cd" containerID="2118a6f2befc6d613af7f26172e8dc6d18c4af49fd5909da557ecb4c6d6c8499" exitCode=0 Oct 09 08:32:03 crc kubenswrapper[4715]: I1009 08:32:03.215187 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86kfg" event={"ID":"c1d62d9a-da53-47f0-80e9-8cb838bb39cd","Type":"ContainerDied","Data":"2118a6f2befc6d613af7f26172e8dc6d18c4af49fd5909da557ecb4c6d6c8499"} Oct 09 08:32:03 crc kubenswrapper[4715]: I1009 08:32:03.232257 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lk5dw" podStartSLOduration=10.019124196 podStartE2EDuration="13.2322406s" podCreationTimestamp="2025-10-09 08:31:50 +0000 UTC" firstStartedPulling="2025-10-09 08:31:59.141530002 +0000 UTC m=+2749.834334010" lastFinishedPulling="2025-10-09 08:32:02.354646396 +0000 UTC m=+2753.047450414" observedRunningTime="2025-10-09 08:32:03.230948242 +0000 UTC m=+2753.923752250" watchObservedRunningTime="2025-10-09 08:32:03.2322406 +0000 UTC m=+2753.925044608" Oct 09 08:32:04 crc kubenswrapper[4715]: I1009 08:32:04.231010 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lkj8c" 
event={"ID":"0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59","Type":"ContainerStarted","Data":"189c2519255781da8723cc788cab880d486590990b46ee71d65c00c1f6ac102c"} Oct 09 08:32:04 crc kubenswrapper[4715]: I1009 08:32:04.276579 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lkj8c" podStartSLOduration=20.461565068 podStartE2EDuration="24.276549342s" podCreationTimestamp="2025-10-09 08:31:40 +0000 UTC" firstStartedPulling="2025-10-09 08:31:59.143674643 +0000 UTC m=+2749.836478651" lastFinishedPulling="2025-10-09 08:32:02.958658917 +0000 UTC m=+2753.651462925" observedRunningTime="2025-10-09 08:32:04.263772194 +0000 UTC m=+2754.956576232" watchObservedRunningTime="2025-10-09 08:32:04.276549342 +0000 UTC m=+2754.969353380" Oct 09 08:32:05 crc kubenswrapper[4715]: I1009 08:32:05.875039 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-chtln" Oct 09 08:32:05 crc kubenswrapper[4715]: I1009 08:32:05.875494 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-chtln" Oct 09 08:32:05 crc kubenswrapper[4715]: I1009 08:32:05.922903 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-chtln" Oct 09 08:32:06 crc kubenswrapper[4715]: I1009 08:32:06.256591 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86kfg" event={"ID":"c1d62d9a-da53-47f0-80e9-8cb838bb39cd","Type":"ContainerStarted","Data":"57d4934f633c2497a4b3822af26bac90361effa6a55269b04953523174be2d92"} Oct 09 08:32:06 crc kubenswrapper[4715]: I1009 08:32:06.286999 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-86kfg" podStartSLOduration=7.775030321 podStartE2EDuration="14.286978345s" podCreationTimestamp="2025-10-09 08:31:52 +0000 UTC" 
firstStartedPulling="2025-10-09 08:31:59.149308486 +0000 UTC m=+2749.842112494" lastFinishedPulling="2025-10-09 08:32:05.66125651 +0000 UTC m=+2756.354060518" observedRunningTime="2025-10-09 08:32:06.276008539 +0000 UTC m=+2756.968812567" watchObservedRunningTime="2025-10-09 08:32:06.286978345 +0000 UTC m=+2756.979782363" Oct 09 08:32:06 crc kubenswrapper[4715]: I1009 08:32:06.323537 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-chtln" Oct 09 08:32:09 crc kubenswrapper[4715]: I1009 08:32:09.099916 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-chtln"] Oct 09 08:32:09 crc kubenswrapper[4715]: I1009 08:32:09.447955 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-86kz5"] Oct 09 08:32:09 crc kubenswrapper[4715]: I1009 08:32:09.448483 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-86kz5" podUID="944f4e67-23a1-4024-a5c2-180f17dea29c" containerName="registry-server" containerID="cri-o://0a88e0c4892ff3ca11c179f9744e78563a89bbe6e22f2a920618fe54e40cbc8e" gracePeriod=2 Oct 09 08:32:10 crc kubenswrapper[4715]: I1009 08:32:10.029982 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-86kz5" Oct 09 08:32:10 crc kubenswrapper[4715]: I1009 08:32:10.058726 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/944f4e67-23a1-4024-a5c2-180f17dea29c-utilities\") pod \"944f4e67-23a1-4024-a5c2-180f17dea29c\" (UID: \"944f4e67-23a1-4024-a5c2-180f17dea29c\") " Oct 09 08:32:10 crc kubenswrapper[4715]: I1009 08:32:10.058886 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/944f4e67-23a1-4024-a5c2-180f17dea29c-catalog-content\") pod \"944f4e67-23a1-4024-a5c2-180f17dea29c\" (UID: \"944f4e67-23a1-4024-a5c2-180f17dea29c\") " Oct 09 08:32:10 crc kubenswrapper[4715]: I1009 08:32:10.058968 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lr5ks\" (UniqueName: \"kubernetes.io/projected/944f4e67-23a1-4024-a5c2-180f17dea29c-kube-api-access-lr5ks\") pod \"944f4e67-23a1-4024-a5c2-180f17dea29c\" (UID: \"944f4e67-23a1-4024-a5c2-180f17dea29c\") " Oct 09 08:32:10 crc kubenswrapper[4715]: I1009 08:32:10.067963 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/944f4e67-23a1-4024-a5c2-180f17dea29c-utilities" (OuterVolumeSpecName: "utilities") pod "944f4e67-23a1-4024-a5c2-180f17dea29c" (UID: "944f4e67-23a1-4024-a5c2-180f17dea29c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:32:10 crc kubenswrapper[4715]: I1009 08:32:10.072634 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/944f4e67-23a1-4024-a5c2-180f17dea29c-kube-api-access-lr5ks" (OuterVolumeSpecName: "kube-api-access-lr5ks") pod "944f4e67-23a1-4024-a5c2-180f17dea29c" (UID: "944f4e67-23a1-4024-a5c2-180f17dea29c"). InnerVolumeSpecName "kube-api-access-lr5ks". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:32:10 crc kubenswrapper[4715]: I1009 08:32:10.103929 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/944f4e67-23a1-4024-a5c2-180f17dea29c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "944f4e67-23a1-4024-a5c2-180f17dea29c" (UID: "944f4e67-23a1-4024-a5c2-180f17dea29c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:32:10 crc kubenswrapper[4715]: I1009 08:32:10.160822 4715 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/944f4e67-23a1-4024-a5c2-180f17dea29c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 08:32:10 crc kubenswrapper[4715]: I1009 08:32:10.161070 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lr5ks\" (UniqueName: \"kubernetes.io/projected/944f4e67-23a1-4024-a5c2-180f17dea29c-kube-api-access-lr5ks\") on node \"crc\" DevicePath \"\"" Oct 09 08:32:10 crc kubenswrapper[4715]: I1009 08:32:10.161139 4715 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/944f4e67-23a1-4024-a5c2-180f17dea29c-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 08:32:10 crc kubenswrapper[4715]: I1009 08:32:10.324854 4715 generic.go:334] "Generic (PLEG): container finished" podID="944f4e67-23a1-4024-a5c2-180f17dea29c" containerID="0a88e0c4892ff3ca11c179f9744e78563a89bbe6e22f2a920618fe54e40cbc8e" exitCode=0 Oct 09 08:32:10 crc kubenswrapper[4715]: I1009 08:32:10.324895 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-86kz5" event={"ID":"944f4e67-23a1-4024-a5c2-180f17dea29c","Type":"ContainerDied","Data":"0a88e0c4892ff3ca11c179f9744e78563a89bbe6e22f2a920618fe54e40cbc8e"} Oct 09 08:32:10 crc kubenswrapper[4715]: I1009 08:32:10.324922 4715 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-86kz5" event={"ID":"944f4e67-23a1-4024-a5c2-180f17dea29c","Type":"ContainerDied","Data":"2ec7b4979f4f0f1c4ba2801b0c36dc86bcd3db9b8bf16d1a62d1597e283d2c72"} Oct 09 08:32:10 crc kubenswrapper[4715]: I1009 08:32:10.324928 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-86kz5" Oct 09 08:32:10 crc kubenswrapper[4715]: I1009 08:32:10.324939 4715 scope.go:117] "RemoveContainer" containerID="0a88e0c4892ff3ca11c179f9744e78563a89bbe6e22f2a920618fe54e40cbc8e" Oct 09 08:32:10 crc kubenswrapper[4715]: I1009 08:32:10.353909 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-86kz5"] Oct 09 08:32:10 crc kubenswrapper[4715]: I1009 08:32:10.354841 4715 scope.go:117] "RemoveContainer" containerID="a0ef7662c49030a85f868b105fa072cbe092e359f381ff693d403884cdd9bfb1" Oct 09 08:32:10 crc kubenswrapper[4715]: I1009 08:32:10.369386 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-86kz5"] Oct 09 08:32:10 crc kubenswrapper[4715]: I1009 08:32:10.379925 4715 scope.go:117] "RemoveContainer" containerID="ecaa3d2f8b3604e4635ce0ac9c8517076073357ae4d0b4dabfdae9c4ad8eda60" Oct 09 08:32:10 crc kubenswrapper[4715]: I1009 08:32:10.421455 4715 scope.go:117] "RemoveContainer" containerID="0a88e0c4892ff3ca11c179f9744e78563a89bbe6e22f2a920618fe54e40cbc8e" Oct 09 08:32:10 crc kubenswrapper[4715]: E1009 08:32:10.421960 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a88e0c4892ff3ca11c179f9744e78563a89bbe6e22f2a920618fe54e40cbc8e\": container with ID starting with 0a88e0c4892ff3ca11c179f9744e78563a89bbe6e22f2a920618fe54e40cbc8e not found: ID does not exist" containerID="0a88e0c4892ff3ca11c179f9744e78563a89bbe6e22f2a920618fe54e40cbc8e" Oct 09 08:32:10 crc kubenswrapper[4715]: I1009 
08:32:10.421996 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a88e0c4892ff3ca11c179f9744e78563a89bbe6e22f2a920618fe54e40cbc8e"} err="failed to get container status \"0a88e0c4892ff3ca11c179f9744e78563a89bbe6e22f2a920618fe54e40cbc8e\": rpc error: code = NotFound desc = could not find container \"0a88e0c4892ff3ca11c179f9744e78563a89bbe6e22f2a920618fe54e40cbc8e\": container with ID starting with 0a88e0c4892ff3ca11c179f9744e78563a89bbe6e22f2a920618fe54e40cbc8e not found: ID does not exist" Oct 09 08:32:10 crc kubenswrapper[4715]: I1009 08:32:10.422019 4715 scope.go:117] "RemoveContainer" containerID="a0ef7662c49030a85f868b105fa072cbe092e359f381ff693d403884cdd9bfb1" Oct 09 08:32:10 crc kubenswrapper[4715]: E1009 08:32:10.422325 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0ef7662c49030a85f868b105fa072cbe092e359f381ff693d403884cdd9bfb1\": container with ID starting with a0ef7662c49030a85f868b105fa072cbe092e359f381ff693d403884cdd9bfb1 not found: ID does not exist" containerID="a0ef7662c49030a85f868b105fa072cbe092e359f381ff693d403884cdd9bfb1" Oct 09 08:32:10 crc kubenswrapper[4715]: I1009 08:32:10.422359 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0ef7662c49030a85f868b105fa072cbe092e359f381ff693d403884cdd9bfb1"} err="failed to get container status \"a0ef7662c49030a85f868b105fa072cbe092e359f381ff693d403884cdd9bfb1\": rpc error: code = NotFound desc = could not find container \"a0ef7662c49030a85f868b105fa072cbe092e359f381ff693d403884cdd9bfb1\": container with ID starting with a0ef7662c49030a85f868b105fa072cbe092e359f381ff693d403884cdd9bfb1 not found: ID does not exist" Oct 09 08:32:10 crc kubenswrapper[4715]: I1009 08:32:10.422383 4715 scope.go:117] "RemoveContainer" containerID="ecaa3d2f8b3604e4635ce0ac9c8517076073357ae4d0b4dabfdae9c4ad8eda60" Oct 09 08:32:10 crc 
kubenswrapper[4715]: E1009 08:32:10.422858 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecaa3d2f8b3604e4635ce0ac9c8517076073357ae4d0b4dabfdae9c4ad8eda60\": container with ID starting with ecaa3d2f8b3604e4635ce0ac9c8517076073357ae4d0b4dabfdae9c4ad8eda60 not found: ID does not exist" containerID="ecaa3d2f8b3604e4635ce0ac9c8517076073357ae4d0b4dabfdae9c4ad8eda60" Oct 09 08:32:10 crc kubenswrapper[4715]: I1009 08:32:10.422886 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecaa3d2f8b3604e4635ce0ac9c8517076073357ae4d0b4dabfdae9c4ad8eda60"} err="failed to get container status \"ecaa3d2f8b3604e4635ce0ac9c8517076073357ae4d0b4dabfdae9c4ad8eda60\": rpc error: code = NotFound desc = could not find container \"ecaa3d2f8b3604e4635ce0ac9c8517076073357ae4d0b4dabfdae9c4ad8eda60\": container with ID starting with ecaa3d2f8b3604e4635ce0ac9c8517076073357ae4d0b4dabfdae9c4ad8eda60 not found: ID does not exist" Oct 09 08:32:10 crc kubenswrapper[4715]: I1009 08:32:10.794593 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lk5dw" Oct 09 08:32:10 crc kubenswrapper[4715]: I1009 08:32:10.794824 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lk5dw" Oct 09 08:32:10 crc kubenswrapper[4715]: I1009 08:32:10.821561 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lkj8c" Oct 09 08:32:10 crc kubenswrapper[4715]: I1009 08:32:10.821660 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lkj8c" Oct 09 08:32:10 crc kubenswrapper[4715]: I1009 08:32:10.850220 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lk5dw" Oct 09 08:32:10 
crc kubenswrapper[4715]: I1009 08:32:10.911681 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lkj8c" Oct 09 08:32:11 crc kubenswrapper[4715]: I1009 08:32:11.384826 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lk5dw" Oct 09 08:32:11 crc kubenswrapper[4715]: I1009 08:32:11.397995 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lkj8c" Oct 09 08:32:12 crc kubenswrapper[4715]: I1009 08:32:12.156103 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="944f4e67-23a1-4024-a5c2-180f17dea29c" path="/var/lib/kubelet/pods/944f4e67-23a1-4024-a5c2-180f17dea29c/volumes" Oct 09 08:32:12 crc kubenswrapper[4715]: I1009 08:32:12.602534 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-86kfg" Oct 09 08:32:12 crc kubenswrapper[4715]: I1009 08:32:12.602806 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-86kfg" Oct 09 08:32:12 crc kubenswrapper[4715]: I1009 08:32:12.667579 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-86kfg" Oct 09 08:32:12 crc kubenswrapper[4715]: I1009 08:32:12.852751 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lkj8c"] Oct 09 08:32:13 crc kubenswrapper[4715]: I1009 08:32:13.402265 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-86kfg" Oct 09 08:32:13 crc kubenswrapper[4715]: I1009 08:32:13.849317 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-86kfg"] Oct 09 08:32:14 crc kubenswrapper[4715]: I1009 08:32:14.366888 4715 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-marketplace/community-operators-lkj8c" podUID="0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59" containerName="registry-server" containerID="cri-o://189c2519255781da8723cc788cab880d486590990b46ee71d65c00c1f6ac102c" gracePeriod=2 Oct 09 08:32:14 crc kubenswrapper[4715]: I1009 08:32:14.895606 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 09 08:32:15 crc kubenswrapper[4715]: I1009 08:32:15.038043 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lkj8c" Oct 09 08:32:15 crc kubenswrapper[4715]: I1009 08:32:15.075645 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fs8d\" (UniqueName: \"kubernetes.io/projected/0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59-kube-api-access-7fs8d\") pod \"0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59\" (UID: \"0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59\") " Oct 09 08:32:15 crc kubenswrapper[4715]: I1009 08:32:15.075733 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59-utilities\") pod \"0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59\" (UID: \"0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59\") " Oct 09 08:32:15 crc kubenswrapper[4715]: I1009 08:32:15.075855 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59-catalog-content\") pod \"0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59\" (UID: \"0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59\") " Oct 09 08:32:15 crc kubenswrapper[4715]: I1009 08:32:15.076981 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59-utilities" (OuterVolumeSpecName: "utilities") pod 
"0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59" (UID: "0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:32:15 crc kubenswrapper[4715]: I1009 08:32:15.079591 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59-kube-api-access-7fs8d" (OuterVolumeSpecName: "kube-api-access-7fs8d") pod "0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59" (UID: "0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59"). InnerVolumeSpecName "kube-api-access-7fs8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:32:15 crc kubenswrapper[4715]: I1009 08:32:15.142609 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59" (UID: "0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:32:15 crc kubenswrapper[4715]: I1009 08:32:15.177773 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fs8d\" (UniqueName: \"kubernetes.io/projected/0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59-kube-api-access-7fs8d\") on node \"crc\" DevicePath \"\"" Oct 09 08:32:15 crc kubenswrapper[4715]: I1009 08:32:15.177814 4715 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 08:32:15 crc kubenswrapper[4715]: I1009 08:32:15.177827 4715 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 08:32:15 crc kubenswrapper[4715]: I1009 08:32:15.248705 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lk5dw"] Oct 09 08:32:15 crc kubenswrapper[4715]: I1009 08:32:15.249219 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lk5dw" podUID="5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b" containerName="registry-server" containerID="cri-o://cfb2ebbafcd5ccb355d156f09b5c73381631cc3342cde9a1d6fbf56cc7f6b3c2" gracePeriod=2 Oct 09 08:32:15 crc kubenswrapper[4715]: I1009 08:32:15.384575 4715 generic.go:334] "Generic (PLEG): container finished" podID="0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59" containerID="189c2519255781da8723cc788cab880d486590990b46ee71d65c00c1f6ac102c" exitCode=0 Oct 09 08:32:15 crc kubenswrapper[4715]: I1009 08:32:15.384937 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lkj8c" Oct 09 08:32:15 crc kubenswrapper[4715]: I1009 08:32:15.385005 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lkj8c" event={"ID":"0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59","Type":"ContainerDied","Data":"189c2519255781da8723cc788cab880d486590990b46ee71d65c00c1f6ac102c"} Oct 09 08:32:15 crc kubenswrapper[4715]: I1009 08:32:15.385064 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lkj8c" event={"ID":"0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59","Type":"ContainerDied","Data":"6203d4bf04933415c0690aa2e9bac7207c0ebe67bd4f13ff6be827571ba06b8b"} Oct 09 08:32:15 crc kubenswrapper[4715]: I1009 08:32:15.385084 4715 scope.go:117] "RemoveContainer" containerID="189c2519255781da8723cc788cab880d486590990b46ee71d65c00c1f6ac102c" Oct 09 08:32:15 crc kubenswrapper[4715]: I1009 08:32:15.389387 4715 generic.go:334] "Generic (PLEG): container finished" podID="5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b" containerID="cfb2ebbafcd5ccb355d156f09b5c73381631cc3342cde9a1d6fbf56cc7f6b3c2" exitCode=0 Oct 09 08:32:15 crc kubenswrapper[4715]: I1009 08:32:15.389452 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lk5dw" event={"ID":"5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b","Type":"ContainerDied","Data":"cfb2ebbafcd5ccb355d156f09b5c73381631cc3342cde9a1d6fbf56cc7f6b3c2"} Oct 09 08:32:15 crc kubenswrapper[4715]: I1009 08:32:15.389618 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-86kfg" podUID="c1d62d9a-da53-47f0-80e9-8cb838bb39cd" containerName="registry-server" containerID="cri-o://57d4934f633c2497a4b3822af26bac90361effa6a55269b04953523174be2d92" gracePeriod=2 Oct 09 08:32:15 crc kubenswrapper[4715]: I1009 08:32:15.422138 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-lkj8c"] Oct 09 08:32:15 crc kubenswrapper[4715]: I1009 08:32:15.434280 4715 scope.go:117] "RemoveContainer" containerID="a18dfe2c614159e1d29be39eb864cf37d5025e070738f49da2a75153bdd4abdd" Oct 09 08:32:15 crc kubenswrapper[4715]: I1009 08:32:15.452628 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lkj8c"] Oct 09 08:32:15 crc kubenswrapper[4715]: I1009 08:32:15.678234 4715 scope.go:117] "RemoveContainer" containerID="d25258f4dbc5e3babe3f15f4b82d468cef824d299c77e0a4f0994d66485f367a" Oct 09 08:32:15 crc kubenswrapper[4715]: I1009 08:32:15.740992 4715 scope.go:117] "RemoveContainer" containerID="189c2519255781da8723cc788cab880d486590990b46ee71d65c00c1f6ac102c" Oct 09 08:32:15 crc kubenswrapper[4715]: E1009 08:32:15.743178 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"189c2519255781da8723cc788cab880d486590990b46ee71d65c00c1f6ac102c\": container with ID starting with 189c2519255781da8723cc788cab880d486590990b46ee71d65c00c1f6ac102c not found: ID does not exist" containerID="189c2519255781da8723cc788cab880d486590990b46ee71d65c00c1f6ac102c" Oct 09 08:32:15 crc kubenswrapper[4715]: I1009 08:32:15.743209 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"189c2519255781da8723cc788cab880d486590990b46ee71d65c00c1f6ac102c"} err="failed to get container status \"189c2519255781da8723cc788cab880d486590990b46ee71d65c00c1f6ac102c\": rpc error: code = NotFound desc = could not find container \"189c2519255781da8723cc788cab880d486590990b46ee71d65c00c1f6ac102c\": container with ID starting with 189c2519255781da8723cc788cab880d486590990b46ee71d65c00c1f6ac102c not found: ID does not exist" Oct 09 08:32:15 crc kubenswrapper[4715]: I1009 08:32:15.743258 4715 scope.go:117] "RemoveContainer" 
containerID="a18dfe2c614159e1d29be39eb864cf37d5025e070738f49da2a75153bdd4abdd" Oct 09 08:32:15 crc kubenswrapper[4715]: E1009 08:32:15.743968 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a18dfe2c614159e1d29be39eb864cf37d5025e070738f49da2a75153bdd4abdd\": container with ID starting with a18dfe2c614159e1d29be39eb864cf37d5025e070738f49da2a75153bdd4abdd not found: ID does not exist" containerID="a18dfe2c614159e1d29be39eb864cf37d5025e070738f49da2a75153bdd4abdd" Oct 09 08:32:15 crc kubenswrapper[4715]: I1009 08:32:15.743992 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a18dfe2c614159e1d29be39eb864cf37d5025e070738f49da2a75153bdd4abdd"} err="failed to get container status \"a18dfe2c614159e1d29be39eb864cf37d5025e070738f49da2a75153bdd4abdd\": rpc error: code = NotFound desc = could not find container \"a18dfe2c614159e1d29be39eb864cf37d5025e070738f49da2a75153bdd4abdd\": container with ID starting with a18dfe2c614159e1d29be39eb864cf37d5025e070738f49da2a75153bdd4abdd not found: ID does not exist" Oct 09 08:32:15 crc kubenswrapper[4715]: I1009 08:32:15.744006 4715 scope.go:117] "RemoveContainer" containerID="d25258f4dbc5e3babe3f15f4b82d468cef824d299c77e0a4f0994d66485f367a" Oct 09 08:32:15 crc kubenswrapper[4715]: E1009 08:32:15.744339 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d25258f4dbc5e3babe3f15f4b82d468cef824d299c77e0a4f0994d66485f367a\": container with ID starting with d25258f4dbc5e3babe3f15f4b82d468cef824d299c77e0a4f0994d66485f367a not found: ID does not exist" containerID="d25258f4dbc5e3babe3f15f4b82d468cef824d299c77e0a4f0994d66485f367a" Oct 09 08:32:15 crc kubenswrapper[4715]: I1009 08:32:15.744359 4715 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d25258f4dbc5e3babe3f15f4b82d468cef824d299c77e0a4f0994d66485f367a"} err="failed to get container status \"d25258f4dbc5e3babe3f15f4b82d468cef824d299c77e0a4f0994d66485f367a\": rpc error: code = NotFound desc = could not find container \"d25258f4dbc5e3babe3f15f4b82d468cef824d299c77e0a4f0994d66485f367a\": container with ID starting with d25258f4dbc5e3babe3f15f4b82d468cef824d299c77e0a4f0994d66485f367a not found: ID does not exist" Oct 09 08:32:15 crc kubenswrapper[4715]: I1009 08:32:15.937640 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lk5dw" Oct 09 08:32:15 crc kubenswrapper[4715]: I1009 08:32:15.975930 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-86kfg" Oct 09 08:32:15 crc kubenswrapper[4715]: I1009 08:32:15.995665 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7xr6\" (UniqueName: \"kubernetes.io/projected/5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b-kube-api-access-g7xr6\") pod \"5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b\" (UID: \"5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b\") " Oct 09 08:32:15 crc kubenswrapper[4715]: I1009 08:32:15.995859 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b-catalog-content\") pod \"5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b\" (UID: \"5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b\") " Oct 09 08:32:15 crc kubenswrapper[4715]: I1009 08:32:15.996133 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b-utilities\") pod \"5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b\" (UID: \"5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b\") " Oct 09 08:32:15 crc kubenswrapper[4715]: I1009 08:32:15.997311 4715 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b-utilities" (OuterVolumeSpecName: "utilities") pod "5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b" (UID: "5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:32:16 crc kubenswrapper[4715]: I1009 08:32:16.002920 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b-kube-api-access-g7xr6" (OuterVolumeSpecName: "kube-api-access-g7xr6") pod "5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b" (UID: "5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b"). InnerVolumeSpecName "kube-api-access-g7xr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:32:16 crc kubenswrapper[4715]: I1009 08:32:16.012084 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b" (UID: "5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:32:16 crc kubenswrapper[4715]: I1009 08:32:16.097358 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1d62d9a-da53-47f0-80e9-8cb838bb39cd-utilities\") pod \"c1d62d9a-da53-47f0-80e9-8cb838bb39cd\" (UID: \"c1d62d9a-da53-47f0-80e9-8cb838bb39cd\") " Oct 09 08:32:16 crc kubenswrapper[4715]: I1009 08:32:16.097511 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngtsv\" (UniqueName: \"kubernetes.io/projected/c1d62d9a-da53-47f0-80e9-8cb838bb39cd-kube-api-access-ngtsv\") pod \"c1d62d9a-da53-47f0-80e9-8cb838bb39cd\" (UID: \"c1d62d9a-da53-47f0-80e9-8cb838bb39cd\") " Oct 09 08:32:16 crc kubenswrapper[4715]: I1009 08:32:16.097610 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1d62d9a-da53-47f0-80e9-8cb838bb39cd-catalog-content\") pod \"c1d62d9a-da53-47f0-80e9-8cb838bb39cd\" (UID: \"c1d62d9a-da53-47f0-80e9-8cb838bb39cd\") " Oct 09 08:32:16 crc kubenswrapper[4715]: I1009 08:32:16.098172 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7xr6\" (UniqueName: \"kubernetes.io/projected/5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b-kube-api-access-g7xr6\") on node \"crc\" DevicePath \"\"" Oct 09 08:32:16 crc kubenswrapper[4715]: I1009 08:32:16.098200 4715 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 08:32:16 crc kubenswrapper[4715]: I1009 08:32:16.098215 4715 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 08:32:16 crc kubenswrapper[4715]: I1009 08:32:16.098187 
4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1d62d9a-da53-47f0-80e9-8cb838bb39cd-utilities" (OuterVolumeSpecName: "utilities") pod "c1d62d9a-da53-47f0-80e9-8cb838bb39cd" (UID: "c1d62d9a-da53-47f0-80e9-8cb838bb39cd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:32:16 crc kubenswrapper[4715]: I1009 08:32:16.104779 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1d62d9a-da53-47f0-80e9-8cb838bb39cd-kube-api-access-ngtsv" (OuterVolumeSpecName: "kube-api-access-ngtsv") pod "c1d62d9a-da53-47f0-80e9-8cb838bb39cd" (UID: "c1d62d9a-da53-47f0-80e9-8cb838bb39cd"). InnerVolumeSpecName "kube-api-access-ngtsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:32:16 crc kubenswrapper[4715]: I1009 08:32:16.148063 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59" path="/var/lib/kubelet/pods/0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59/volumes" Oct 09 08:32:16 crc kubenswrapper[4715]: I1009 08:32:16.200171 4715 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1d62d9a-da53-47f0-80e9-8cb838bb39cd-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 08:32:16 crc kubenswrapper[4715]: I1009 08:32:16.200220 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngtsv\" (UniqueName: \"kubernetes.io/projected/c1d62d9a-da53-47f0-80e9-8cb838bb39cd-kube-api-access-ngtsv\") on node \"crc\" DevicePath \"\"" Oct 09 08:32:16 crc kubenswrapper[4715]: I1009 08:32:16.201681 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1d62d9a-da53-47f0-80e9-8cb838bb39cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c1d62d9a-da53-47f0-80e9-8cb838bb39cd" (UID: "c1d62d9a-da53-47f0-80e9-8cb838bb39cd"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:32:16 crc kubenswrapper[4715]: I1009 08:32:16.301835 4715 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1d62d9a-da53-47f0-80e9-8cb838bb39cd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 08:32:16 crc kubenswrapper[4715]: I1009 08:32:16.404025 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lk5dw" Oct 09 08:32:16 crc kubenswrapper[4715]: I1009 08:32:16.404053 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lk5dw" event={"ID":"5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b","Type":"ContainerDied","Data":"3e1f5b2c61f5dad1201d9b53124f28dca31f6643ac0e7c4dc3bf63845f8ac7d6"} Oct 09 08:32:16 crc kubenswrapper[4715]: I1009 08:32:16.404395 4715 scope.go:117] "RemoveContainer" containerID="cfb2ebbafcd5ccb355d156f09b5c73381631cc3342cde9a1d6fbf56cc7f6b3c2" Oct 09 08:32:16 crc kubenswrapper[4715]: I1009 08:32:16.409295 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"c272fa72-6434-4af1-8e2b-433cc9f619ea","Type":"ContainerStarted","Data":"f8971272e21a1b1f813c9a3eecd73a61b075311309e74532b5899bc79cb3ba3b"} Oct 09 08:32:16 crc kubenswrapper[4715]: I1009 08:32:16.413974 4715 generic.go:334] "Generic (PLEG): container finished" podID="c1d62d9a-da53-47f0-80e9-8cb838bb39cd" containerID="57d4934f633c2497a4b3822af26bac90361effa6a55269b04953523174be2d92" exitCode=0 Oct 09 08:32:16 crc kubenswrapper[4715]: I1009 08:32:16.414027 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86kfg" event={"ID":"c1d62d9a-da53-47f0-80e9-8cb838bb39cd","Type":"ContainerDied","Data":"57d4934f633c2497a4b3822af26bac90361effa6a55269b04953523174be2d92"} Oct 09 08:32:16 crc kubenswrapper[4715]: I1009 
08:32:16.414059 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86kfg" event={"ID":"c1d62d9a-da53-47f0-80e9-8cb838bb39cd","Type":"ContainerDied","Data":"5510dcf68b97756fe42c937c712ee9190f94724e99febe4567d96b72f0a610d4"} Oct 09 08:32:16 crc kubenswrapper[4715]: I1009 08:32:16.414149 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-86kfg" Oct 09 08:32:16 crc kubenswrapper[4715]: I1009 08:32:16.441292 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.436571199 podStartE2EDuration="51.441272754s" podCreationTimestamp="2025-10-09 08:31:25 +0000 UTC" firstStartedPulling="2025-10-09 08:31:27.887288975 +0000 UTC m=+2718.580092983" lastFinishedPulling="2025-10-09 08:32:14.89199051 +0000 UTC m=+2765.584794538" observedRunningTime="2025-10-09 08:32:16.427639192 +0000 UTC m=+2767.120443200" watchObservedRunningTime="2025-10-09 08:32:16.441272754 +0000 UTC m=+2767.134076772" Oct 09 08:32:16 crc kubenswrapper[4715]: I1009 08:32:16.451314 4715 scope.go:117] "RemoveContainer" containerID="0f3e422f0dcff8600bfeda1f3ce1a35a18de33e3093cd075ec54b76ae3c9d87f" Oct 09 08:32:16 crc kubenswrapper[4715]: I1009 08:32:16.456984 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lk5dw"] Oct 09 08:32:16 crc kubenswrapper[4715]: I1009 08:32:16.477165 4715 scope.go:117] "RemoveContainer" containerID="ecbc7b15d41f510ce22b20d16303a63f67348ea81481c3c07f3ae57b3f07994a" Oct 09 08:32:16 crc kubenswrapper[4715]: I1009 08:32:16.477679 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lk5dw"] Oct 09 08:32:16 crc kubenswrapper[4715]: I1009 08:32:16.488919 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-86kfg"] Oct 09 08:32:16 crc kubenswrapper[4715]: 
I1009 08:32:16.499295 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-86kfg"] Oct 09 08:32:16 crc kubenswrapper[4715]: I1009 08:32:16.560996 4715 scope.go:117] "RemoveContainer" containerID="57d4934f633c2497a4b3822af26bac90361effa6a55269b04953523174be2d92" Oct 09 08:32:16 crc kubenswrapper[4715]: I1009 08:32:16.591163 4715 scope.go:117] "RemoveContainer" containerID="2118a6f2befc6d613af7f26172e8dc6d18c4af49fd5909da557ecb4c6d6c8499" Oct 09 08:32:16 crc kubenswrapper[4715]: I1009 08:32:16.635838 4715 scope.go:117] "RemoveContainer" containerID="43784b611869201a47034d2b66ab75a3792105b502455fe215518f2b3bc3488c" Oct 09 08:32:16 crc kubenswrapper[4715]: I1009 08:32:16.661321 4715 scope.go:117] "RemoveContainer" containerID="57d4934f633c2497a4b3822af26bac90361effa6a55269b04953523174be2d92" Oct 09 08:32:16 crc kubenswrapper[4715]: E1009 08:32:16.662159 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57d4934f633c2497a4b3822af26bac90361effa6a55269b04953523174be2d92\": container with ID starting with 57d4934f633c2497a4b3822af26bac90361effa6a55269b04953523174be2d92 not found: ID does not exist" containerID="57d4934f633c2497a4b3822af26bac90361effa6a55269b04953523174be2d92" Oct 09 08:32:16 crc kubenswrapper[4715]: I1009 08:32:16.662277 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57d4934f633c2497a4b3822af26bac90361effa6a55269b04953523174be2d92"} err="failed to get container status \"57d4934f633c2497a4b3822af26bac90361effa6a55269b04953523174be2d92\": rpc error: code = NotFound desc = could not find container \"57d4934f633c2497a4b3822af26bac90361effa6a55269b04953523174be2d92\": container with ID starting with 57d4934f633c2497a4b3822af26bac90361effa6a55269b04953523174be2d92 not found: ID does not exist" Oct 09 08:32:16 crc kubenswrapper[4715]: I1009 08:32:16.662363 4715 scope.go:117] 
"RemoveContainer" containerID="2118a6f2befc6d613af7f26172e8dc6d18c4af49fd5909da557ecb4c6d6c8499" Oct 09 08:32:16 crc kubenswrapper[4715]: E1009 08:32:16.662753 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2118a6f2befc6d613af7f26172e8dc6d18c4af49fd5909da557ecb4c6d6c8499\": container with ID starting with 2118a6f2befc6d613af7f26172e8dc6d18c4af49fd5909da557ecb4c6d6c8499 not found: ID does not exist" containerID="2118a6f2befc6d613af7f26172e8dc6d18c4af49fd5909da557ecb4c6d6c8499" Oct 09 08:32:16 crc kubenswrapper[4715]: I1009 08:32:16.662844 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2118a6f2befc6d613af7f26172e8dc6d18c4af49fd5909da557ecb4c6d6c8499"} err="failed to get container status \"2118a6f2befc6d613af7f26172e8dc6d18c4af49fd5909da557ecb4c6d6c8499\": rpc error: code = NotFound desc = could not find container \"2118a6f2befc6d613af7f26172e8dc6d18c4af49fd5909da557ecb4c6d6c8499\": container with ID starting with 2118a6f2befc6d613af7f26172e8dc6d18c4af49fd5909da557ecb4c6d6c8499 not found: ID does not exist" Oct 09 08:32:16 crc kubenswrapper[4715]: I1009 08:32:16.662941 4715 scope.go:117] "RemoveContainer" containerID="43784b611869201a47034d2b66ab75a3792105b502455fe215518f2b3bc3488c" Oct 09 08:32:16 crc kubenswrapper[4715]: E1009 08:32:16.663255 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43784b611869201a47034d2b66ab75a3792105b502455fe215518f2b3bc3488c\": container with ID starting with 43784b611869201a47034d2b66ab75a3792105b502455fe215518f2b3bc3488c not found: ID does not exist" containerID="43784b611869201a47034d2b66ab75a3792105b502455fe215518f2b3bc3488c" Oct 09 08:32:16 crc kubenswrapper[4715]: I1009 08:32:16.663341 4715 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"43784b611869201a47034d2b66ab75a3792105b502455fe215518f2b3bc3488c"} err="failed to get container status \"43784b611869201a47034d2b66ab75a3792105b502455fe215518f2b3bc3488c\": rpc error: code = NotFound desc = could not find container \"43784b611869201a47034d2b66ab75a3792105b502455fe215518f2b3bc3488c\": container with ID starting with 43784b611869201a47034d2b66ab75a3792105b502455fe215518f2b3bc3488c not found: ID does not exist" Oct 09 08:32:18 crc kubenswrapper[4715]: I1009 08:32:18.156623 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b" path="/var/lib/kubelet/pods/5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b/volumes" Oct 09 08:32:18 crc kubenswrapper[4715]: I1009 08:32:18.157932 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1d62d9a-da53-47f0-80e9-8cb838bb39cd" path="/var/lib/kubelet/pods/c1d62d9a-da53-47f0-80e9-8cb838bb39cd/volumes" Oct 09 08:32:18 crc kubenswrapper[4715]: E1009 08:32:18.923577 4715 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod944f4e67_23a1_4024_a5c2_180f17dea29c.slice/crio-2ec7b4979f4f0f1c4ba2801b0c36dc86bcd3db9b8bf16d1a62d1597e283d2c72\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod944f4e67_23a1_4024_a5c2_180f17dea29c.slice\": RecentStats: unable to find data in memory cache]" Oct 09 08:32:29 crc kubenswrapper[4715]: E1009 08:32:29.174636 4715 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod944f4e67_23a1_4024_a5c2_180f17dea29c.slice/crio-2ec7b4979f4f0f1c4ba2801b0c36dc86bcd3db9b8bf16d1a62d1597e283d2c72\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod944f4e67_23a1_4024_a5c2_180f17dea29c.slice\": RecentStats: unable to find data in memory cache]" Oct 09 08:32:39 crc kubenswrapper[4715]: E1009 08:32:39.468929 4715 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod944f4e67_23a1_4024_a5c2_180f17dea29c.slice/crio-2ec7b4979f4f0f1c4ba2801b0c36dc86bcd3db9b8bf16d1a62d1597e283d2c72\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod944f4e67_23a1_4024_a5c2_180f17dea29c.slice\": RecentStats: unable to find data in memory cache]" Oct 09 08:32:49 crc kubenswrapper[4715]: E1009 08:32:49.737248 4715 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod944f4e67_23a1_4024_a5c2_180f17dea29c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod944f4e67_23a1_4024_a5c2_180f17dea29c.slice/crio-2ec7b4979f4f0f1c4ba2801b0c36dc86bcd3db9b8bf16d1a62d1597e283d2c72\": RecentStats: unable to find data in memory cache]" Oct 09 08:32:59 crc kubenswrapper[4715]: E1009 08:32:59.965108 4715 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod944f4e67_23a1_4024_a5c2_180f17dea29c.slice/crio-2ec7b4979f4f0f1c4ba2801b0c36dc86bcd3db9b8bf16d1a62d1597e283d2c72\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod944f4e67_23a1_4024_a5c2_180f17dea29c.slice\": RecentStats: unable to find data in memory cache]" Oct 09 08:33:10 crc kubenswrapper[4715]: E1009 08:33:10.258339 4715 cadvisor_stats_provider.go:516] "Partial failure issuing 
cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod944f4e67_23a1_4024_a5c2_180f17dea29c.slice/crio-2ec7b4979f4f0f1c4ba2801b0c36dc86bcd3db9b8bf16d1a62d1597e283d2c72\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod944f4e67_23a1_4024_a5c2_180f17dea29c.slice\": RecentStats: unable to find data in memory cache]" Oct 09 08:33:16 crc kubenswrapper[4715]: I1009 08:33:16.753132 4715 patch_prober.go:28] interesting pod/machine-config-daemon-k7vwx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 08:33:16 crc kubenswrapper[4715]: I1009 08:33:16.753853 4715 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 08:33:46 crc kubenswrapper[4715]: I1009 08:33:46.753821 4715 patch_prober.go:28] interesting pod/machine-config-daemon-k7vwx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 08:33:46 crc kubenswrapper[4715]: I1009 08:33:46.754310 4715 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 08:34:16 crc kubenswrapper[4715]: I1009 
08:34:16.754495 4715 patch_prober.go:28] interesting pod/machine-config-daemon-k7vwx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 08:34:16 crc kubenswrapper[4715]: I1009 08:34:16.755134 4715 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 08:34:16 crc kubenswrapper[4715]: I1009 08:34:16.755209 4715 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" Oct 09 08:34:16 crc kubenswrapper[4715]: I1009 08:34:16.756392 4715 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0d3a521bcd0930bef52f3187725336b9565b7b96aa8ea310dc91dc5198d53fb8"} pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 08:34:16 crc kubenswrapper[4715]: I1009 08:34:16.756531 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" containerID="cri-o://0d3a521bcd0930bef52f3187725336b9565b7b96aa8ea310dc91dc5198d53fb8" gracePeriod=600 Oct 09 08:34:17 crc kubenswrapper[4715]: I1009 08:34:17.737613 4715 generic.go:334] "Generic (PLEG): container finished" podID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerID="0d3a521bcd0930bef52f3187725336b9565b7b96aa8ea310dc91dc5198d53fb8" exitCode=0 Oct 09 
08:34:17 crc kubenswrapper[4715]: I1009 08:34:17.737701 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" event={"ID":"acafd807-8875-4b4f-aba9-4f807ca336e7","Type":"ContainerDied","Data":"0d3a521bcd0930bef52f3187725336b9565b7b96aa8ea310dc91dc5198d53fb8"} Oct 09 08:34:17 crc kubenswrapper[4715]: I1009 08:34:17.738274 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" event={"ID":"acafd807-8875-4b4f-aba9-4f807ca336e7","Type":"ContainerStarted","Data":"e60ed8f42d1d41a28d869e553800a49419e0aee6f8579a58bcefd5162cfb674c"} Oct 09 08:34:17 crc kubenswrapper[4715]: I1009 08:34:17.738301 4715 scope.go:117] "RemoveContainer" containerID="9e11f343d65eb6f97a170083c57eca61210ee13b07c89aa31cd70c6848304c28" Oct 09 08:36:46 crc kubenswrapper[4715]: I1009 08:36:46.753809 4715 patch_prober.go:28] interesting pod/machine-config-daemon-k7vwx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 08:36:46 crc kubenswrapper[4715]: I1009 08:36:46.755720 4715 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 08:37:16 crc kubenswrapper[4715]: I1009 08:37:16.754198 4715 patch_prober.go:28] interesting pod/machine-config-daemon-k7vwx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 08:37:16 crc kubenswrapper[4715]: I1009 
08:37:16.754636 4715 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 08:37:46 crc kubenswrapper[4715]: I1009 08:37:46.753744 4715 patch_prober.go:28] interesting pod/machine-config-daemon-k7vwx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 08:37:46 crc kubenswrapper[4715]: I1009 08:37:46.754306 4715 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 08:37:46 crc kubenswrapper[4715]: I1009 08:37:46.754350 4715 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" Oct 09 08:37:46 crc kubenswrapper[4715]: I1009 08:37:46.755134 4715 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e60ed8f42d1d41a28d869e553800a49419e0aee6f8579a58bcefd5162cfb674c"} pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 08:37:46 crc kubenswrapper[4715]: I1009 08:37:46.755196 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" 
containerName="machine-config-daemon" containerID="cri-o://e60ed8f42d1d41a28d869e553800a49419e0aee6f8579a58bcefd5162cfb674c" gracePeriod=600 Oct 09 08:37:46 crc kubenswrapper[4715]: E1009 08:37:46.892222 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:37:47 crc kubenswrapper[4715]: I1009 08:37:47.807221 4715 generic.go:334] "Generic (PLEG): container finished" podID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerID="e60ed8f42d1d41a28d869e553800a49419e0aee6f8579a58bcefd5162cfb674c" exitCode=0 Oct 09 08:37:47 crc kubenswrapper[4715]: I1009 08:37:47.807281 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" event={"ID":"acafd807-8875-4b4f-aba9-4f807ca336e7","Type":"ContainerDied","Data":"e60ed8f42d1d41a28d869e553800a49419e0aee6f8579a58bcefd5162cfb674c"} Oct 09 08:37:47 crc kubenswrapper[4715]: I1009 08:37:47.807604 4715 scope.go:117] "RemoveContainer" containerID="0d3a521bcd0930bef52f3187725336b9565b7b96aa8ea310dc91dc5198d53fb8" Oct 09 08:37:47 crc kubenswrapper[4715]: I1009 08:37:47.808743 4715 scope.go:117] "RemoveContainer" containerID="e60ed8f42d1d41a28d869e553800a49419e0aee6f8579a58bcefd5162cfb674c" Oct 09 08:37:47 crc kubenswrapper[4715]: E1009 08:37:47.809104 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:37:58 crc kubenswrapper[4715]: I1009 08:37:58.137571 4715 scope.go:117] "RemoveContainer" containerID="e60ed8f42d1d41a28d869e553800a49419e0aee6f8579a58bcefd5162cfb674c" Oct 09 08:37:58 crc kubenswrapper[4715]: E1009 08:37:58.138250 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:38:09 crc kubenswrapper[4715]: I1009 08:38:09.136919 4715 scope.go:117] "RemoveContainer" containerID="e60ed8f42d1d41a28d869e553800a49419e0aee6f8579a58bcefd5162cfb674c" Oct 09 08:38:09 crc kubenswrapper[4715]: E1009 08:38:09.137595 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:38:24 crc kubenswrapper[4715]: I1009 08:38:24.137148 4715 scope.go:117] "RemoveContainer" containerID="e60ed8f42d1d41a28d869e553800a49419e0aee6f8579a58bcefd5162cfb674c" Oct 09 08:38:24 crc kubenswrapper[4715]: E1009 08:38:24.138466 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:38:36 crc kubenswrapper[4715]: I1009 08:38:36.136532 4715 scope.go:117] "RemoveContainer" containerID="e60ed8f42d1d41a28d869e553800a49419e0aee6f8579a58bcefd5162cfb674c" Oct 09 08:38:36 crc kubenswrapper[4715]: E1009 08:38:36.137188 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:38:49 crc kubenswrapper[4715]: I1009 08:38:49.137578 4715 scope.go:117] "RemoveContainer" containerID="e60ed8f42d1d41a28d869e553800a49419e0aee6f8579a58bcefd5162cfb674c" Oct 09 08:38:49 crc kubenswrapper[4715]: E1009 08:38:49.138447 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:39:02 crc kubenswrapper[4715]: I1009 08:39:02.137579 4715 scope.go:117] "RemoveContainer" containerID="e60ed8f42d1d41a28d869e553800a49419e0aee6f8579a58bcefd5162cfb674c" Oct 09 08:39:02 crc kubenswrapper[4715]: E1009 08:39:02.138303 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:39:16 crc kubenswrapper[4715]: I1009 08:39:16.136585 4715 scope.go:117] "RemoveContainer" containerID="e60ed8f42d1d41a28d869e553800a49419e0aee6f8579a58bcefd5162cfb674c" Oct 09 08:39:16 crc kubenswrapper[4715]: E1009 08:39:16.137283 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:39:31 crc kubenswrapper[4715]: I1009 08:39:31.137634 4715 scope.go:117] "RemoveContainer" containerID="e60ed8f42d1d41a28d869e553800a49419e0aee6f8579a58bcefd5162cfb674c" Oct 09 08:39:31 crc kubenswrapper[4715]: E1009 08:39:31.138552 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:39:44 crc kubenswrapper[4715]: I1009 08:39:44.137121 4715 scope.go:117] "RemoveContainer" containerID="e60ed8f42d1d41a28d869e553800a49419e0aee6f8579a58bcefd5162cfb674c" Oct 09 08:39:44 crc kubenswrapper[4715]: E1009 08:39:44.139300 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:39:58 crc kubenswrapper[4715]: I1009 08:39:58.137747 4715 scope.go:117] "RemoveContainer" containerID="e60ed8f42d1d41a28d869e553800a49419e0aee6f8579a58bcefd5162cfb674c" Oct 09 08:39:58 crc kubenswrapper[4715]: E1009 08:39:58.138855 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:40:13 crc kubenswrapper[4715]: I1009 08:40:13.137700 4715 scope.go:117] "RemoveContainer" containerID="e60ed8f42d1d41a28d869e553800a49419e0aee6f8579a58bcefd5162cfb674c" Oct 09 08:40:13 crc kubenswrapper[4715]: E1009 08:40:13.138565 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:40:25 crc kubenswrapper[4715]: I1009 08:40:25.137084 4715 scope.go:117] "RemoveContainer" containerID="e60ed8f42d1d41a28d869e553800a49419e0aee6f8579a58bcefd5162cfb674c" Oct 09 08:40:25 crc kubenswrapper[4715]: E1009 08:40:25.138009 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:40:40 crc kubenswrapper[4715]: I1009 08:40:40.142772 4715 scope.go:117] "RemoveContainer" containerID="e60ed8f42d1d41a28d869e553800a49419e0aee6f8579a58bcefd5162cfb674c" Oct 09 08:40:40 crc kubenswrapper[4715]: E1009 08:40:40.144476 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:40:52 crc kubenswrapper[4715]: I1009 08:40:52.137149 4715 scope.go:117] "RemoveContainer" containerID="e60ed8f42d1d41a28d869e553800a49419e0aee6f8579a58bcefd5162cfb674c" Oct 09 08:40:52 crc kubenswrapper[4715]: E1009 08:40:52.138344 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:41:05 crc kubenswrapper[4715]: I1009 08:41:05.136579 4715 scope.go:117] "RemoveContainer" containerID="e60ed8f42d1d41a28d869e553800a49419e0aee6f8579a58bcefd5162cfb674c" Oct 09 08:41:05 crc kubenswrapper[4715]: E1009 08:41:05.137591 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:41:18 crc kubenswrapper[4715]: I1009 08:41:18.137307 4715 scope.go:117] "RemoveContainer" containerID="e60ed8f42d1d41a28d869e553800a49419e0aee6f8579a58bcefd5162cfb674c" Oct 09 08:41:18 crc kubenswrapper[4715]: E1009 08:41:18.138139 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:41:33 crc kubenswrapper[4715]: I1009 08:41:33.137057 4715 scope.go:117] "RemoveContainer" containerID="e60ed8f42d1d41a28d869e553800a49419e0aee6f8579a58bcefd5162cfb674c" Oct 09 08:41:33 crc kubenswrapper[4715]: E1009 08:41:33.138069 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:41:48 crc kubenswrapper[4715]: I1009 08:41:48.137549 4715 scope.go:117] "RemoveContainer" containerID="e60ed8f42d1d41a28d869e553800a49419e0aee6f8579a58bcefd5162cfb674c" Oct 09 08:41:48 crc kubenswrapper[4715]: E1009 08:41:48.138318 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:41:59 crc kubenswrapper[4715]: I1009 08:41:59.137525 4715 scope.go:117] "RemoveContainer" containerID="e60ed8f42d1d41a28d869e553800a49419e0aee6f8579a58bcefd5162cfb674c" Oct 09 08:41:59 crc kubenswrapper[4715]: E1009 08:41:59.139116 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:42:01 crc kubenswrapper[4715]: I1009 08:42:01.708790 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wq2bx"] Oct 09 08:42:01 crc kubenswrapper[4715]: E1009 08:42:01.712924 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1d62d9a-da53-47f0-80e9-8cb838bb39cd" containerName="extract-utilities" Oct 09 08:42:01 crc kubenswrapper[4715]: I1009 08:42:01.712954 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1d62d9a-da53-47f0-80e9-8cb838bb39cd" containerName="extract-utilities" Oct 09 08:42:01 crc kubenswrapper[4715]: E1009 08:42:01.712997 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1d62d9a-da53-47f0-80e9-8cb838bb39cd" containerName="extract-content" Oct 09 08:42:01 crc kubenswrapper[4715]: I1009 08:42:01.713010 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1d62d9a-da53-47f0-80e9-8cb838bb39cd" containerName="extract-content" Oct 09 08:42:01 crc 
kubenswrapper[4715]: E1009 08:42:01.713030 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59" containerName="extract-content" Oct 09 08:42:01 crc kubenswrapper[4715]: I1009 08:42:01.713042 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59" containerName="extract-content" Oct 09 08:42:01 crc kubenswrapper[4715]: E1009 08:42:01.713074 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="944f4e67-23a1-4024-a5c2-180f17dea29c" containerName="registry-server" Oct 09 08:42:01 crc kubenswrapper[4715]: I1009 08:42:01.713085 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="944f4e67-23a1-4024-a5c2-180f17dea29c" containerName="registry-server" Oct 09 08:42:01 crc kubenswrapper[4715]: E1009 08:42:01.713108 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1d62d9a-da53-47f0-80e9-8cb838bb39cd" containerName="registry-server" Oct 09 08:42:01 crc kubenswrapper[4715]: I1009 08:42:01.713119 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1d62d9a-da53-47f0-80e9-8cb838bb39cd" containerName="registry-server" Oct 09 08:42:01 crc kubenswrapper[4715]: E1009 08:42:01.713133 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59" containerName="extract-utilities" Oct 09 08:42:01 crc kubenswrapper[4715]: I1009 08:42:01.713144 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59" containerName="extract-utilities" Oct 09 08:42:01 crc kubenswrapper[4715]: E1009 08:42:01.713158 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b" containerName="registry-server" Oct 09 08:42:01 crc kubenswrapper[4715]: I1009 08:42:01.713168 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b" containerName="registry-server" Oct 09 08:42:01 crc 
kubenswrapper[4715]: E1009 08:42:01.713186 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="944f4e67-23a1-4024-a5c2-180f17dea29c" containerName="extract-utilities" Oct 09 08:42:01 crc kubenswrapper[4715]: I1009 08:42:01.713199 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="944f4e67-23a1-4024-a5c2-180f17dea29c" containerName="extract-utilities" Oct 09 08:42:01 crc kubenswrapper[4715]: E1009 08:42:01.713223 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b" containerName="extract-utilities" Oct 09 08:42:01 crc kubenswrapper[4715]: I1009 08:42:01.713234 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b" containerName="extract-utilities" Oct 09 08:42:01 crc kubenswrapper[4715]: E1009 08:42:01.713257 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b" containerName="extract-content" Oct 09 08:42:01 crc kubenswrapper[4715]: I1009 08:42:01.713271 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b" containerName="extract-content" Oct 09 08:42:01 crc kubenswrapper[4715]: E1009 08:42:01.713289 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="944f4e67-23a1-4024-a5c2-180f17dea29c" containerName="extract-content" Oct 09 08:42:01 crc kubenswrapper[4715]: I1009 08:42:01.713300 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="944f4e67-23a1-4024-a5c2-180f17dea29c" containerName="extract-content" Oct 09 08:42:01 crc kubenswrapper[4715]: E1009 08:42:01.713317 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59" containerName="registry-server" Oct 09 08:42:01 crc kubenswrapper[4715]: I1009 08:42:01.713328 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59" containerName="registry-server" Oct 09 08:42:01 crc 
kubenswrapper[4715]: I1009 08:42:01.713681 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dd78bc5-ee35-4e68-9190-d0bf4c3b2a9b" containerName="registry-server" Oct 09 08:42:01 crc kubenswrapper[4715]: I1009 08:42:01.713704 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bfb47ed-6dce-4b9b-ad71-7f02b31fcf59" containerName="registry-server" Oct 09 08:42:01 crc kubenswrapper[4715]: I1009 08:42:01.713743 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1d62d9a-da53-47f0-80e9-8cb838bb39cd" containerName="registry-server" Oct 09 08:42:01 crc kubenswrapper[4715]: I1009 08:42:01.713765 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="944f4e67-23a1-4024-a5c2-180f17dea29c" containerName="registry-server" Oct 09 08:42:01 crc kubenswrapper[4715]: I1009 08:42:01.716030 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wq2bx" Oct 09 08:42:01 crc kubenswrapper[4715]: I1009 08:42:01.723874 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wq2bx"] Oct 09 08:42:01 crc kubenswrapper[4715]: I1009 08:42:01.828093 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/236464f9-1eb5-4fd5-96e0-e8a70006ca16-utilities\") pod \"certified-operators-wq2bx\" (UID: \"236464f9-1eb5-4fd5-96e0-e8a70006ca16\") " pod="openshift-marketplace/certified-operators-wq2bx" Oct 09 08:42:01 crc kubenswrapper[4715]: I1009 08:42:01.828244 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/236464f9-1eb5-4fd5-96e0-e8a70006ca16-catalog-content\") pod \"certified-operators-wq2bx\" (UID: \"236464f9-1eb5-4fd5-96e0-e8a70006ca16\") " pod="openshift-marketplace/certified-operators-wq2bx" Oct 09 08:42:01 crc 
kubenswrapper[4715]: I1009 08:42:01.828300 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f5ds\" (UniqueName: \"kubernetes.io/projected/236464f9-1eb5-4fd5-96e0-e8a70006ca16-kube-api-access-4f5ds\") pod \"certified-operators-wq2bx\" (UID: \"236464f9-1eb5-4fd5-96e0-e8a70006ca16\") " pod="openshift-marketplace/certified-operators-wq2bx" Oct 09 08:42:01 crc kubenswrapper[4715]: I1009 08:42:01.929845 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/236464f9-1eb5-4fd5-96e0-e8a70006ca16-utilities\") pod \"certified-operators-wq2bx\" (UID: \"236464f9-1eb5-4fd5-96e0-e8a70006ca16\") " pod="openshift-marketplace/certified-operators-wq2bx" Oct 09 08:42:01 crc kubenswrapper[4715]: I1009 08:42:01.929990 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/236464f9-1eb5-4fd5-96e0-e8a70006ca16-catalog-content\") pod \"certified-operators-wq2bx\" (UID: \"236464f9-1eb5-4fd5-96e0-e8a70006ca16\") " pod="openshift-marketplace/certified-operators-wq2bx" Oct 09 08:42:01 crc kubenswrapper[4715]: I1009 08:42:01.930051 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f5ds\" (UniqueName: \"kubernetes.io/projected/236464f9-1eb5-4fd5-96e0-e8a70006ca16-kube-api-access-4f5ds\") pod \"certified-operators-wq2bx\" (UID: \"236464f9-1eb5-4fd5-96e0-e8a70006ca16\") " pod="openshift-marketplace/certified-operators-wq2bx" Oct 09 08:42:01 crc kubenswrapper[4715]: I1009 08:42:01.930442 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/236464f9-1eb5-4fd5-96e0-e8a70006ca16-utilities\") pod \"certified-operators-wq2bx\" (UID: \"236464f9-1eb5-4fd5-96e0-e8a70006ca16\") " pod="openshift-marketplace/certified-operators-wq2bx" Oct 09 08:42:01 crc 
kubenswrapper[4715]: I1009 08:42:01.930442 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/236464f9-1eb5-4fd5-96e0-e8a70006ca16-catalog-content\") pod \"certified-operators-wq2bx\" (UID: \"236464f9-1eb5-4fd5-96e0-e8a70006ca16\") " pod="openshift-marketplace/certified-operators-wq2bx" Oct 09 08:42:01 crc kubenswrapper[4715]: I1009 08:42:01.952270 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f5ds\" (UniqueName: \"kubernetes.io/projected/236464f9-1eb5-4fd5-96e0-e8a70006ca16-kube-api-access-4f5ds\") pod \"certified-operators-wq2bx\" (UID: \"236464f9-1eb5-4fd5-96e0-e8a70006ca16\") " pod="openshift-marketplace/certified-operators-wq2bx" Oct 09 08:42:02 crc kubenswrapper[4715]: I1009 08:42:02.045099 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wq2bx" Oct 09 08:42:02 crc kubenswrapper[4715]: I1009 08:42:02.570577 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wq2bx"] Oct 09 08:42:03 crc kubenswrapper[4715]: I1009 08:42:03.375506 4715 generic.go:334] "Generic (PLEG): container finished" podID="236464f9-1eb5-4fd5-96e0-e8a70006ca16" containerID="ddae13cf23e731a21032b4cbe3c056523560a68bbd40e727120e0aeb9c9cda1a" exitCode=0 Oct 09 08:42:03 crc kubenswrapper[4715]: I1009 08:42:03.375562 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wq2bx" event={"ID":"236464f9-1eb5-4fd5-96e0-e8a70006ca16","Type":"ContainerDied","Data":"ddae13cf23e731a21032b4cbe3c056523560a68bbd40e727120e0aeb9c9cda1a"} Oct 09 08:42:03 crc kubenswrapper[4715]: I1009 08:42:03.375602 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wq2bx" 
event={"ID":"236464f9-1eb5-4fd5-96e0-e8a70006ca16","Type":"ContainerStarted","Data":"ca3ce05098b544f6bb5dbc460c7b961294a7fc1e0fdbcc7f21a2db42ff9724a0"} Oct 09 08:42:03 crc kubenswrapper[4715]: I1009 08:42:03.379653 4715 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 08:42:05 crc kubenswrapper[4715]: I1009 08:42:05.395687 4715 generic.go:334] "Generic (PLEG): container finished" podID="236464f9-1eb5-4fd5-96e0-e8a70006ca16" containerID="99aaad61a97a2a1d0023240a9b12be913b0aa7f85a1f9504d4b01205819a92eb" exitCode=0 Oct 09 08:42:05 crc kubenswrapper[4715]: I1009 08:42:05.395730 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wq2bx" event={"ID":"236464f9-1eb5-4fd5-96e0-e8a70006ca16","Type":"ContainerDied","Data":"99aaad61a97a2a1d0023240a9b12be913b0aa7f85a1f9504d4b01205819a92eb"} Oct 09 08:42:06 crc kubenswrapper[4715]: I1009 08:42:06.408604 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wq2bx" event={"ID":"236464f9-1eb5-4fd5-96e0-e8a70006ca16","Type":"ContainerStarted","Data":"758115a387f94a7939a5ecff12e3ed9c4397a9c667a9f1afac78b71342f55ab0"} Oct 09 08:42:06 crc kubenswrapper[4715]: I1009 08:42:06.431911 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wq2bx" podStartSLOduration=2.9354923 podStartE2EDuration="5.431885419s" podCreationTimestamp="2025-10-09 08:42:01 +0000 UTC" firstStartedPulling="2025-10-09 08:42:03.379368897 +0000 UTC m=+3354.072172915" lastFinishedPulling="2025-10-09 08:42:05.875762026 +0000 UTC m=+3356.568566034" observedRunningTime="2025-10-09 08:42:06.430383676 +0000 UTC m=+3357.123187714" watchObservedRunningTime="2025-10-09 08:42:06.431885419 +0000 UTC m=+3357.124689447" Oct 09 08:42:11 crc kubenswrapper[4715]: I1009 08:42:11.137242 4715 scope.go:117] "RemoveContainer" 
containerID="e60ed8f42d1d41a28d869e553800a49419e0aee6f8579a58bcefd5162cfb674c" Oct 09 08:42:11 crc kubenswrapper[4715]: E1009 08:42:11.138137 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:42:12 crc kubenswrapper[4715]: I1009 08:42:12.045593 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wq2bx" Oct 09 08:42:12 crc kubenswrapper[4715]: I1009 08:42:12.046471 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wq2bx" Oct 09 08:42:12 crc kubenswrapper[4715]: I1009 08:42:12.093945 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wq2bx" Oct 09 08:42:12 crc kubenswrapper[4715]: I1009 08:42:12.525532 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wq2bx" Oct 09 08:42:12 crc kubenswrapper[4715]: I1009 08:42:12.570819 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wq2bx"] Oct 09 08:42:14 crc kubenswrapper[4715]: I1009 08:42:14.484909 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wq2bx" podUID="236464f9-1eb5-4fd5-96e0-e8a70006ca16" containerName="registry-server" containerID="cri-o://758115a387f94a7939a5ecff12e3ed9c4397a9c667a9f1afac78b71342f55ab0" gracePeriod=2 Oct 09 08:42:14 crc kubenswrapper[4715]: I1009 08:42:14.996524 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wq2bx" Oct 09 08:42:15 crc kubenswrapper[4715]: I1009 08:42:15.177163 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4f5ds\" (UniqueName: \"kubernetes.io/projected/236464f9-1eb5-4fd5-96e0-e8a70006ca16-kube-api-access-4f5ds\") pod \"236464f9-1eb5-4fd5-96e0-e8a70006ca16\" (UID: \"236464f9-1eb5-4fd5-96e0-e8a70006ca16\") " Oct 09 08:42:15 crc kubenswrapper[4715]: I1009 08:42:15.177277 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/236464f9-1eb5-4fd5-96e0-e8a70006ca16-catalog-content\") pod \"236464f9-1eb5-4fd5-96e0-e8a70006ca16\" (UID: \"236464f9-1eb5-4fd5-96e0-e8a70006ca16\") " Oct 09 08:42:15 crc kubenswrapper[4715]: I1009 08:42:15.177452 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/236464f9-1eb5-4fd5-96e0-e8a70006ca16-utilities\") pod \"236464f9-1eb5-4fd5-96e0-e8a70006ca16\" (UID: \"236464f9-1eb5-4fd5-96e0-e8a70006ca16\") " Oct 09 08:42:15 crc kubenswrapper[4715]: I1009 08:42:15.178676 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/236464f9-1eb5-4fd5-96e0-e8a70006ca16-utilities" (OuterVolumeSpecName: "utilities") pod "236464f9-1eb5-4fd5-96e0-e8a70006ca16" (UID: "236464f9-1eb5-4fd5-96e0-e8a70006ca16"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:42:15 crc kubenswrapper[4715]: I1009 08:42:15.187680 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/236464f9-1eb5-4fd5-96e0-e8a70006ca16-kube-api-access-4f5ds" (OuterVolumeSpecName: "kube-api-access-4f5ds") pod "236464f9-1eb5-4fd5-96e0-e8a70006ca16" (UID: "236464f9-1eb5-4fd5-96e0-e8a70006ca16"). InnerVolumeSpecName "kube-api-access-4f5ds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:42:15 crc kubenswrapper[4715]: I1009 08:42:15.236645 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/236464f9-1eb5-4fd5-96e0-e8a70006ca16-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "236464f9-1eb5-4fd5-96e0-e8a70006ca16" (UID: "236464f9-1eb5-4fd5-96e0-e8a70006ca16"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:42:15 crc kubenswrapper[4715]: I1009 08:42:15.280007 4715 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/236464f9-1eb5-4fd5-96e0-e8a70006ca16-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 08:42:15 crc kubenswrapper[4715]: I1009 08:42:15.280043 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4f5ds\" (UniqueName: \"kubernetes.io/projected/236464f9-1eb5-4fd5-96e0-e8a70006ca16-kube-api-access-4f5ds\") on node \"crc\" DevicePath \"\"" Oct 09 08:42:15 crc kubenswrapper[4715]: I1009 08:42:15.280055 4715 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/236464f9-1eb5-4fd5-96e0-e8a70006ca16-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 08:42:15 crc kubenswrapper[4715]: I1009 08:42:15.495151 4715 generic.go:334] "Generic (PLEG): container finished" podID="236464f9-1eb5-4fd5-96e0-e8a70006ca16" containerID="758115a387f94a7939a5ecff12e3ed9c4397a9c667a9f1afac78b71342f55ab0" exitCode=0 Oct 09 08:42:15 crc kubenswrapper[4715]: I1009 08:42:15.495196 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wq2bx" Oct 09 08:42:15 crc kubenswrapper[4715]: I1009 08:42:15.495203 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wq2bx" event={"ID":"236464f9-1eb5-4fd5-96e0-e8a70006ca16","Type":"ContainerDied","Data":"758115a387f94a7939a5ecff12e3ed9c4397a9c667a9f1afac78b71342f55ab0"} Oct 09 08:42:15 crc kubenswrapper[4715]: I1009 08:42:15.495242 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wq2bx" event={"ID":"236464f9-1eb5-4fd5-96e0-e8a70006ca16","Type":"ContainerDied","Data":"ca3ce05098b544f6bb5dbc460c7b961294a7fc1e0fdbcc7f21a2db42ff9724a0"} Oct 09 08:42:15 crc kubenswrapper[4715]: I1009 08:42:15.495264 4715 scope.go:117] "RemoveContainer" containerID="758115a387f94a7939a5ecff12e3ed9c4397a9c667a9f1afac78b71342f55ab0" Oct 09 08:42:15 crc kubenswrapper[4715]: I1009 08:42:15.516033 4715 scope.go:117] "RemoveContainer" containerID="99aaad61a97a2a1d0023240a9b12be913b0aa7f85a1f9504d4b01205819a92eb" Oct 09 08:42:15 crc kubenswrapper[4715]: I1009 08:42:15.531871 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wq2bx"] Oct 09 08:42:15 crc kubenswrapper[4715]: I1009 08:42:15.544975 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wq2bx"] Oct 09 08:42:15 crc kubenswrapper[4715]: I1009 08:42:15.552159 4715 scope.go:117] "RemoveContainer" containerID="ddae13cf23e731a21032b4cbe3c056523560a68bbd40e727120e0aeb9c9cda1a" Oct 09 08:42:15 crc kubenswrapper[4715]: I1009 08:42:15.583328 4715 scope.go:117] "RemoveContainer" containerID="758115a387f94a7939a5ecff12e3ed9c4397a9c667a9f1afac78b71342f55ab0" Oct 09 08:42:15 crc kubenswrapper[4715]: E1009 08:42:15.583768 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"758115a387f94a7939a5ecff12e3ed9c4397a9c667a9f1afac78b71342f55ab0\": container with ID starting with 758115a387f94a7939a5ecff12e3ed9c4397a9c667a9f1afac78b71342f55ab0 not found: ID does not exist" containerID="758115a387f94a7939a5ecff12e3ed9c4397a9c667a9f1afac78b71342f55ab0" Oct 09 08:42:15 crc kubenswrapper[4715]: I1009 08:42:15.583814 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"758115a387f94a7939a5ecff12e3ed9c4397a9c667a9f1afac78b71342f55ab0"} err="failed to get container status \"758115a387f94a7939a5ecff12e3ed9c4397a9c667a9f1afac78b71342f55ab0\": rpc error: code = NotFound desc = could not find container \"758115a387f94a7939a5ecff12e3ed9c4397a9c667a9f1afac78b71342f55ab0\": container with ID starting with 758115a387f94a7939a5ecff12e3ed9c4397a9c667a9f1afac78b71342f55ab0 not found: ID does not exist" Oct 09 08:42:15 crc kubenswrapper[4715]: I1009 08:42:15.583842 4715 scope.go:117] "RemoveContainer" containerID="99aaad61a97a2a1d0023240a9b12be913b0aa7f85a1f9504d4b01205819a92eb" Oct 09 08:42:15 crc kubenswrapper[4715]: E1009 08:42:15.584293 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99aaad61a97a2a1d0023240a9b12be913b0aa7f85a1f9504d4b01205819a92eb\": container with ID starting with 99aaad61a97a2a1d0023240a9b12be913b0aa7f85a1f9504d4b01205819a92eb not found: ID does not exist" containerID="99aaad61a97a2a1d0023240a9b12be913b0aa7f85a1f9504d4b01205819a92eb" Oct 09 08:42:15 crc kubenswrapper[4715]: I1009 08:42:15.584328 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99aaad61a97a2a1d0023240a9b12be913b0aa7f85a1f9504d4b01205819a92eb"} err="failed to get container status \"99aaad61a97a2a1d0023240a9b12be913b0aa7f85a1f9504d4b01205819a92eb\": rpc error: code = NotFound desc = could not find container \"99aaad61a97a2a1d0023240a9b12be913b0aa7f85a1f9504d4b01205819a92eb\": container with ID 
starting with 99aaad61a97a2a1d0023240a9b12be913b0aa7f85a1f9504d4b01205819a92eb not found: ID does not exist" Oct 09 08:42:15 crc kubenswrapper[4715]: I1009 08:42:15.584355 4715 scope.go:117] "RemoveContainer" containerID="ddae13cf23e731a21032b4cbe3c056523560a68bbd40e727120e0aeb9c9cda1a" Oct 09 08:42:15 crc kubenswrapper[4715]: E1009 08:42:15.584645 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddae13cf23e731a21032b4cbe3c056523560a68bbd40e727120e0aeb9c9cda1a\": container with ID starting with ddae13cf23e731a21032b4cbe3c056523560a68bbd40e727120e0aeb9c9cda1a not found: ID does not exist" containerID="ddae13cf23e731a21032b4cbe3c056523560a68bbd40e727120e0aeb9c9cda1a" Oct 09 08:42:15 crc kubenswrapper[4715]: I1009 08:42:15.584672 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddae13cf23e731a21032b4cbe3c056523560a68bbd40e727120e0aeb9c9cda1a"} err="failed to get container status \"ddae13cf23e731a21032b4cbe3c056523560a68bbd40e727120e0aeb9c9cda1a\": rpc error: code = NotFound desc = could not find container \"ddae13cf23e731a21032b4cbe3c056523560a68bbd40e727120e0aeb9c9cda1a\": container with ID starting with ddae13cf23e731a21032b4cbe3c056523560a68bbd40e727120e0aeb9c9cda1a not found: ID does not exist" Oct 09 08:42:16 crc kubenswrapper[4715]: I1009 08:42:16.147341 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="236464f9-1eb5-4fd5-96e0-e8a70006ca16" path="/var/lib/kubelet/pods/236464f9-1eb5-4fd5-96e0-e8a70006ca16/volumes" Oct 09 08:42:23 crc kubenswrapper[4715]: I1009 08:42:23.137857 4715 scope.go:117] "RemoveContainer" containerID="e60ed8f42d1d41a28d869e553800a49419e0aee6f8579a58bcefd5162cfb674c" Oct 09 08:42:23 crc kubenswrapper[4715]: E1009 08:42:23.138714 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:42:26 crc kubenswrapper[4715]: I1009 08:42:26.539145 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hwrl4"] Oct 09 08:42:26 crc kubenswrapper[4715]: E1009 08:42:26.540900 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="236464f9-1eb5-4fd5-96e0-e8a70006ca16" containerName="extract-utilities" Oct 09 08:42:26 crc kubenswrapper[4715]: I1009 08:42:26.540934 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="236464f9-1eb5-4fd5-96e0-e8a70006ca16" containerName="extract-utilities" Oct 09 08:42:26 crc kubenswrapper[4715]: E1009 08:42:26.540989 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="236464f9-1eb5-4fd5-96e0-e8a70006ca16" containerName="extract-content" Oct 09 08:42:26 crc kubenswrapper[4715]: I1009 08:42:26.541009 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="236464f9-1eb5-4fd5-96e0-e8a70006ca16" containerName="extract-content" Oct 09 08:42:26 crc kubenswrapper[4715]: E1009 08:42:26.541074 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="236464f9-1eb5-4fd5-96e0-e8a70006ca16" containerName="registry-server" Oct 09 08:42:26 crc kubenswrapper[4715]: I1009 08:42:26.541092 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="236464f9-1eb5-4fd5-96e0-e8a70006ca16" containerName="registry-server" Oct 09 08:42:26 crc kubenswrapper[4715]: I1009 08:42:26.541750 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="236464f9-1eb5-4fd5-96e0-e8a70006ca16" containerName="registry-server" Oct 09 08:42:26 crc kubenswrapper[4715]: I1009 08:42:26.545612 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hwrl4" Oct 09 08:42:26 crc kubenswrapper[4715]: I1009 08:42:26.564664 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hwrl4"] Oct 09 08:42:26 crc kubenswrapper[4715]: I1009 08:42:26.720608 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70-catalog-content\") pod \"redhat-operators-hwrl4\" (UID: \"36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70\") " pod="openshift-marketplace/redhat-operators-hwrl4" Oct 09 08:42:26 crc kubenswrapper[4715]: I1009 08:42:26.720699 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70-utilities\") pod \"redhat-operators-hwrl4\" (UID: \"36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70\") " pod="openshift-marketplace/redhat-operators-hwrl4" Oct 09 08:42:26 crc kubenswrapper[4715]: I1009 08:42:26.720781 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmzrb\" (UniqueName: \"kubernetes.io/projected/36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70-kube-api-access-tmzrb\") pod \"redhat-operators-hwrl4\" (UID: \"36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70\") " pod="openshift-marketplace/redhat-operators-hwrl4" Oct 09 08:42:26 crc kubenswrapper[4715]: I1009 08:42:26.822632 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70-utilities\") pod \"redhat-operators-hwrl4\" (UID: \"36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70\") " pod="openshift-marketplace/redhat-operators-hwrl4" Oct 09 08:42:26 crc kubenswrapper[4715]: I1009 08:42:26.822754 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-tmzrb\" (UniqueName: \"kubernetes.io/projected/36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70-kube-api-access-tmzrb\") pod \"redhat-operators-hwrl4\" (UID: \"36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70\") " pod="openshift-marketplace/redhat-operators-hwrl4" Oct 09 08:42:26 crc kubenswrapper[4715]: I1009 08:42:26.822871 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70-catalog-content\") pod \"redhat-operators-hwrl4\" (UID: \"36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70\") " pod="openshift-marketplace/redhat-operators-hwrl4" Oct 09 08:42:26 crc kubenswrapper[4715]: I1009 08:42:26.823224 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70-utilities\") pod \"redhat-operators-hwrl4\" (UID: \"36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70\") " pod="openshift-marketplace/redhat-operators-hwrl4" Oct 09 08:42:26 crc kubenswrapper[4715]: I1009 08:42:26.823337 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70-catalog-content\") pod \"redhat-operators-hwrl4\" (UID: \"36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70\") " pod="openshift-marketplace/redhat-operators-hwrl4" Oct 09 08:42:26 crc kubenswrapper[4715]: I1009 08:42:26.848702 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmzrb\" (UniqueName: \"kubernetes.io/projected/36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70-kube-api-access-tmzrb\") pod \"redhat-operators-hwrl4\" (UID: \"36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70\") " pod="openshift-marketplace/redhat-operators-hwrl4" Oct 09 08:42:26 crc kubenswrapper[4715]: I1009 08:42:26.872288 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hwrl4" Oct 09 08:42:27 crc kubenswrapper[4715]: I1009 08:42:27.327707 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hwrl4"] Oct 09 08:42:27 crc kubenswrapper[4715]: I1009 08:42:27.658726 4715 generic.go:334] "Generic (PLEG): container finished" podID="36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70" containerID="f0c41b35384af1afee6d52d84a7b8158c680364acd0a7aa349081f90263fc6dd" exitCode=0 Oct 09 08:42:27 crc kubenswrapper[4715]: I1009 08:42:27.658758 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hwrl4" event={"ID":"36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70","Type":"ContainerDied","Data":"f0c41b35384af1afee6d52d84a7b8158c680364acd0a7aa349081f90263fc6dd"} Oct 09 08:42:27 crc kubenswrapper[4715]: I1009 08:42:27.658856 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hwrl4" event={"ID":"36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70","Type":"ContainerStarted","Data":"9b049dbf7d92a19956b7c1b982c27716c540a695f61373117ddd95c6aa194cb5"} Oct 09 08:42:28 crc kubenswrapper[4715]: I1009 08:42:28.678593 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hwrl4" event={"ID":"36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70","Type":"ContainerStarted","Data":"7713a3166ae43de6d60920bd91a74cc40b3337b550b6fffb1d78c6a3039c5914"} Oct 09 08:42:28 crc kubenswrapper[4715]: I1009 08:42:28.908271 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wvztr"] Oct 09 08:42:28 crc kubenswrapper[4715]: I1009 08:42:28.910712 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wvztr" Oct 09 08:42:28 crc kubenswrapper[4715]: I1009 08:42:28.919647 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wvztr"] Oct 09 08:42:29 crc kubenswrapper[4715]: I1009 08:42:29.074300 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa-utilities\") pod \"redhat-marketplace-wvztr\" (UID: \"fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa\") " pod="openshift-marketplace/redhat-marketplace-wvztr" Oct 09 08:42:29 crc kubenswrapper[4715]: I1009 08:42:29.074630 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa-catalog-content\") pod \"redhat-marketplace-wvztr\" (UID: \"fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa\") " pod="openshift-marketplace/redhat-marketplace-wvztr" Oct 09 08:42:29 crc kubenswrapper[4715]: I1009 08:42:29.074797 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7tfk\" (UniqueName: \"kubernetes.io/projected/fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa-kube-api-access-m7tfk\") pod \"redhat-marketplace-wvztr\" (UID: \"fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa\") " pod="openshift-marketplace/redhat-marketplace-wvztr" Oct 09 08:42:29 crc kubenswrapper[4715]: I1009 08:42:29.177940 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7tfk\" (UniqueName: \"kubernetes.io/projected/fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa-kube-api-access-m7tfk\") pod \"redhat-marketplace-wvztr\" (UID: \"fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa\") " pod="openshift-marketplace/redhat-marketplace-wvztr" Oct 09 08:42:29 crc kubenswrapper[4715]: I1009 08:42:29.178007 4715 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa-utilities\") pod \"redhat-marketplace-wvztr\" (UID: \"fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa\") " pod="openshift-marketplace/redhat-marketplace-wvztr" Oct 09 08:42:29 crc kubenswrapper[4715]: I1009 08:42:29.178041 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa-catalog-content\") pod \"redhat-marketplace-wvztr\" (UID: \"fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa\") " pod="openshift-marketplace/redhat-marketplace-wvztr" Oct 09 08:42:29 crc kubenswrapper[4715]: I1009 08:42:29.178487 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa-utilities\") pod \"redhat-marketplace-wvztr\" (UID: \"fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa\") " pod="openshift-marketplace/redhat-marketplace-wvztr" Oct 09 08:42:29 crc kubenswrapper[4715]: I1009 08:42:29.178545 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa-catalog-content\") pod \"redhat-marketplace-wvztr\" (UID: \"fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa\") " pod="openshift-marketplace/redhat-marketplace-wvztr" Oct 09 08:42:29 crc kubenswrapper[4715]: I1009 08:42:29.202369 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7tfk\" (UniqueName: \"kubernetes.io/projected/fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa-kube-api-access-m7tfk\") pod \"redhat-marketplace-wvztr\" (UID: \"fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa\") " pod="openshift-marketplace/redhat-marketplace-wvztr" Oct 09 08:42:29 crc kubenswrapper[4715]: I1009 08:42:29.242029 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wvztr" Oct 09 08:42:29 crc kubenswrapper[4715]: I1009 08:42:29.694490 4715 generic.go:334] "Generic (PLEG): container finished" podID="36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70" containerID="7713a3166ae43de6d60920bd91a74cc40b3337b550b6fffb1d78c6a3039c5914" exitCode=0 Oct 09 08:42:29 crc kubenswrapper[4715]: I1009 08:42:29.694550 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hwrl4" event={"ID":"36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70","Type":"ContainerDied","Data":"7713a3166ae43de6d60920bd91a74cc40b3337b550b6fffb1d78c6a3039c5914"} Oct 09 08:42:29 crc kubenswrapper[4715]: I1009 08:42:29.702896 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wvztr"] Oct 09 08:42:30 crc kubenswrapper[4715]: I1009 08:42:30.704760 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wvztr" event={"ID":"fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa","Type":"ContainerStarted","Data":"8f4299696bd4e79d6ef7ffb9239e2a1962ea2c400355b95424b913e36d7cabe0"} Oct 09 08:42:30 crc kubenswrapper[4715]: I1009 08:42:30.705212 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wvztr" event={"ID":"fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa","Type":"ContainerStarted","Data":"227bfbdc00c7653701011323bb66f1f8307519bbea112333db49ee737cc4555c"} Oct 09 08:42:31 crc kubenswrapper[4715]: I1009 08:42:31.309766 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nwtfj"] Oct 09 08:42:31 crc kubenswrapper[4715]: I1009 08:42:31.312476 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nwtfj" Oct 09 08:42:31 crc kubenswrapper[4715]: I1009 08:42:31.344709 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nwtfj"] Oct 09 08:42:31 crc kubenswrapper[4715]: I1009 08:42:31.441721 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/090a2152-0f48-49f8-8e7f-6250c014eeaa-catalog-content\") pod \"community-operators-nwtfj\" (UID: \"090a2152-0f48-49f8-8e7f-6250c014eeaa\") " pod="openshift-marketplace/community-operators-nwtfj" Oct 09 08:42:31 crc kubenswrapper[4715]: I1009 08:42:31.441901 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx9pr\" (UniqueName: \"kubernetes.io/projected/090a2152-0f48-49f8-8e7f-6250c014eeaa-kube-api-access-zx9pr\") pod \"community-operators-nwtfj\" (UID: \"090a2152-0f48-49f8-8e7f-6250c014eeaa\") " pod="openshift-marketplace/community-operators-nwtfj" Oct 09 08:42:31 crc kubenswrapper[4715]: I1009 08:42:31.442015 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/090a2152-0f48-49f8-8e7f-6250c014eeaa-utilities\") pod \"community-operators-nwtfj\" (UID: \"090a2152-0f48-49f8-8e7f-6250c014eeaa\") " pod="openshift-marketplace/community-operators-nwtfj" Oct 09 08:42:31 crc kubenswrapper[4715]: I1009 08:42:31.544294 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/090a2152-0f48-49f8-8e7f-6250c014eeaa-catalog-content\") pod \"community-operators-nwtfj\" (UID: \"090a2152-0f48-49f8-8e7f-6250c014eeaa\") " pod="openshift-marketplace/community-operators-nwtfj" Oct 09 08:42:31 crc kubenswrapper[4715]: I1009 08:42:31.544362 4715 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zx9pr\" (UniqueName: \"kubernetes.io/projected/090a2152-0f48-49f8-8e7f-6250c014eeaa-kube-api-access-zx9pr\") pod \"community-operators-nwtfj\" (UID: \"090a2152-0f48-49f8-8e7f-6250c014eeaa\") " pod="openshift-marketplace/community-operators-nwtfj" Oct 09 08:42:31 crc kubenswrapper[4715]: I1009 08:42:31.544403 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/090a2152-0f48-49f8-8e7f-6250c014eeaa-utilities\") pod \"community-operators-nwtfj\" (UID: \"090a2152-0f48-49f8-8e7f-6250c014eeaa\") " pod="openshift-marketplace/community-operators-nwtfj" Oct 09 08:42:31 crc kubenswrapper[4715]: I1009 08:42:31.544873 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/090a2152-0f48-49f8-8e7f-6250c014eeaa-utilities\") pod \"community-operators-nwtfj\" (UID: \"090a2152-0f48-49f8-8e7f-6250c014eeaa\") " pod="openshift-marketplace/community-operators-nwtfj" Oct 09 08:42:31 crc kubenswrapper[4715]: I1009 08:42:31.544990 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/090a2152-0f48-49f8-8e7f-6250c014eeaa-catalog-content\") pod \"community-operators-nwtfj\" (UID: \"090a2152-0f48-49f8-8e7f-6250c014eeaa\") " pod="openshift-marketplace/community-operators-nwtfj" Oct 09 08:42:31 crc kubenswrapper[4715]: I1009 08:42:31.570285 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx9pr\" (UniqueName: \"kubernetes.io/projected/090a2152-0f48-49f8-8e7f-6250c014eeaa-kube-api-access-zx9pr\") pod \"community-operators-nwtfj\" (UID: \"090a2152-0f48-49f8-8e7f-6250c014eeaa\") " pod="openshift-marketplace/community-operators-nwtfj" Oct 09 08:42:31 crc kubenswrapper[4715]: I1009 08:42:31.646754 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nwtfj" Oct 09 08:42:31 crc kubenswrapper[4715]: I1009 08:42:31.734126 4715 generic.go:334] "Generic (PLEG): container finished" podID="fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa" containerID="8f4299696bd4e79d6ef7ffb9239e2a1962ea2c400355b95424b913e36d7cabe0" exitCode=0 Oct 09 08:42:31 crc kubenswrapper[4715]: I1009 08:42:31.734171 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wvztr" event={"ID":"fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa","Type":"ContainerDied","Data":"8f4299696bd4e79d6ef7ffb9239e2a1962ea2c400355b95424b913e36d7cabe0"} Oct 09 08:42:32 crc kubenswrapper[4715]: I1009 08:42:32.248018 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nwtfj"] Oct 09 08:42:32 crc kubenswrapper[4715]: W1009 08:42:32.248933 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod090a2152_0f48_49f8_8e7f_6250c014eeaa.slice/crio-969c73bc925dceb58b7588042fb6d2b5582a23cdd1cda82e90df7ea836fda6e0 WatchSource:0}: Error finding container 969c73bc925dceb58b7588042fb6d2b5582a23cdd1cda82e90df7ea836fda6e0: Status 404 returned error can't find the container with id 969c73bc925dceb58b7588042fb6d2b5582a23cdd1cda82e90df7ea836fda6e0 Oct 09 08:42:32 crc kubenswrapper[4715]: I1009 08:42:32.743247 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nwtfj" event={"ID":"090a2152-0f48-49f8-8e7f-6250c014eeaa","Type":"ContainerStarted","Data":"84765c938c4542cb47c38415d0c68dc0edbae8891b208af5c4721dced7ac2a68"} Oct 09 08:42:32 crc kubenswrapper[4715]: I1009 08:42:32.743317 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nwtfj" 
event={"ID":"090a2152-0f48-49f8-8e7f-6250c014eeaa","Type":"ContainerStarted","Data":"969c73bc925dceb58b7588042fb6d2b5582a23cdd1cda82e90df7ea836fda6e0"} Oct 09 08:42:32 crc kubenswrapper[4715]: I1009 08:42:32.745975 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hwrl4" event={"ID":"36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70","Type":"ContainerStarted","Data":"9f739e442783e76d14c2d8b97227b71cba6d83507ca51d3bad3aa68860f6b7f4"} Oct 09 08:42:32 crc kubenswrapper[4715]: I1009 08:42:32.782680 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hwrl4" podStartSLOduration=4.084102816 podStartE2EDuration="6.782663545s" podCreationTimestamp="2025-10-09 08:42:26 +0000 UTC" firstStartedPulling="2025-10-09 08:42:27.661899425 +0000 UTC m=+3378.354703433" lastFinishedPulling="2025-10-09 08:42:30.360460154 +0000 UTC m=+3381.053264162" observedRunningTime="2025-10-09 08:42:32.779060122 +0000 UTC m=+3383.471864130" watchObservedRunningTime="2025-10-09 08:42:32.782663545 +0000 UTC m=+3383.475467553" Oct 09 08:42:33 crc kubenswrapper[4715]: I1009 08:42:33.756048 4715 generic.go:334] "Generic (PLEG): container finished" podID="090a2152-0f48-49f8-8e7f-6250c014eeaa" containerID="84765c938c4542cb47c38415d0c68dc0edbae8891b208af5c4721dced7ac2a68" exitCode=0 Oct 09 08:42:33 crc kubenswrapper[4715]: I1009 08:42:33.756143 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nwtfj" event={"ID":"090a2152-0f48-49f8-8e7f-6250c014eeaa","Type":"ContainerDied","Data":"84765c938c4542cb47c38415d0c68dc0edbae8891b208af5c4721dced7ac2a68"} Oct 09 08:42:33 crc kubenswrapper[4715]: I1009 08:42:33.759131 4715 generic.go:334] "Generic (PLEG): container finished" podID="fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa" containerID="babd3aa68cb9d47ef4c7fa6efe732013354d9968bb5c1341b0a1a22b1d3ad7d3" exitCode=0 Oct 09 08:42:33 crc kubenswrapper[4715]: I1009 
08:42:33.759266 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wvztr" event={"ID":"fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa","Type":"ContainerDied","Data":"babd3aa68cb9d47ef4c7fa6efe732013354d9968bb5c1341b0a1a22b1d3ad7d3"} Oct 09 08:42:34 crc kubenswrapper[4715]: I1009 08:42:34.137349 4715 scope.go:117] "RemoveContainer" containerID="e60ed8f42d1d41a28d869e553800a49419e0aee6f8579a58bcefd5162cfb674c" Oct 09 08:42:34 crc kubenswrapper[4715]: E1009 08:42:34.137671 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:42:34 crc kubenswrapper[4715]: I1009 08:42:34.769941 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wvztr" event={"ID":"fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa","Type":"ContainerStarted","Data":"00e1b205980a3142892e34551c0e1813c9445647567808e7d0035289e912c2c7"} Oct 09 08:42:34 crc kubenswrapper[4715]: I1009 08:42:34.771546 4715 generic.go:334] "Generic (PLEG): container finished" podID="090a2152-0f48-49f8-8e7f-6250c014eeaa" containerID="729a2245f9dea08b61953767c80051e6a5ed63f3575f9eb9c4ccec8589a0d63d" exitCode=0 Oct 09 08:42:34 crc kubenswrapper[4715]: I1009 08:42:34.771588 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nwtfj" event={"ID":"090a2152-0f48-49f8-8e7f-6250c014eeaa","Type":"ContainerDied","Data":"729a2245f9dea08b61953767c80051e6a5ed63f3575f9eb9c4ccec8589a0d63d"} Oct 09 08:42:34 crc kubenswrapper[4715]: I1009 08:42:34.796711 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-wvztr" podStartSLOduration=4.353967739 podStartE2EDuration="6.796694529s" podCreationTimestamp="2025-10-09 08:42:28 +0000 UTC" firstStartedPulling="2025-10-09 08:42:31.735746225 +0000 UTC m=+3382.428550233" lastFinishedPulling="2025-10-09 08:42:34.178473015 +0000 UTC m=+3384.871277023" observedRunningTime="2025-10-09 08:42:34.788402291 +0000 UTC m=+3385.481206299" watchObservedRunningTime="2025-10-09 08:42:34.796694529 +0000 UTC m=+3385.489498537" Oct 09 08:42:35 crc kubenswrapper[4715]: I1009 08:42:35.784264 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nwtfj" event={"ID":"090a2152-0f48-49f8-8e7f-6250c014eeaa","Type":"ContainerStarted","Data":"152e2869fa4a2a9ef2d9beb350f63ad0c4c437e6a5618747257dc540fa52eb77"} Oct 09 08:42:35 crc kubenswrapper[4715]: I1009 08:42:35.808086 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nwtfj" podStartSLOduration=3.350821468 podStartE2EDuration="4.80806514s" podCreationTimestamp="2025-10-09 08:42:31 +0000 UTC" firstStartedPulling="2025-10-09 08:42:33.759705912 +0000 UTC m=+3384.452509940" lastFinishedPulling="2025-10-09 08:42:35.216949604 +0000 UTC m=+3385.909753612" observedRunningTime="2025-10-09 08:42:35.799955758 +0000 UTC m=+3386.492759776" watchObservedRunningTime="2025-10-09 08:42:35.80806514 +0000 UTC m=+3386.500869148" Oct 09 08:42:36 crc kubenswrapper[4715]: I1009 08:42:36.872589 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hwrl4" Oct 09 08:42:36 crc kubenswrapper[4715]: I1009 08:42:36.872676 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hwrl4" Oct 09 08:42:37 crc kubenswrapper[4715]: I1009 08:42:37.931388 4715 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hwrl4" 
podUID="36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70" containerName="registry-server" probeResult="failure" output=< Oct 09 08:42:37 crc kubenswrapper[4715]: timeout: failed to connect service ":50051" within 1s Oct 09 08:42:37 crc kubenswrapper[4715]: > Oct 09 08:42:39 crc kubenswrapper[4715]: I1009 08:42:39.243110 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wvztr" Oct 09 08:42:39 crc kubenswrapper[4715]: I1009 08:42:39.243178 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wvztr" Oct 09 08:42:39 crc kubenswrapper[4715]: I1009 08:42:39.288571 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wvztr" Oct 09 08:42:39 crc kubenswrapper[4715]: I1009 08:42:39.873066 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wvztr" Oct 09 08:42:41 crc kubenswrapper[4715]: I1009 08:42:41.098560 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wvztr"] Oct 09 08:42:41 crc kubenswrapper[4715]: I1009 08:42:41.647594 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nwtfj" Oct 09 08:42:41 crc kubenswrapper[4715]: I1009 08:42:41.647977 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nwtfj" Oct 09 08:42:41 crc kubenswrapper[4715]: I1009 08:42:41.699481 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nwtfj" Oct 09 08:42:41 crc kubenswrapper[4715]: I1009 08:42:41.842034 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wvztr" podUID="fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa" 
containerName="registry-server" containerID="cri-o://00e1b205980a3142892e34551c0e1813c9445647567808e7d0035289e912c2c7" gracePeriod=2 Oct 09 08:42:41 crc kubenswrapper[4715]: I1009 08:42:41.890576 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nwtfj" Oct 09 08:42:42 crc kubenswrapper[4715]: I1009 08:42:42.313054 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wvztr" Oct 09 08:42:42 crc kubenswrapper[4715]: I1009 08:42:42.457771 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa-utilities\") pod \"fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa\" (UID: \"fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa\") " Oct 09 08:42:42 crc kubenswrapper[4715]: I1009 08:42:42.457909 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7tfk\" (UniqueName: \"kubernetes.io/projected/fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa-kube-api-access-m7tfk\") pod \"fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa\" (UID: \"fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa\") " Oct 09 08:42:42 crc kubenswrapper[4715]: I1009 08:42:42.458022 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa-catalog-content\") pod \"fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa\" (UID: \"fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa\") " Oct 09 08:42:42 crc kubenswrapper[4715]: I1009 08:42:42.459138 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa-utilities" (OuterVolumeSpecName: "utilities") pod "fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa" (UID: "fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:42:42 crc kubenswrapper[4715]: I1009 08:42:42.464312 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa-kube-api-access-m7tfk" (OuterVolumeSpecName: "kube-api-access-m7tfk") pod "fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa" (UID: "fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa"). InnerVolumeSpecName "kube-api-access-m7tfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:42:42 crc kubenswrapper[4715]: I1009 08:42:42.478096 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa" (UID: "fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:42:42 crc kubenswrapper[4715]: I1009 08:42:42.561249 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7tfk\" (UniqueName: \"kubernetes.io/projected/fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa-kube-api-access-m7tfk\") on node \"crc\" DevicePath \"\"" Oct 09 08:42:42 crc kubenswrapper[4715]: I1009 08:42:42.561296 4715 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 08:42:42 crc kubenswrapper[4715]: I1009 08:42:42.561308 4715 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 08:42:42 crc kubenswrapper[4715]: I1009 08:42:42.854918 4715 generic.go:334] "Generic (PLEG): container finished" podID="fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa" 
containerID="00e1b205980a3142892e34551c0e1813c9445647567808e7d0035289e912c2c7" exitCode=0 Oct 09 08:42:42 crc kubenswrapper[4715]: I1009 08:42:42.854990 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wvztr" Oct 09 08:42:42 crc kubenswrapper[4715]: I1009 08:42:42.855013 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wvztr" event={"ID":"fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa","Type":"ContainerDied","Data":"00e1b205980a3142892e34551c0e1813c9445647567808e7d0035289e912c2c7"} Oct 09 08:42:42 crc kubenswrapper[4715]: I1009 08:42:42.855803 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wvztr" event={"ID":"fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa","Type":"ContainerDied","Data":"227bfbdc00c7653701011323bb66f1f8307519bbea112333db49ee737cc4555c"} Oct 09 08:42:42 crc kubenswrapper[4715]: I1009 08:42:42.855917 4715 scope.go:117] "RemoveContainer" containerID="00e1b205980a3142892e34551c0e1813c9445647567808e7d0035289e912c2c7" Oct 09 08:42:42 crc kubenswrapper[4715]: I1009 08:42:42.891854 4715 scope.go:117] "RemoveContainer" containerID="babd3aa68cb9d47ef4c7fa6efe732013354d9968bb5c1341b0a1a22b1d3ad7d3" Oct 09 08:42:42 crc kubenswrapper[4715]: I1009 08:42:42.902616 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wvztr"] Oct 09 08:42:42 crc kubenswrapper[4715]: I1009 08:42:42.914934 4715 scope.go:117] "RemoveContainer" containerID="8f4299696bd4e79d6ef7ffb9239e2a1962ea2c400355b95424b913e36d7cabe0" Oct 09 08:42:42 crc kubenswrapper[4715]: I1009 08:42:42.915306 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wvztr"] Oct 09 08:42:42 crc kubenswrapper[4715]: I1009 08:42:42.975229 4715 scope.go:117] "RemoveContainer" containerID="00e1b205980a3142892e34551c0e1813c9445647567808e7d0035289e912c2c7" Oct 09 
08:42:42 crc kubenswrapper[4715]: E1009 08:42:42.975829 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00e1b205980a3142892e34551c0e1813c9445647567808e7d0035289e912c2c7\": container with ID starting with 00e1b205980a3142892e34551c0e1813c9445647567808e7d0035289e912c2c7 not found: ID does not exist" containerID="00e1b205980a3142892e34551c0e1813c9445647567808e7d0035289e912c2c7" Oct 09 08:42:42 crc kubenswrapper[4715]: I1009 08:42:42.975860 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00e1b205980a3142892e34551c0e1813c9445647567808e7d0035289e912c2c7"} err="failed to get container status \"00e1b205980a3142892e34551c0e1813c9445647567808e7d0035289e912c2c7\": rpc error: code = NotFound desc = could not find container \"00e1b205980a3142892e34551c0e1813c9445647567808e7d0035289e912c2c7\": container with ID starting with 00e1b205980a3142892e34551c0e1813c9445647567808e7d0035289e912c2c7 not found: ID does not exist" Oct 09 08:42:42 crc kubenswrapper[4715]: I1009 08:42:42.975882 4715 scope.go:117] "RemoveContainer" containerID="babd3aa68cb9d47ef4c7fa6efe732013354d9968bb5c1341b0a1a22b1d3ad7d3" Oct 09 08:42:42 crc kubenswrapper[4715]: E1009 08:42:42.976109 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"babd3aa68cb9d47ef4c7fa6efe732013354d9968bb5c1341b0a1a22b1d3ad7d3\": container with ID starting with babd3aa68cb9d47ef4c7fa6efe732013354d9968bb5c1341b0a1a22b1d3ad7d3 not found: ID does not exist" containerID="babd3aa68cb9d47ef4c7fa6efe732013354d9968bb5c1341b0a1a22b1d3ad7d3" Oct 09 08:42:42 crc kubenswrapper[4715]: I1009 08:42:42.976130 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"babd3aa68cb9d47ef4c7fa6efe732013354d9968bb5c1341b0a1a22b1d3ad7d3"} err="failed to get container status 
\"babd3aa68cb9d47ef4c7fa6efe732013354d9968bb5c1341b0a1a22b1d3ad7d3\": rpc error: code = NotFound desc = could not find container \"babd3aa68cb9d47ef4c7fa6efe732013354d9968bb5c1341b0a1a22b1d3ad7d3\": container with ID starting with babd3aa68cb9d47ef4c7fa6efe732013354d9968bb5c1341b0a1a22b1d3ad7d3 not found: ID does not exist" Oct 09 08:42:42 crc kubenswrapper[4715]: I1009 08:42:42.976141 4715 scope.go:117] "RemoveContainer" containerID="8f4299696bd4e79d6ef7ffb9239e2a1962ea2c400355b95424b913e36d7cabe0" Oct 09 08:42:42 crc kubenswrapper[4715]: E1009 08:42:42.976345 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f4299696bd4e79d6ef7ffb9239e2a1962ea2c400355b95424b913e36d7cabe0\": container with ID starting with 8f4299696bd4e79d6ef7ffb9239e2a1962ea2c400355b95424b913e36d7cabe0 not found: ID does not exist" containerID="8f4299696bd4e79d6ef7ffb9239e2a1962ea2c400355b95424b913e36d7cabe0" Oct 09 08:42:42 crc kubenswrapper[4715]: I1009 08:42:42.976364 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f4299696bd4e79d6ef7ffb9239e2a1962ea2c400355b95424b913e36d7cabe0"} err="failed to get container status \"8f4299696bd4e79d6ef7ffb9239e2a1962ea2c400355b95424b913e36d7cabe0\": rpc error: code = NotFound desc = could not find container \"8f4299696bd4e79d6ef7ffb9239e2a1962ea2c400355b95424b913e36d7cabe0\": container with ID starting with 8f4299696bd4e79d6ef7ffb9239e2a1962ea2c400355b95424b913e36d7cabe0 not found: ID does not exist" Oct 09 08:42:44 crc kubenswrapper[4715]: I1009 08:42:44.099616 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nwtfj"] Oct 09 08:42:44 crc kubenswrapper[4715]: I1009 08:42:44.100222 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nwtfj" podUID="090a2152-0f48-49f8-8e7f-6250c014eeaa" 
containerName="registry-server" containerID="cri-o://152e2869fa4a2a9ef2d9beb350f63ad0c4c437e6a5618747257dc540fa52eb77" gracePeriod=2 Oct 09 08:42:44 crc kubenswrapper[4715]: I1009 08:42:44.153649 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa" path="/var/lib/kubelet/pods/fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa/volumes" Oct 09 08:42:44 crc kubenswrapper[4715]: I1009 08:42:44.612249 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nwtfj" Oct 09 08:42:44 crc kubenswrapper[4715]: I1009 08:42:44.696831 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/090a2152-0f48-49f8-8e7f-6250c014eeaa-utilities\") pod \"090a2152-0f48-49f8-8e7f-6250c014eeaa\" (UID: \"090a2152-0f48-49f8-8e7f-6250c014eeaa\") " Oct 09 08:42:44 crc kubenswrapper[4715]: I1009 08:42:44.696921 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/090a2152-0f48-49f8-8e7f-6250c014eeaa-catalog-content\") pod \"090a2152-0f48-49f8-8e7f-6250c014eeaa\" (UID: \"090a2152-0f48-49f8-8e7f-6250c014eeaa\") " Oct 09 08:42:44 crc kubenswrapper[4715]: I1009 08:42:44.696970 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zx9pr\" (UniqueName: \"kubernetes.io/projected/090a2152-0f48-49f8-8e7f-6250c014eeaa-kube-api-access-zx9pr\") pod \"090a2152-0f48-49f8-8e7f-6250c014eeaa\" (UID: \"090a2152-0f48-49f8-8e7f-6250c014eeaa\") " Oct 09 08:42:44 crc kubenswrapper[4715]: I1009 08:42:44.697602 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/090a2152-0f48-49f8-8e7f-6250c014eeaa-utilities" (OuterVolumeSpecName: "utilities") pod "090a2152-0f48-49f8-8e7f-6250c014eeaa" (UID: "090a2152-0f48-49f8-8e7f-6250c014eeaa"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:42:44 crc kubenswrapper[4715]: I1009 08:42:44.703792 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/090a2152-0f48-49f8-8e7f-6250c014eeaa-kube-api-access-zx9pr" (OuterVolumeSpecName: "kube-api-access-zx9pr") pod "090a2152-0f48-49f8-8e7f-6250c014eeaa" (UID: "090a2152-0f48-49f8-8e7f-6250c014eeaa"). InnerVolumeSpecName "kube-api-access-zx9pr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:42:44 crc kubenswrapper[4715]: I1009 08:42:44.759737 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/090a2152-0f48-49f8-8e7f-6250c014eeaa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "090a2152-0f48-49f8-8e7f-6250c014eeaa" (UID: "090a2152-0f48-49f8-8e7f-6250c014eeaa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:42:44 crc kubenswrapper[4715]: I1009 08:42:44.799276 4715 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/090a2152-0f48-49f8-8e7f-6250c014eeaa-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 08:42:44 crc kubenswrapper[4715]: I1009 08:42:44.799311 4715 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/090a2152-0f48-49f8-8e7f-6250c014eeaa-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 08:42:44 crc kubenswrapper[4715]: I1009 08:42:44.799325 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zx9pr\" (UniqueName: \"kubernetes.io/projected/090a2152-0f48-49f8-8e7f-6250c014eeaa-kube-api-access-zx9pr\") on node \"crc\" DevicePath \"\"" Oct 09 08:42:44 crc kubenswrapper[4715]: I1009 08:42:44.877343 4715 generic.go:334] "Generic (PLEG): container finished" podID="090a2152-0f48-49f8-8e7f-6250c014eeaa" 
containerID="152e2869fa4a2a9ef2d9beb350f63ad0c4c437e6a5618747257dc540fa52eb77" exitCode=0 Oct 09 08:42:44 crc kubenswrapper[4715]: I1009 08:42:44.877387 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nwtfj" event={"ID":"090a2152-0f48-49f8-8e7f-6250c014eeaa","Type":"ContainerDied","Data":"152e2869fa4a2a9ef2d9beb350f63ad0c4c437e6a5618747257dc540fa52eb77"} Oct 09 08:42:44 crc kubenswrapper[4715]: I1009 08:42:44.877434 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nwtfj" event={"ID":"090a2152-0f48-49f8-8e7f-6250c014eeaa","Type":"ContainerDied","Data":"969c73bc925dceb58b7588042fb6d2b5582a23cdd1cda82e90df7ea836fda6e0"} Oct 09 08:42:44 crc kubenswrapper[4715]: I1009 08:42:44.877446 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nwtfj" Oct 09 08:42:44 crc kubenswrapper[4715]: I1009 08:42:44.877456 4715 scope.go:117] "RemoveContainer" containerID="152e2869fa4a2a9ef2d9beb350f63ad0c4c437e6a5618747257dc540fa52eb77" Oct 09 08:42:44 crc kubenswrapper[4715]: I1009 08:42:44.896403 4715 scope.go:117] "RemoveContainer" containerID="729a2245f9dea08b61953767c80051e6a5ed63f3575f9eb9c4ccec8589a0d63d" Oct 09 08:42:44 crc kubenswrapper[4715]: I1009 08:42:44.915453 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nwtfj"] Oct 09 08:42:44 crc kubenswrapper[4715]: I1009 08:42:44.920671 4715 scope.go:117] "RemoveContainer" containerID="84765c938c4542cb47c38415d0c68dc0edbae8891b208af5c4721dced7ac2a68" Oct 09 08:42:44 crc kubenswrapper[4715]: I1009 08:42:44.924552 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nwtfj"] Oct 09 08:42:44 crc kubenswrapper[4715]: I1009 08:42:44.966987 4715 scope.go:117] "RemoveContainer" containerID="152e2869fa4a2a9ef2d9beb350f63ad0c4c437e6a5618747257dc540fa52eb77" Oct 09 
08:42:44 crc kubenswrapper[4715]: E1009 08:42:44.967727 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"152e2869fa4a2a9ef2d9beb350f63ad0c4c437e6a5618747257dc540fa52eb77\": container with ID starting with 152e2869fa4a2a9ef2d9beb350f63ad0c4c437e6a5618747257dc540fa52eb77 not found: ID does not exist" containerID="152e2869fa4a2a9ef2d9beb350f63ad0c4c437e6a5618747257dc540fa52eb77" Oct 09 08:42:44 crc kubenswrapper[4715]: I1009 08:42:44.967774 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"152e2869fa4a2a9ef2d9beb350f63ad0c4c437e6a5618747257dc540fa52eb77"} err="failed to get container status \"152e2869fa4a2a9ef2d9beb350f63ad0c4c437e6a5618747257dc540fa52eb77\": rpc error: code = NotFound desc = could not find container \"152e2869fa4a2a9ef2d9beb350f63ad0c4c437e6a5618747257dc540fa52eb77\": container with ID starting with 152e2869fa4a2a9ef2d9beb350f63ad0c4c437e6a5618747257dc540fa52eb77 not found: ID does not exist" Oct 09 08:42:44 crc kubenswrapper[4715]: I1009 08:42:44.967807 4715 scope.go:117] "RemoveContainer" containerID="729a2245f9dea08b61953767c80051e6a5ed63f3575f9eb9c4ccec8589a0d63d" Oct 09 08:42:44 crc kubenswrapper[4715]: E1009 08:42:44.968309 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"729a2245f9dea08b61953767c80051e6a5ed63f3575f9eb9c4ccec8589a0d63d\": container with ID starting with 729a2245f9dea08b61953767c80051e6a5ed63f3575f9eb9c4ccec8589a0d63d not found: ID does not exist" containerID="729a2245f9dea08b61953767c80051e6a5ed63f3575f9eb9c4ccec8589a0d63d" Oct 09 08:42:44 crc kubenswrapper[4715]: I1009 08:42:44.968363 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"729a2245f9dea08b61953767c80051e6a5ed63f3575f9eb9c4ccec8589a0d63d"} err="failed to get container status 
\"729a2245f9dea08b61953767c80051e6a5ed63f3575f9eb9c4ccec8589a0d63d\": rpc error: code = NotFound desc = could not find container \"729a2245f9dea08b61953767c80051e6a5ed63f3575f9eb9c4ccec8589a0d63d\": container with ID starting with 729a2245f9dea08b61953767c80051e6a5ed63f3575f9eb9c4ccec8589a0d63d not found: ID does not exist" Oct 09 08:42:44 crc kubenswrapper[4715]: I1009 08:42:44.968396 4715 scope.go:117] "RemoveContainer" containerID="84765c938c4542cb47c38415d0c68dc0edbae8891b208af5c4721dced7ac2a68" Oct 09 08:42:44 crc kubenswrapper[4715]: E1009 08:42:44.968906 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84765c938c4542cb47c38415d0c68dc0edbae8891b208af5c4721dced7ac2a68\": container with ID starting with 84765c938c4542cb47c38415d0c68dc0edbae8891b208af5c4721dced7ac2a68 not found: ID does not exist" containerID="84765c938c4542cb47c38415d0c68dc0edbae8891b208af5c4721dced7ac2a68" Oct 09 08:42:44 crc kubenswrapper[4715]: I1009 08:42:44.968973 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84765c938c4542cb47c38415d0c68dc0edbae8891b208af5c4721dced7ac2a68"} err="failed to get container status \"84765c938c4542cb47c38415d0c68dc0edbae8891b208af5c4721dced7ac2a68\": rpc error: code = NotFound desc = could not find container \"84765c938c4542cb47c38415d0c68dc0edbae8891b208af5c4721dced7ac2a68\": container with ID starting with 84765c938c4542cb47c38415d0c68dc0edbae8891b208af5c4721dced7ac2a68 not found: ID does not exist" Oct 09 08:42:46 crc kubenswrapper[4715]: I1009 08:42:46.146549 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="090a2152-0f48-49f8-8e7f-6250c014eeaa" path="/var/lib/kubelet/pods/090a2152-0f48-49f8-8e7f-6250c014eeaa/volumes" Oct 09 08:42:47 crc kubenswrapper[4715]: I1009 08:42:47.136686 4715 scope.go:117] "RemoveContainer" containerID="e60ed8f42d1d41a28d869e553800a49419e0aee6f8579a58bcefd5162cfb674c" Oct 09 
08:42:47 crc kubenswrapper[4715]: I1009 08:42:47.905985 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" event={"ID":"acafd807-8875-4b4f-aba9-4f807ca336e7","Type":"ContainerStarted","Data":"0987e6a342e02824b43b9834c2fc79356564b0a0d9274ff128b433f6ae2be4e5"} Oct 09 08:42:47 crc kubenswrapper[4715]: I1009 08:42:47.958936 4715 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hwrl4" podUID="36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70" containerName="registry-server" probeResult="failure" output=< Oct 09 08:42:47 crc kubenswrapper[4715]: timeout: failed to connect service ":50051" within 1s Oct 09 08:42:47 crc kubenswrapper[4715]: > Oct 09 08:42:57 crc kubenswrapper[4715]: I1009 08:42:57.958379 4715 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hwrl4" podUID="36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70" containerName="registry-server" probeResult="failure" output=< Oct 09 08:42:57 crc kubenswrapper[4715]: timeout: failed to connect service ":50051" within 1s Oct 09 08:42:57 crc kubenswrapper[4715]: > Oct 09 08:43:06 crc kubenswrapper[4715]: I1009 08:43:06.935988 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hwrl4" Oct 09 08:43:06 crc kubenswrapper[4715]: I1009 08:43:06.994015 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hwrl4" Oct 09 08:43:07 crc kubenswrapper[4715]: I1009 08:43:07.172287 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hwrl4"] Oct 09 08:43:08 crc kubenswrapper[4715]: I1009 08:43:08.087242 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hwrl4" podUID="36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70" containerName="registry-server" 
containerID="cri-o://9f739e442783e76d14c2d8b97227b71cba6d83507ca51d3bad3aa68860f6b7f4" gracePeriod=2 Oct 09 08:43:08 crc kubenswrapper[4715]: I1009 08:43:08.612388 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hwrl4" Oct 09 08:43:08 crc kubenswrapper[4715]: I1009 08:43:08.769845 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmzrb\" (UniqueName: \"kubernetes.io/projected/36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70-kube-api-access-tmzrb\") pod \"36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70\" (UID: \"36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70\") " Oct 09 08:43:08 crc kubenswrapper[4715]: I1009 08:43:08.769945 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70-catalog-content\") pod \"36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70\" (UID: \"36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70\") " Oct 09 08:43:08 crc kubenswrapper[4715]: I1009 08:43:08.770066 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70-utilities\") pod \"36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70\" (UID: \"36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70\") " Oct 09 08:43:08 crc kubenswrapper[4715]: I1009 08:43:08.770686 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70-utilities" (OuterVolumeSpecName: "utilities") pod "36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70" (UID: "36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:43:08 crc kubenswrapper[4715]: I1009 08:43:08.776665 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70-kube-api-access-tmzrb" (OuterVolumeSpecName: "kube-api-access-tmzrb") pod "36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70" (UID: "36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70"). InnerVolumeSpecName "kube-api-access-tmzrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:43:08 crc kubenswrapper[4715]: I1009 08:43:08.853743 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70" (UID: "36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:43:08 crc kubenswrapper[4715]: I1009 08:43:08.872898 4715 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 08:43:08 crc kubenswrapper[4715]: I1009 08:43:08.872960 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmzrb\" (UniqueName: \"kubernetes.io/projected/36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70-kube-api-access-tmzrb\") on node \"crc\" DevicePath \"\"" Oct 09 08:43:08 crc kubenswrapper[4715]: I1009 08:43:08.872974 4715 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 08:43:09 crc kubenswrapper[4715]: I1009 08:43:09.098104 4715 generic.go:334] "Generic (PLEG): container finished" podID="36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70" 
containerID="9f739e442783e76d14c2d8b97227b71cba6d83507ca51d3bad3aa68860f6b7f4" exitCode=0 Oct 09 08:43:09 crc kubenswrapper[4715]: I1009 08:43:09.098141 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hwrl4" event={"ID":"36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70","Type":"ContainerDied","Data":"9f739e442783e76d14c2d8b97227b71cba6d83507ca51d3bad3aa68860f6b7f4"} Oct 09 08:43:09 crc kubenswrapper[4715]: I1009 08:43:09.098176 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hwrl4" event={"ID":"36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70","Type":"ContainerDied","Data":"9b049dbf7d92a19956b7c1b982c27716c540a695f61373117ddd95c6aa194cb5"} Oct 09 08:43:09 crc kubenswrapper[4715]: I1009 08:43:09.098178 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hwrl4" Oct 09 08:43:09 crc kubenswrapper[4715]: I1009 08:43:09.098194 4715 scope.go:117] "RemoveContainer" containerID="9f739e442783e76d14c2d8b97227b71cba6d83507ca51d3bad3aa68860f6b7f4" Oct 09 08:43:09 crc kubenswrapper[4715]: I1009 08:43:09.127567 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hwrl4"] Oct 09 08:43:09 crc kubenswrapper[4715]: I1009 08:43:09.128270 4715 scope.go:117] "RemoveContainer" containerID="7713a3166ae43de6d60920bd91a74cc40b3337b550b6fffb1d78c6a3039c5914" Oct 09 08:43:09 crc kubenswrapper[4715]: I1009 08:43:09.134876 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hwrl4"] Oct 09 08:43:09 crc kubenswrapper[4715]: I1009 08:43:09.151795 4715 scope.go:117] "RemoveContainer" containerID="f0c41b35384af1afee6d52d84a7b8158c680364acd0a7aa349081f90263fc6dd" Oct 09 08:43:09 crc kubenswrapper[4715]: I1009 08:43:09.204215 4715 scope.go:117] "RemoveContainer" containerID="9f739e442783e76d14c2d8b97227b71cba6d83507ca51d3bad3aa68860f6b7f4" Oct 09 08:43:09 crc 
kubenswrapper[4715]: E1009 08:43:09.204711 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f739e442783e76d14c2d8b97227b71cba6d83507ca51d3bad3aa68860f6b7f4\": container with ID starting with 9f739e442783e76d14c2d8b97227b71cba6d83507ca51d3bad3aa68860f6b7f4 not found: ID does not exist" containerID="9f739e442783e76d14c2d8b97227b71cba6d83507ca51d3bad3aa68860f6b7f4" Oct 09 08:43:09 crc kubenswrapper[4715]: I1009 08:43:09.204758 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f739e442783e76d14c2d8b97227b71cba6d83507ca51d3bad3aa68860f6b7f4"} err="failed to get container status \"9f739e442783e76d14c2d8b97227b71cba6d83507ca51d3bad3aa68860f6b7f4\": rpc error: code = NotFound desc = could not find container \"9f739e442783e76d14c2d8b97227b71cba6d83507ca51d3bad3aa68860f6b7f4\": container with ID starting with 9f739e442783e76d14c2d8b97227b71cba6d83507ca51d3bad3aa68860f6b7f4 not found: ID does not exist" Oct 09 08:43:09 crc kubenswrapper[4715]: I1009 08:43:09.204785 4715 scope.go:117] "RemoveContainer" containerID="7713a3166ae43de6d60920bd91a74cc40b3337b550b6fffb1d78c6a3039c5914" Oct 09 08:43:09 crc kubenswrapper[4715]: E1009 08:43:09.205387 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7713a3166ae43de6d60920bd91a74cc40b3337b550b6fffb1d78c6a3039c5914\": container with ID starting with 7713a3166ae43de6d60920bd91a74cc40b3337b550b6fffb1d78c6a3039c5914 not found: ID does not exist" containerID="7713a3166ae43de6d60920bd91a74cc40b3337b550b6fffb1d78c6a3039c5914" Oct 09 08:43:09 crc kubenswrapper[4715]: I1009 08:43:09.205413 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7713a3166ae43de6d60920bd91a74cc40b3337b550b6fffb1d78c6a3039c5914"} err="failed to get container status 
\"7713a3166ae43de6d60920bd91a74cc40b3337b550b6fffb1d78c6a3039c5914\": rpc error: code = NotFound desc = could not find container \"7713a3166ae43de6d60920bd91a74cc40b3337b550b6fffb1d78c6a3039c5914\": container with ID starting with 7713a3166ae43de6d60920bd91a74cc40b3337b550b6fffb1d78c6a3039c5914 not found: ID does not exist" Oct 09 08:43:09 crc kubenswrapper[4715]: I1009 08:43:09.205438 4715 scope.go:117] "RemoveContainer" containerID="f0c41b35384af1afee6d52d84a7b8158c680364acd0a7aa349081f90263fc6dd" Oct 09 08:43:09 crc kubenswrapper[4715]: E1009 08:43:09.205668 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0c41b35384af1afee6d52d84a7b8158c680364acd0a7aa349081f90263fc6dd\": container with ID starting with f0c41b35384af1afee6d52d84a7b8158c680364acd0a7aa349081f90263fc6dd not found: ID does not exist" containerID="f0c41b35384af1afee6d52d84a7b8158c680364acd0a7aa349081f90263fc6dd" Oct 09 08:43:09 crc kubenswrapper[4715]: I1009 08:43:09.205698 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0c41b35384af1afee6d52d84a7b8158c680364acd0a7aa349081f90263fc6dd"} err="failed to get container status \"f0c41b35384af1afee6d52d84a7b8158c680364acd0a7aa349081f90263fc6dd\": rpc error: code = NotFound desc = could not find container \"f0c41b35384af1afee6d52d84a7b8158c680364acd0a7aa349081f90263fc6dd\": container with ID starting with f0c41b35384af1afee6d52d84a7b8158c680364acd0a7aa349081f90263fc6dd not found: ID does not exist" Oct 09 08:43:10 crc kubenswrapper[4715]: I1009 08:43:10.163794 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70" path="/var/lib/kubelet/pods/36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70/volumes" Oct 09 08:43:51 crc kubenswrapper[4715]: I1009 08:43:51.515169 4715 generic.go:334] "Generic (PLEG): container finished" podID="c272fa72-6434-4af1-8e2b-433cc9f619ea" 
containerID="f8971272e21a1b1f813c9a3eecd73a61b075311309e74532b5899bc79cb3ba3b" exitCode=0 Oct 09 08:43:51 crc kubenswrapper[4715]: I1009 08:43:51.515286 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"c272fa72-6434-4af1-8e2b-433cc9f619ea","Type":"ContainerDied","Data":"f8971272e21a1b1f813c9a3eecd73a61b075311309e74532b5899bc79cb3ba3b"} Oct 09 08:43:52 crc kubenswrapper[4715]: I1009 08:43:52.886774 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 09 08:43:53 crc kubenswrapper[4715]: I1009 08:43:53.043105 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c272fa72-6434-4af1-8e2b-433cc9f619ea-config-data\") pod \"c272fa72-6434-4af1-8e2b-433cc9f619ea\" (UID: \"c272fa72-6434-4af1-8e2b-433cc9f619ea\") " Oct 09 08:43:53 crc kubenswrapper[4715]: I1009 08:43:53.043196 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/c272fa72-6434-4af1-8e2b-433cc9f619ea-test-operator-ephemeral-temporary\") pod \"c272fa72-6434-4af1-8e2b-433cc9f619ea\" (UID: \"c272fa72-6434-4af1-8e2b-433cc9f619ea\") " Oct 09 08:43:53 crc kubenswrapper[4715]: I1009 08:43:53.043258 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"c272fa72-6434-4af1-8e2b-433cc9f619ea\" (UID: \"c272fa72-6434-4af1-8e2b-433cc9f619ea\") " Oct 09 08:43:53 crc kubenswrapper[4715]: I1009 08:43:53.043321 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c272fa72-6434-4af1-8e2b-433cc9f619ea-ssh-key\") pod \"c272fa72-6434-4af1-8e2b-433cc9f619ea\" (UID: \"c272fa72-6434-4af1-8e2b-433cc9f619ea\") " Oct 09 08:43:53 
crc kubenswrapper[4715]: I1009 08:43:53.043339 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4v8g8\" (UniqueName: \"kubernetes.io/projected/c272fa72-6434-4af1-8e2b-433cc9f619ea-kube-api-access-4v8g8\") pod \"c272fa72-6434-4af1-8e2b-433cc9f619ea\" (UID: \"c272fa72-6434-4af1-8e2b-433cc9f619ea\") " Oct 09 08:43:53 crc kubenswrapper[4715]: I1009 08:43:53.043366 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/c272fa72-6434-4af1-8e2b-433cc9f619ea-test-operator-ephemeral-workdir\") pod \"c272fa72-6434-4af1-8e2b-433cc9f619ea\" (UID: \"c272fa72-6434-4af1-8e2b-433cc9f619ea\") " Oct 09 08:43:53 crc kubenswrapper[4715]: I1009 08:43:53.043458 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c272fa72-6434-4af1-8e2b-433cc9f619ea-openstack-config\") pod \"c272fa72-6434-4af1-8e2b-433cc9f619ea\" (UID: \"c272fa72-6434-4af1-8e2b-433cc9f619ea\") " Oct 09 08:43:53 crc kubenswrapper[4715]: I1009 08:43:53.043485 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/c272fa72-6434-4af1-8e2b-433cc9f619ea-ca-certs\") pod \"c272fa72-6434-4af1-8e2b-433cc9f619ea\" (UID: \"c272fa72-6434-4af1-8e2b-433cc9f619ea\") " Oct 09 08:43:53 crc kubenswrapper[4715]: I1009 08:43:53.043535 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c272fa72-6434-4af1-8e2b-433cc9f619ea-openstack-config-secret\") pod \"c272fa72-6434-4af1-8e2b-433cc9f619ea\" (UID: \"c272fa72-6434-4af1-8e2b-433cc9f619ea\") " Oct 09 08:43:53 crc kubenswrapper[4715]: I1009 08:43:53.043886 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/c272fa72-6434-4af1-8e2b-433cc9f619ea-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "c272fa72-6434-4af1-8e2b-433cc9f619ea" (UID: "c272fa72-6434-4af1-8e2b-433cc9f619ea"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:43:53 crc kubenswrapper[4715]: I1009 08:43:53.044724 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c272fa72-6434-4af1-8e2b-433cc9f619ea-config-data" (OuterVolumeSpecName: "config-data") pod "c272fa72-6434-4af1-8e2b-433cc9f619ea" (UID: "c272fa72-6434-4af1-8e2b-433cc9f619ea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:43:53 crc kubenswrapper[4715]: I1009 08:43:53.045192 4715 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c272fa72-6434-4af1-8e2b-433cc9f619ea-config-data\") on node \"crc\" DevicePath \"\"" Oct 09 08:43:53 crc kubenswrapper[4715]: I1009 08:43:53.045220 4715 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/c272fa72-6434-4af1-8e2b-433cc9f619ea-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 09 08:43:53 crc kubenswrapper[4715]: I1009 08:43:53.050226 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "test-operator-logs") pod "c272fa72-6434-4af1-8e2b-433cc9f619ea" (UID: "c272fa72-6434-4af1-8e2b-433cc9f619ea"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 09 08:43:53 crc kubenswrapper[4715]: I1009 08:43:53.050690 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c272fa72-6434-4af1-8e2b-433cc9f619ea-kube-api-access-4v8g8" (OuterVolumeSpecName: "kube-api-access-4v8g8") pod "c272fa72-6434-4af1-8e2b-433cc9f619ea" (UID: "c272fa72-6434-4af1-8e2b-433cc9f619ea"). InnerVolumeSpecName "kube-api-access-4v8g8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:43:53 crc kubenswrapper[4715]: I1009 08:43:53.057859 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c272fa72-6434-4af1-8e2b-433cc9f619ea-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "c272fa72-6434-4af1-8e2b-433cc9f619ea" (UID: "c272fa72-6434-4af1-8e2b-433cc9f619ea"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:43:53 crc kubenswrapper[4715]: I1009 08:43:53.073225 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c272fa72-6434-4af1-8e2b-433cc9f619ea-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c272fa72-6434-4af1-8e2b-433cc9f619ea" (UID: "c272fa72-6434-4af1-8e2b-433cc9f619ea"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:43:53 crc kubenswrapper[4715]: I1009 08:43:53.075316 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c272fa72-6434-4af1-8e2b-433cc9f619ea-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "c272fa72-6434-4af1-8e2b-433cc9f619ea" (UID: "c272fa72-6434-4af1-8e2b-433cc9f619ea"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:43:53 crc kubenswrapper[4715]: I1009 08:43:53.076608 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c272fa72-6434-4af1-8e2b-433cc9f619ea-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "c272fa72-6434-4af1-8e2b-433cc9f619ea" (UID: "c272fa72-6434-4af1-8e2b-433cc9f619ea"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:43:53 crc kubenswrapper[4715]: I1009 08:43:53.106169 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c272fa72-6434-4af1-8e2b-433cc9f619ea-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "c272fa72-6434-4af1-8e2b-433cc9f619ea" (UID: "c272fa72-6434-4af1-8e2b-433cc9f619ea"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:43:53 crc kubenswrapper[4715]: I1009 08:43:53.146748 4715 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 09 08:43:53 crc kubenswrapper[4715]: I1009 08:43:53.147607 4715 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c272fa72-6434-4af1-8e2b-433cc9f619ea-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 09 08:43:53 crc kubenswrapper[4715]: I1009 08:43:53.147651 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4v8g8\" (UniqueName: \"kubernetes.io/projected/c272fa72-6434-4af1-8e2b-433cc9f619ea-kube-api-access-4v8g8\") on node \"crc\" DevicePath \"\"" Oct 09 08:43:53 crc kubenswrapper[4715]: I1009 08:43:53.147667 4715 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/c272fa72-6434-4af1-8e2b-433cc9f619ea-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 09 08:43:53 crc kubenswrapper[4715]: I1009 08:43:53.147683 4715 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c272fa72-6434-4af1-8e2b-433cc9f619ea-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 09 08:43:53 crc kubenswrapper[4715]: I1009 08:43:53.147695 4715 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/c272fa72-6434-4af1-8e2b-433cc9f619ea-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 09 08:43:53 crc kubenswrapper[4715]: I1009 08:43:53.147704 4715 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c272fa72-6434-4af1-8e2b-433cc9f619ea-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 09 08:43:53 crc kubenswrapper[4715]: I1009 08:43:53.173627 4715 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 09 08:43:53 crc kubenswrapper[4715]: I1009 08:43:53.249648 4715 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 09 08:43:53 crc kubenswrapper[4715]: I1009 08:43:53.531897 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"c272fa72-6434-4af1-8e2b-433cc9f619ea","Type":"ContainerDied","Data":"3ee2dd16e78180b189ad7487103cae4c4000cb0d818a33b6877994fc50811dbc"} Oct 09 08:43:53 crc kubenswrapper[4715]: I1009 08:43:53.532224 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ee2dd16e78180b189ad7487103cae4c4000cb0d818a33b6877994fc50811dbc" Oct 09 08:43:53 crc kubenswrapper[4715]: I1009 08:43:53.531965 
4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 09 08:43:59 crc kubenswrapper[4715]: I1009 08:43:59.508035 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 09 08:43:59 crc kubenswrapper[4715]: E1009 08:43:59.510416 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa" containerName="extract-content" Oct 09 08:43:59 crc kubenswrapper[4715]: I1009 08:43:59.510637 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa" containerName="extract-content" Oct 09 08:43:59 crc kubenswrapper[4715]: E1009 08:43:59.510769 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70" containerName="registry-server" Oct 09 08:43:59 crc kubenswrapper[4715]: I1009 08:43:59.510879 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70" containerName="registry-server" Oct 09 08:43:59 crc kubenswrapper[4715]: E1009 08:43:59.511006 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="090a2152-0f48-49f8-8e7f-6250c014eeaa" containerName="extract-content" Oct 09 08:43:59 crc kubenswrapper[4715]: I1009 08:43:59.511108 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="090a2152-0f48-49f8-8e7f-6250c014eeaa" containerName="extract-content" Oct 09 08:43:59 crc kubenswrapper[4715]: E1009 08:43:59.511220 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa" containerName="extract-utilities" Oct 09 08:43:59 crc kubenswrapper[4715]: I1009 08:43:59.511329 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa" containerName="extract-utilities" Oct 09 08:43:59 crc kubenswrapper[4715]: E1009 08:43:59.511489 4715 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70" containerName="extract-utilities" Oct 09 08:43:59 crc kubenswrapper[4715]: I1009 08:43:59.511613 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70" containerName="extract-utilities" Oct 09 08:43:59 crc kubenswrapper[4715]: E1009 08:43:59.511706 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="090a2152-0f48-49f8-8e7f-6250c014eeaa" containerName="extract-utilities" Oct 09 08:43:59 crc kubenswrapper[4715]: I1009 08:43:59.511812 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="090a2152-0f48-49f8-8e7f-6250c014eeaa" containerName="extract-utilities" Oct 09 08:43:59 crc kubenswrapper[4715]: E1009 08:43:59.511906 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c272fa72-6434-4af1-8e2b-433cc9f619ea" containerName="tempest-tests-tempest-tests-runner" Oct 09 08:43:59 crc kubenswrapper[4715]: I1009 08:43:59.511987 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="c272fa72-6434-4af1-8e2b-433cc9f619ea" containerName="tempest-tests-tempest-tests-runner" Oct 09 08:43:59 crc kubenswrapper[4715]: E1009 08:43:59.512090 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa" containerName="registry-server" Oct 09 08:43:59 crc kubenswrapper[4715]: I1009 08:43:59.512167 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa" containerName="registry-server" Oct 09 08:43:59 crc kubenswrapper[4715]: E1009 08:43:59.512254 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70" containerName="extract-content" Oct 09 08:43:59 crc kubenswrapper[4715]: I1009 08:43:59.512331 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70" containerName="extract-content" Oct 09 08:43:59 crc kubenswrapper[4715]: E1009 
08:43:59.512438 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="090a2152-0f48-49f8-8e7f-6250c014eeaa" containerName="registry-server" Oct 09 08:43:59 crc kubenswrapper[4715]: I1009 08:43:59.512568 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="090a2152-0f48-49f8-8e7f-6250c014eeaa" containerName="registry-server" Oct 09 08:43:59 crc kubenswrapper[4715]: I1009 08:43:59.512983 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="fda744cf-8c51-4082-bdf4-9bf8ea7fb3aa" containerName="registry-server" Oct 09 08:43:59 crc kubenswrapper[4715]: I1009 08:43:59.513117 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="c272fa72-6434-4af1-8e2b-433cc9f619ea" containerName="tempest-tests-tempest-tests-runner" Oct 09 08:43:59 crc kubenswrapper[4715]: I1009 08:43:59.513208 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="090a2152-0f48-49f8-8e7f-6250c014eeaa" containerName="registry-server" Oct 09 08:43:59 crc kubenswrapper[4715]: I1009 08:43:59.513304 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="36a9a2b9-f2f8-45c2-9fab-ac04ccc68b70" containerName="registry-server" Oct 09 08:43:59 crc kubenswrapper[4715]: I1009 08:43:59.514174 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 08:43:59 crc kubenswrapper[4715]: I1009 08:43:59.517187 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 09 08:43:59 crc kubenswrapper[4715]: I1009 08:43:59.520410 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-z66mg" Oct 09 08:43:59 crc kubenswrapper[4715]: I1009 08:43:59.680743 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77885\" (UniqueName: \"kubernetes.io/projected/e6af6cd1-b4d1-4521-a954-460e613c51e1-kube-api-access-77885\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e6af6cd1-b4d1-4521-a954-460e613c51e1\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 08:43:59 crc kubenswrapper[4715]: I1009 08:43:59.681250 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e6af6cd1-b4d1-4521-a954-460e613c51e1\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 08:43:59 crc kubenswrapper[4715]: I1009 08:43:59.783729 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77885\" (UniqueName: \"kubernetes.io/projected/e6af6cd1-b4d1-4521-a954-460e613c51e1-kube-api-access-77885\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e6af6cd1-b4d1-4521-a954-460e613c51e1\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 08:43:59 crc kubenswrapper[4715]: I1009 08:43:59.784240 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e6af6cd1-b4d1-4521-a954-460e613c51e1\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 08:43:59 crc kubenswrapper[4715]: I1009 08:43:59.784560 4715 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e6af6cd1-b4d1-4521-a954-460e613c51e1\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 08:43:59 crc kubenswrapper[4715]: I1009 08:43:59.805945 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77885\" (UniqueName: \"kubernetes.io/projected/e6af6cd1-b4d1-4521-a954-460e613c51e1-kube-api-access-77885\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e6af6cd1-b4d1-4521-a954-460e613c51e1\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 08:43:59 crc kubenswrapper[4715]: I1009 08:43:59.811018 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e6af6cd1-b4d1-4521-a954-460e613c51e1\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 08:43:59 crc kubenswrapper[4715]: I1009 08:43:59.842206 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 09 08:44:00 crc kubenswrapper[4715]: I1009 08:44:00.316513 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 09 08:44:00 crc kubenswrapper[4715]: I1009 08:44:00.616395 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"e6af6cd1-b4d1-4521-a954-460e613c51e1","Type":"ContainerStarted","Data":"dfbb3617fb1540f636418beb734ceef6f84a9acc4ff1fdc76099a2c0186d1847"} Oct 09 08:44:02 crc kubenswrapper[4715]: I1009 08:44:02.643245 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"e6af6cd1-b4d1-4521-a954-460e613c51e1","Type":"ContainerStarted","Data":"9bf9161752f1f2dfabb6e493b2538e39117b0a95dea218dd3d2f530b65c2a4a2"} Oct 09 08:44:02 crc kubenswrapper[4715]: I1009 08:44:02.660677 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.414696776 podStartE2EDuration="3.660659147s" podCreationTimestamp="2025-10-09 08:43:59 +0000 UTC" firstStartedPulling="2025-10-09 08:44:00.322683292 +0000 UTC m=+3471.015487300" lastFinishedPulling="2025-10-09 08:44:01.568645633 +0000 UTC m=+3472.261449671" observedRunningTime="2025-10-09 08:44:02.659789502 +0000 UTC m=+3473.352593550" watchObservedRunningTime="2025-10-09 08:44:02.660659147 +0000 UTC m=+3473.353463155" Oct 09 08:44:18 crc kubenswrapper[4715]: I1009 08:44:18.533875 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-cp26h/must-gather-hqxds"] Oct 09 08:44:18 crc kubenswrapper[4715]: I1009 08:44:18.539510 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cp26h/must-gather-hqxds" Oct 09 08:44:18 crc kubenswrapper[4715]: I1009 08:44:18.548697 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-cp26h"/"kube-root-ca.crt" Oct 09 08:44:18 crc kubenswrapper[4715]: I1009 08:44:18.548977 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-cp26h"/"openshift-service-ca.crt" Oct 09 08:44:18 crc kubenswrapper[4715]: I1009 08:44:18.566378 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-cp26h/must-gather-hqxds"] Oct 09 08:44:18 crc kubenswrapper[4715]: I1009 08:44:18.704773 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8e4bcd81-50d9-4f8c-90e6-fa6cba352454-must-gather-output\") pod \"must-gather-hqxds\" (UID: \"8e4bcd81-50d9-4f8c-90e6-fa6cba352454\") " pod="openshift-must-gather-cp26h/must-gather-hqxds" Oct 09 08:44:18 crc kubenswrapper[4715]: I1009 08:44:18.704843 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5tl7\" (UniqueName: \"kubernetes.io/projected/8e4bcd81-50d9-4f8c-90e6-fa6cba352454-kube-api-access-c5tl7\") pod \"must-gather-hqxds\" (UID: \"8e4bcd81-50d9-4f8c-90e6-fa6cba352454\") " pod="openshift-must-gather-cp26h/must-gather-hqxds" Oct 09 08:44:18 crc kubenswrapper[4715]: I1009 08:44:18.806759 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8e4bcd81-50d9-4f8c-90e6-fa6cba352454-must-gather-output\") pod \"must-gather-hqxds\" (UID: \"8e4bcd81-50d9-4f8c-90e6-fa6cba352454\") " pod="openshift-must-gather-cp26h/must-gather-hqxds" Oct 09 08:44:18 crc kubenswrapper[4715]: I1009 08:44:18.806807 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-c5tl7\" (UniqueName: \"kubernetes.io/projected/8e4bcd81-50d9-4f8c-90e6-fa6cba352454-kube-api-access-c5tl7\") pod \"must-gather-hqxds\" (UID: \"8e4bcd81-50d9-4f8c-90e6-fa6cba352454\") " pod="openshift-must-gather-cp26h/must-gather-hqxds" Oct 09 08:44:18 crc kubenswrapper[4715]: I1009 08:44:18.807185 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8e4bcd81-50d9-4f8c-90e6-fa6cba352454-must-gather-output\") pod \"must-gather-hqxds\" (UID: \"8e4bcd81-50d9-4f8c-90e6-fa6cba352454\") " pod="openshift-must-gather-cp26h/must-gather-hqxds" Oct 09 08:44:18 crc kubenswrapper[4715]: I1009 08:44:18.825269 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5tl7\" (UniqueName: \"kubernetes.io/projected/8e4bcd81-50d9-4f8c-90e6-fa6cba352454-kube-api-access-c5tl7\") pod \"must-gather-hqxds\" (UID: \"8e4bcd81-50d9-4f8c-90e6-fa6cba352454\") " pod="openshift-must-gather-cp26h/must-gather-hqxds" Oct 09 08:44:18 crc kubenswrapper[4715]: I1009 08:44:18.887702 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cp26h/must-gather-hqxds" Oct 09 08:44:19 crc kubenswrapper[4715]: I1009 08:44:19.361037 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-cp26h/must-gather-hqxds"] Oct 09 08:44:19 crc kubenswrapper[4715]: I1009 08:44:19.815259 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cp26h/must-gather-hqxds" event={"ID":"8e4bcd81-50d9-4f8c-90e6-fa6cba352454","Type":"ContainerStarted","Data":"5dd0e3a28d15a8a36dd4eafd2b499b85c9c1816a8aad6341e2eac67b817f174f"} Oct 09 08:44:24 crc kubenswrapper[4715]: I1009 08:44:24.873339 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cp26h/must-gather-hqxds" event={"ID":"8e4bcd81-50d9-4f8c-90e6-fa6cba352454","Type":"ContainerStarted","Data":"b1eddd0e29e41dfafdb41e18d0ef3e133ed6f18795dc0edaafeb72b978305573"} Oct 09 08:44:24 crc kubenswrapper[4715]: I1009 08:44:24.874189 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cp26h/must-gather-hqxds" event={"ID":"8e4bcd81-50d9-4f8c-90e6-fa6cba352454","Type":"ContainerStarted","Data":"8af00bf2a950f5533dc93fd0cd37d6f99600af8b94e7acc36dd85da6c2984268"} Oct 09 08:44:24 crc kubenswrapper[4715]: I1009 08:44:24.897090 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-cp26h/must-gather-hqxds" podStartSLOduration=2.447543177 podStartE2EDuration="6.897075202s" podCreationTimestamp="2025-10-09 08:44:18 +0000 UTC" firstStartedPulling="2025-10-09 08:44:19.366992081 +0000 UTC m=+3490.059796089" lastFinishedPulling="2025-10-09 08:44:23.816524106 +0000 UTC m=+3494.509328114" observedRunningTime="2025-10-09 08:44:24.894430046 +0000 UTC m=+3495.587234054" watchObservedRunningTime="2025-10-09 08:44:24.897075202 +0000 UTC m=+3495.589879210" Oct 09 08:44:27 crc kubenswrapper[4715]: I1009 08:44:27.610107 4715 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-cp26h/crc-debug-k7c7r"] Oct 09 08:44:27 crc kubenswrapper[4715]: I1009 08:44:27.611617 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cp26h/crc-debug-k7c7r" Oct 09 08:44:27 crc kubenswrapper[4715]: I1009 08:44:27.619342 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-cp26h"/"default-dockercfg-5hf92" Oct 09 08:44:27 crc kubenswrapper[4715]: I1009 08:44:27.784239 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6j4c\" (UniqueName: \"kubernetes.io/projected/c0712111-0b3b-47fb-9c39-f59828f8f6c9-kube-api-access-t6j4c\") pod \"crc-debug-k7c7r\" (UID: \"c0712111-0b3b-47fb-9c39-f59828f8f6c9\") " pod="openshift-must-gather-cp26h/crc-debug-k7c7r" Oct 09 08:44:27 crc kubenswrapper[4715]: I1009 08:44:27.784344 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c0712111-0b3b-47fb-9c39-f59828f8f6c9-host\") pod \"crc-debug-k7c7r\" (UID: \"c0712111-0b3b-47fb-9c39-f59828f8f6c9\") " pod="openshift-must-gather-cp26h/crc-debug-k7c7r" Oct 09 08:44:27 crc kubenswrapper[4715]: I1009 08:44:27.886104 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6j4c\" (UniqueName: \"kubernetes.io/projected/c0712111-0b3b-47fb-9c39-f59828f8f6c9-kube-api-access-t6j4c\") pod \"crc-debug-k7c7r\" (UID: \"c0712111-0b3b-47fb-9c39-f59828f8f6c9\") " pod="openshift-must-gather-cp26h/crc-debug-k7c7r" Oct 09 08:44:27 crc kubenswrapper[4715]: I1009 08:44:27.886187 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c0712111-0b3b-47fb-9c39-f59828f8f6c9-host\") pod \"crc-debug-k7c7r\" (UID: \"c0712111-0b3b-47fb-9c39-f59828f8f6c9\") " pod="openshift-must-gather-cp26h/crc-debug-k7c7r" Oct 09 08:44:27 crc 
kubenswrapper[4715]: I1009 08:44:27.886389 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c0712111-0b3b-47fb-9c39-f59828f8f6c9-host\") pod \"crc-debug-k7c7r\" (UID: \"c0712111-0b3b-47fb-9c39-f59828f8f6c9\") " pod="openshift-must-gather-cp26h/crc-debug-k7c7r" Oct 09 08:44:27 crc kubenswrapper[4715]: I1009 08:44:27.909240 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6j4c\" (UniqueName: \"kubernetes.io/projected/c0712111-0b3b-47fb-9c39-f59828f8f6c9-kube-api-access-t6j4c\") pod \"crc-debug-k7c7r\" (UID: \"c0712111-0b3b-47fb-9c39-f59828f8f6c9\") " pod="openshift-must-gather-cp26h/crc-debug-k7c7r" Oct 09 08:44:27 crc kubenswrapper[4715]: I1009 08:44:27.931269 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cp26h/crc-debug-k7c7r" Oct 09 08:44:28 crc kubenswrapper[4715]: I1009 08:44:28.904525 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cp26h/crc-debug-k7c7r" event={"ID":"c0712111-0b3b-47fb-9c39-f59828f8f6c9","Type":"ContainerStarted","Data":"68dbcfaf9530dcfc1caaf578cdacaf34df83728c3097970e08b9ed6203755e3e"} Oct 09 08:44:40 crc kubenswrapper[4715]: I1009 08:44:40.004774 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cp26h/crc-debug-k7c7r" event={"ID":"c0712111-0b3b-47fb-9c39-f59828f8f6c9","Type":"ContainerStarted","Data":"b372beb77a8ff199d16ee51610cb3281d79d89143ac305b6c5f88ec41b440cba"} Oct 09 08:44:40 crc kubenswrapper[4715]: I1009 08:44:40.018737 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-cp26h/crc-debug-k7c7r" podStartSLOduration=1.749926732 podStartE2EDuration="13.018715799s" podCreationTimestamp="2025-10-09 08:44:27 +0000 UTC" firstStartedPulling="2025-10-09 08:44:27.960044094 +0000 UTC m=+3498.652848102" lastFinishedPulling="2025-10-09 08:44:39.228833161 +0000 UTC 
m=+3509.921637169" observedRunningTime="2025-10-09 08:44:40.017935077 +0000 UTC m=+3510.710739105" watchObservedRunningTime="2025-10-09 08:44:40.018715799 +0000 UTC m=+3510.711519817" Oct 09 08:45:00 crc kubenswrapper[4715]: I1009 08:45:00.200722 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333325-7shgm"] Oct 09 08:45:00 crc kubenswrapper[4715]: I1009 08:45:00.202495 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333325-7shgm" Oct 09 08:45:00 crc kubenswrapper[4715]: I1009 08:45:00.205552 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 09 08:45:00 crc kubenswrapper[4715]: I1009 08:45:00.205566 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 09 08:45:00 crc kubenswrapper[4715]: I1009 08:45:00.234420 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333325-7shgm"] Oct 09 08:45:00 crc kubenswrapper[4715]: I1009 08:45:00.246635 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6v26\" (UniqueName: \"kubernetes.io/projected/aee77b54-8f27-4b31-8f55-576a680ac0db-kube-api-access-s6v26\") pod \"collect-profiles-29333325-7shgm\" (UID: \"aee77b54-8f27-4b31-8f55-576a680ac0db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333325-7shgm" Oct 09 08:45:00 crc kubenswrapper[4715]: I1009 08:45:00.246707 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aee77b54-8f27-4b31-8f55-576a680ac0db-config-volume\") pod \"collect-profiles-29333325-7shgm\" (UID: 
\"aee77b54-8f27-4b31-8f55-576a680ac0db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333325-7shgm" Oct 09 08:45:00 crc kubenswrapper[4715]: I1009 08:45:00.246738 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aee77b54-8f27-4b31-8f55-576a680ac0db-secret-volume\") pod \"collect-profiles-29333325-7shgm\" (UID: \"aee77b54-8f27-4b31-8f55-576a680ac0db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333325-7shgm" Oct 09 08:45:00 crc kubenswrapper[4715]: I1009 08:45:00.348678 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6v26\" (UniqueName: \"kubernetes.io/projected/aee77b54-8f27-4b31-8f55-576a680ac0db-kube-api-access-s6v26\") pod \"collect-profiles-29333325-7shgm\" (UID: \"aee77b54-8f27-4b31-8f55-576a680ac0db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333325-7shgm" Oct 09 08:45:00 crc kubenswrapper[4715]: I1009 08:45:00.348760 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aee77b54-8f27-4b31-8f55-576a680ac0db-config-volume\") pod \"collect-profiles-29333325-7shgm\" (UID: \"aee77b54-8f27-4b31-8f55-576a680ac0db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333325-7shgm" Oct 09 08:45:00 crc kubenswrapper[4715]: I1009 08:45:00.348799 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aee77b54-8f27-4b31-8f55-576a680ac0db-secret-volume\") pod \"collect-profiles-29333325-7shgm\" (UID: \"aee77b54-8f27-4b31-8f55-576a680ac0db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333325-7shgm" Oct 09 08:45:00 crc kubenswrapper[4715]: I1009 08:45:00.350769 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/aee77b54-8f27-4b31-8f55-576a680ac0db-config-volume\") pod \"collect-profiles-29333325-7shgm\" (UID: \"aee77b54-8f27-4b31-8f55-576a680ac0db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333325-7shgm" Oct 09 08:45:00 crc kubenswrapper[4715]: I1009 08:45:00.359777 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aee77b54-8f27-4b31-8f55-576a680ac0db-secret-volume\") pod \"collect-profiles-29333325-7shgm\" (UID: \"aee77b54-8f27-4b31-8f55-576a680ac0db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333325-7shgm" Oct 09 08:45:00 crc kubenswrapper[4715]: I1009 08:45:00.373090 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6v26\" (UniqueName: \"kubernetes.io/projected/aee77b54-8f27-4b31-8f55-576a680ac0db-kube-api-access-s6v26\") pod \"collect-profiles-29333325-7shgm\" (UID: \"aee77b54-8f27-4b31-8f55-576a680ac0db\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29333325-7shgm" Oct 09 08:45:00 crc kubenswrapper[4715]: I1009 08:45:00.530879 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333325-7shgm" Oct 09 08:45:01 crc kubenswrapper[4715]: I1009 08:45:01.039531 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333325-7shgm"] Oct 09 08:45:01 crc kubenswrapper[4715]: I1009 08:45:01.206453 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333325-7shgm" event={"ID":"aee77b54-8f27-4b31-8f55-576a680ac0db","Type":"ContainerStarted","Data":"78af540b84c9169b1170102612f1472e3917744da8dd42ac07fbe64fd32701a7"} Oct 09 08:45:02 crc kubenswrapper[4715]: I1009 08:45:02.216843 4715 generic.go:334] "Generic (PLEG): container finished" podID="aee77b54-8f27-4b31-8f55-576a680ac0db" containerID="a4671784df1cb1bb6aeb68d926c33efc921a7c254e152351a924dac6224e2efb" exitCode=0 Oct 09 08:45:02 crc kubenswrapper[4715]: I1009 08:45:02.217055 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333325-7shgm" event={"ID":"aee77b54-8f27-4b31-8f55-576a680ac0db","Type":"ContainerDied","Data":"a4671784df1cb1bb6aeb68d926c33efc921a7c254e152351a924dac6224e2efb"} Oct 09 08:45:03 crc kubenswrapper[4715]: I1009 08:45:03.680747 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333325-7shgm" Oct 09 08:45:03 crc kubenswrapper[4715]: I1009 08:45:03.720259 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6v26\" (UniqueName: \"kubernetes.io/projected/aee77b54-8f27-4b31-8f55-576a680ac0db-kube-api-access-s6v26\") pod \"aee77b54-8f27-4b31-8f55-576a680ac0db\" (UID: \"aee77b54-8f27-4b31-8f55-576a680ac0db\") " Oct 09 08:45:03 crc kubenswrapper[4715]: I1009 08:45:03.720373 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aee77b54-8f27-4b31-8f55-576a680ac0db-secret-volume\") pod \"aee77b54-8f27-4b31-8f55-576a680ac0db\" (UID: \"aee77b54-8f27-4b31-8f55-576a680ac0db\") " Oct 09 08:45:03 crc kubenswrapper[4715]: I1009 08:45:03.720505 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aee77b54-8f27-4b31-8f55-576a680ac0db-config-volume\") pod \"aee77b54-8f27-4b31-8f55-576a680ac0db\" (UID: \"aee77b54-8f27-4b31-8f55-576a680ac0db\") " Oct 09 08:45:03 crc kubenswrapper[4715]: I1009 08:45:03.722005 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aee77b54-8f27-4b31-8f55-576a680ac0db-config-volume" (OuterVolumeSpecName: "config-volume") pod "aee77b54-8f27-4b31-8f55-576a680ac0db" (UID: "aee77b54-8f27-4b31-8f55-576a680ac0db"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 09 08:45:03 crc kubenswrapper[4715]: I1009 08:45:03.727004 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aee77b54-8f27-4b31-8f55-576a680ac0db-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "aee77b54-8f27-4b31-8f55-576a680ac0db" (UID: "aee77b54-8f27-4b31-8f55-576a680ac0db"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 09 08:45:03 crc kubenswrapper[4715]: I1009 08:45:03.728463 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aee77b54-8f27-4b31-8f55-576a680ac0db-kube-api-access-s6v26" (OuterVolumeSpecName: "kube-api-access-s6v26") pod "aee77b54-8f27-4b31-8f55-576a680ac0db" (UID: "aee77b54-8f27-4b31-8f55-576a680ac0db"). InnerVolumeSpecName "kube-api-access-s6v26". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:45:03 crc kubenswrapper[4715]: I1009 08:45:03.823042 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6v26\" (UniqueName: \"kubernetes.io/projected/aee77b54-8f27-4b31-8f55-576a680ac0db-kube-api-access-s6v26\") on node \"crc\" DevicePath \"\"" Oct 09 08:45:03 crc kubenswrapper[4715]: I1009 08:45:03.823370 4715 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aee77b54-8f27-4b31-8f55-576a680ac0db-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 09 08:45:03 crc kubenswrapper[4715]: I1009 08:45:03.823382 4715 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aee77b54-8f27-4b31-8f55-576a680ac0db-config-volume\") on node \"crc\" DevicePath \"\"" Oct 09 08:45:04 crc kubenswrapper[4715]: I1009 08:45:04.247237 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29333325-7shgm" event={"ID":"aee77b54-8f27-4b31-8f55-576a680ac0db","Type":"ContainerDied","Data":"78af540b84c9169b1170102612f1472e3917744da8dd42ac07fbe64fd32701a7"} Oct 09 08:45:04 crc kubenswrapper[4715]: I1009 08:45:04.247284 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78af540b84c9169b1170102612f1472e3917744da8dd42ac07fbe64fd32701a7" Oct 09 08:45:04 crc kubenswrapper[4715]: I1009 08:45:04.247298 4715 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29333325-7shgm" Oct 09 08:45:04 crc kubenswrapper[4715]: I1009 08:45:04.762074 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333280-zqp6l"] Oct 09 08:45:04 crc kubenswrapper[4715]: I1009 08:45:04.769801 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29333280-zqp6l"] Oct 09 08:45:06 crc kubenswrapper[4715]: I1009 08:45:06.148061 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d35a0e3-8f99-408e-9f03-eb29d705d730" path="/var/lib/kubelet/pods/4d35a0e3-8f99-408e-9f03-eb29d705d730/volumes" Oct 09 08:45:14 crc kubenswrapper[4715]: I1009 08:45:14.811273 4715 scope.go:117] "RemoveContainer" containerID="bd8419cb68b37888d8ef166693a5caa4076baef592a79621b4ffcf8238f4694c" Oct 09 08:45:16 crc kubenswrapper[4715]: I1009 08:45:16.357865 4715 generic.go:334] "Generic (PLEG): container finished" podID="c0712111-0b3b-47fb-9c39-f59828f8f6c9" containerID="b372beb77a8ff199d16ee51610cb3281d79d89143ac305b6c5f88ec41b440cba" exitCode=0 Oct 09 08:45:16 crc kubenswrapper[4715]: I1009 08:45:16.357956 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cp26h/crc-debug-k7c7r" event={"ID":"c0712111-0b3b-47fb-9c39-f59828f8f6c9","Type":"ContainerDied","Data":"b372beb77a8ff199d16ee51610cb3281d79d89143ac305b6c5f88ec41b440cba"} Oct 09 08:45:16 crc kubenswrapper[4715]: I1009 08:45:16.753538 4715 patch_prober.go:28] interesting pod/machine-config-daemon-k7vwx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 08:45:16 crc kubenswrapper[4715]: I1009 08:45:16.753589 4715 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 08:45:17 crc kubenswrapper[4715]: I1009 08:45:17.461885 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cp26h/crc-debug-k7c7r" Oct 09 08:45:17 crc kubenswrapper[4715]: I1009 08:45:17.496733 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-cp26h/crc-debug-k7c7r"] Oct 09 08:45:17 crc kubenswrapper[4715]: I1009 08:45:17.506114 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-cp26h/crc-debug-k7c7r"] Oct 09 08:45:17 crc kubenswrapper[4715]: I1009 08:45:17.586994 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6j4c\" (UniqueName: \"kubernetes.io/projected/c0712111-0b3b-47fb-9c39-f59828f8f6c9-kube-api-access-t6j4c\") pod \"c0712111-0b3b-47fb-9c39-f59828f8f6c9\" (UID: \"c0712111-0b3b-47fb-9c39-f59828f8f6c9\") " Oct 09 08:45:17 crc kubenswrapper[4715]: I1009 08:45:17.587103 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c0712111-0b3b-47fb-9c39-f59828f8f6c9-host\") pod \"c0712111-0b3b-47fb-9c39-f59828f8f6c9\" (UID: \"c0712111-0b3b-47fb-9c39-f59828f8f6c9\") " Oct 09 08:45:17 crc kubenswrapper[4715]: I1009 08:45:17.587577 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c0712111-0b3b-47fb-9c39-f59828f8f6c9-host" (OuterVolumeSpecName: "host") pod "c0712111-0b3b-47fb-9c39-f59828f8f6c9" (UID: "c0712111-0b3b-47fb-9c39-f59828f8f6c9"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 08:45:17 crc kubenswrapper[4715]: I1009 08:45:17.587816 4715 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c0712111-0b3b-47fb-9c39-f59828f8f6c9-host\") on node \"crc\" DevicePath \"\"" Oct 09 08:45:17 crc kubenswrapper[4715]: I1009 08:45:17.595114 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0712111-0b3b-47fb-9c39-f59828f8f6c9-kube-api-access-t6j4c" (OuterVolumeSpecName: "kube-api-access-t6j4c") pod "c0712111-0b3b-47fb-9c39-f59828f8f6c9" (UID: "c0712111-0b3b-47fb-9c39-f59828f8f6c9"). InnerVolumeSpecName "kube-api-access-t6j4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:45:17 crc kubenswrapper[4715]: I1009 08:45:17.689970 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6j4c\" (UniqueName: \"kubernetes.io/projected/c0712111-0b3b-47fb-9c39-f59828f8f6c9-kube-api-access-t6j4c\") on node \"crc\" DevicePath \"\"" Oct 09 08:45:18 crc kubenswrapper[4715]: I1009 08:45:18.147733 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0712111-0b3b-47fb-9c39-f59828f8f6c9" path="/var/lib/kubelet/pods/c0712111-0b3b-47fb-9c39-f59828f8f6c9/volumes" Oct 09 08:45:18 crc kubenswrapper[4715]: I1009 08:45:18.374804 4715 scope.go:117] "RemoveContainer" containerID="b372beb77a8ff199d16ee51610cb3281d79d89143ac305b6c5f88ec41b440cba" Oct 09 08:45:18 crc kubenswrapper[4715]: I1009 08:45:18.374972 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cp26h/crc-debug-k7c7r" Oct 09 08:45:18 crc kubenswrapper[4715]: I1009 08:45:18.672086 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-cp26h/crc-debug-cbhwt"] Oct 09 08:45:18 crc kubenswrapper[4715]: E1009 08:45:18.681827 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aee77b54-8f27-4b31-8f55-576a680ac0db" containerName="collect-profiles" Oct 09 08:45:18 crc kubenswrapper[4715]: I1009 08:45:18.681861 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="aee77b54-8f27-4b31-8f55-576a680ac0db" containerName="collect-profiles" Oct 09 08:45:18 crc kubenswrapper[4715]: E1009 08:45:18.681881 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0712111-0b3b-47fb-9c39-f59828f8f6c9" containerName="container-00" Oct 09 08:45:18 crc kubenswrapper[4715]: I1009 08:45:18.681887 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0712111-0b3b-47fb-9c39-f59828f8f6c9" containerName="container-00" Oct 09 08:45:18 crc kubenswrapper[4715]: I1009 08:45:18.682092 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="aee77b54-8f27-4b31-8f55-576a680ac0db" containerName="collect-profiles" Oct 09 08:45:18 crc kubenswrapper[4715]: I1009 08:45:18.682106 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0712111-0b3b-47fb-9c39-f59828f8f6c9" containerName="container-00" Oct 09 08:45:18 crc kubenswrapper[4715]: I1009 08:45:18.682812 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cp26h/crc-debug-cbhwt" Oct 09 08:45:18 crc kubenswrapper[4715]: I1009 08:45:18.685721 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-cp26h"/"default-dockercfg-5hf92" Oct 09 08:45:18 crc kubenswrapper[4715]: I1009 08:45:18.813849 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7f5a6042-527f-4ff9-bb80-194fceb9baf7-host\") pod \"crc-debug-cbhwt\" (UID: \"7f5a6042-527f-4ff9-bb80-194fceb9baf7\") " pod="openshift-must-gather-cp26h/crc-debug-cbhwt" Oct 09 08:45:18 crc kubenswrapper[4715]: I1009 08:45:18.813908 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5x7b\" (UniqueName: \"kubernetes.io/projected/7f5a6042-527f-4ff9-bb80-194fceb9baf7-kube-api-access-m5x7b\") pod \"crc-debug-cbhwt\" (UID: \"7f5a6042-527f-4ff9-bb80-194fceb9baf7\") " pod="openshift-must-gather-cp26h/crc-debug-cbhwt" Oct 09 08:45:18 crc kubenswrapper[4715]: I1009 08:45:18.916212 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7f5a6042-527f-4ff9-bb80-194fceb9baf7-host\") pod \"crc-debug-cbhwt\" (UID: \"7f5a6042-527f-4ff9-bb80-194fceb9baf7\") " pod="openshift-must-gather-cp26h/crc-debug-cbhwt" Oct 09 08:45:18 crc kubenswrapper[4715]: I1009 08:45:18.916273 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5x7b\" (UniqueName: \"kubernetes.io/projected/7f5a6042-527f-4ff9-bb80-194fceb9baf7-kube-api-access-m5x7b\") pod \"crc-debug-cbhwt\" (UID: \"7f5a6042-527f-4ff9-bb80-194fceb9baf7\") " pod="openshift-must-gather-cp26h/crc-debug-cbhwt" Oct 09 08:45:18 crc kubenswrapper[4715]: I1009 08:45:18.916353 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/7f5a6042-527f-4ff9-bb80-194fceb9baf7-host\") pod \"crc-debug-cbhwt\" (UID: \"7f5a6042-527f-4ff9-bb80-194fceb9baf7\") " pod="openshift-must-gather-cp26h/crc-debug-cbhwt" Oct 09 08:45:18 crc kubenswrapper[4715]: I1009 08:45:18.939129 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5x7b\" (UniqueName: \"kubernetes.io/projected/7f5a6042-527f-4ff9-bb80-194fceb9baf7-kube-api-access-m5x7b\") pod \"crc-debug-cbhwt\" (UID: \"7f5a6042-527f-4ff9-bb80-194fceb9baf7\") " pod="openshift-must-gather-cp26h/crc-debug-cbhwt" Oct 09 08:45:19 crc kubenswrapper[4715]: I1009 08:45:19.003273 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cp26h/crc-debug-cbhwt" Oct 09 08:45:19 crc kubenswrapper[4715]: I1009 08:45:19.385034 4715 generic.go:334] "Generic (PLEG): container finished" podID="7f5a6042-527f-4ff9-bb80-194fceb9baf7" containerID="82db4e19a3d6f670e81def34f878af114af8ebe6ea5f37ee12be3f0dbdea624d" exitCode=0 Oct 09 08:45:19 crc kubenswrapper[4715]: I1009 08:45:19.385115 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cp26h/crc-debug-cbhwt" event={"ID":"7f5a6042-527f-4ff9-bb80-194fceb9baf7","Type":"ContainerDied","Data":"82db4e19a3d6f670e81def34f878af114af8ebe6ea5f37ee12be3f0dbdea624d"} Oct 09 08:45:19 crc kubenswrapper[4715]: I1009 08:45:19.385364 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cp26h/crc-debug-cbhwt" event={"ID":"7f5a6042-527f-4ff9-bb80-194fceb9baf7","Type":"ContainerStarted","Data":"ec88cec04bee5580a505c53febddce46dd3818e3bf712abde8fb4273696b4386"} Oct 09 08:45:19 crc kubenswrapper[4715]: I1009 08:45:19.807513 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-cp26h/crc-debug-cbhwt"] Oct 09 08:45:19 crc kubenswrapper[4715]: I1009 08:45:19.816972 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-cp26h/crc-debug-cbhwt"] Oct 09 08:45:20 crc kubenswrapper[4715]: I1009 08:45:20.506101 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cp26h/crc-debug-cbhwt" Oct 09 08:45:20 crc kubenswrapper[4715]: I1009 08:45:20.647122 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5x7b\" (UniqueName: \"kubernetes.io/projected/7f5a6042-527f-4ff9-bb80-194fceb9baf7-kube-api-access-m5x7b\") pod \"7f5a6042-527f-4ff9-bb80-194fceb9baf7\" (UID: \"7f5a6042-527f-4ff9-bb80-194fceb9baf7\") " Oct 09 08:45:20 crc kubenswrapper[4715]: I1009 08:45:20.647253 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7f5a6042-527f-4ff9-bb80-194fceb9baf7-host\") pod \"7f5a6042-527f-4ff9-bb80-194fceb9baf7\" (UID: \"7f5a6042-527f-4ff9-bb80-194fceb9baf7\") " Oct 09 08:45:20 crc kubenswrapper[4715]: I1009 08:45:20.648061 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7f5a6042-527f-4ff9-bb80-194fceb9baf7-host" (OuterVolumeSpecName: "host") pod "7f5a6042-527f-4ff9-bb80-194fceb9baf7" (UID: "7f5a6042-527f-4ff9-bb80-194fceb9baf7"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 08:45:20 crc kubenswrapper[4715]: I1009 08:45:20.653096 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f5a6042-527f-4ff9-bb80-194fceb9baf7-kube-api-access-m5x7b" (OuterVolumeSpecName: "kube-api-access-m5x7b") pod "7f5a6042-527f-4ff9-bb80-194fceb9baf7" (UID: "7f5a6042-527f-4ff9-bb80-194fceb9baf7"). InnerVolumeSpecName "kube-api-access-m5x7b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:45:20 crc kubenswrapper[4715]: I1009 08:45:20.749898 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5x7b\" (UniqueName: \"kubernetes.io/projected/7f5a6042-527f-4ff9-bb80-194fceb9baf7-kube-api-access-m5x7b\") on node \"crc\" DevicePath \"\"" Oct 09 08:45:20 crc kubenswrapper[4715]: I1009 08:45:20.749936 4715 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7f5a6042-527f-4ff9-bb80-194fceb9baf7-host\") on node \"crc\" DevicePath \"\"" Oct 09 08:45:20 crc kubenswrapper[4715]: I1009 08:45:20.992154 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-cp26h/crc-debug-vxwnz"] Oct 09 08:45:20 crc kubenswrapper[4715]: E1009 08:45:20.993310 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f5a6042-527f-4ff9-bb80-194fceb9baf7" containerName="container-00" Oct 09 08:45:20 crc kubenswrapper[4715]: I1009 08:45:20.993338 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f5a6042-527f-4ff9-bb80-194fceb9baf7" containerName="container-00" Oct 09 08:45:20 crc kubenswrapper[4715]: I1009 08:45:20.993794 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f5a6042-527f-4ff9-bb80-194fceb9baf7" containerName="container-00" Oct 09 08:45:20 crc kubenswrapper[4715]: I1009 08:45:20.995106 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cp26h/crc-debug-vxwnz" Oct 09 08:45:21 crc kubenswrapper[4715]: I1009 08:45:21.055470 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thpsd\" (UniqueName: \"kubernetes.io/projected/f07a752a-d111-447d-98a3-bb52336d8779-kube-api-access-thpsd\") pod \"crc-debug-vxwnz\" (UID: \"f07a752a-d111-447d-98a3-bb52336d8779\") " pod="openshift-must-gather-cp26h/crc-debug-vxwnz" Oct 09 08:45:21 crc kubenswrapper[4715]: I1009 08:45:21.055579 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f07a752a-d111-447d-98a3-bb52336d8779-host\") pod \"crc-debug-vxwnz\" (UID: \"f07a752a-d111-447d-98a3-bb52336d8779\") " pod="openshift-must-gather-cp26h/crc-debug-vxwnz" Oct 09 08:45:21 crc kubenswrapper[4715]: I1009 08:45:21.158091 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thpsd\" (UniqueName: \"kubernetes.io/projected/f07a752a-d111-447d-98a3-bb52336d8779-kube-api-access-thpsd\") pod \"crc-debug-vxwnz\" (UID: \"f07a752a-d111-447d-98a3-bb52336d8779\") " pod="openshift-must-gather-cp26h/crc-debug-vxwnz" Oct 09 08:45:21 crc kubenswrapper[4715]: I1009 08:45:21.158618 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f07a752a-d111-447d-98a3-bb52336d8779-host\") pod \"crc-debug-vxwnz\" (UID: \"f07a752a-d111-447d-98a3-bb52336d8779\") " pod="openshift-must-gather-cp26h/crc-debug-vxwnz" Oct 09 08:45:21 crc kubenswrapper[4715]: I1009 08:45:21.158747 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f07a752a-d111-447d-98a3-bb52336d8779-host\") pod \"crc-debug-vxwnz\" (UID: \"f07a752a-d111-447d-98a3-bb52336d8779\") " pod="openshift-must-gather-cp26h/crc-debug-vxwnz" Oct 09 08:45:21 crc 
kubenswrapper[4715]: I1009 08:45:21.192305 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thpsd\" (UniqueName: \"kubernetes.io/projected/f07a752a-d111-447d-98a3-bb52336d8779-kube-api-access-thpsd\") pod \"crc-debug-vxwnz\" (UID: \"f07a752a-d111-447d-98a3-bb52336d8779\") " pod="openshift-must-gather-cp26h/crc-debug-vxwnz" Oct 09 08:45:21 crc kubenswrapper[4715]: I1009 08:45:21.320372 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cp26h/crc-debug-vxwnz" Oct 09 08:45:21 crc kubenswrapper[4715]: W1009 08:45:21.352384 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf07a752a_d111_447d_98a3_bb52336d8779.slice/crio-05398497a21ec139e049271cb85d06fab914ac2e4035bf02d1233a5626123410 WatchSource:0}: Error finding container 05398497a21ec139e049271cb85d06fab914ac2e4035bf02d1233a5626123410: Status 404 returned error can't find the container with id 05398497a21ec139e049271cb85d06fab914ac2e4035bf02d1233a5626123410 Oct 09 08:45:21 crc kubenswrapper[4715]: I1009 08:45:21.405928 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cp26h/crc-debug-cbhwt" Oct 09 08:45:21 crc kubenswrapper[4715]: I1009 08:45:21.405932 4715 scope.go:117] "RemoveContainer" containerID="82db4e19a3d6f670e81def34f878af114af8ebe6ea5f37ee12be3f0dbdea624d" Oct 09 08:45:21 crc kubenswrapper[4715]: I1009 08:45:21.407571 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cp26h/crc-debug-vxwnz" event={"ID":"f07a752a-d111-447d-98a3-bb52336d8779","Type":"ContainerStarted","Data":"05398497a21ec139e049271cb85d06fab914ac2e4035bf02d1233a5626123410"} Oct 09 08:45:22 crc kubenswrapper[4715]: I1009 08:45:22.151187 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f5a6042-527f-4ff9-bb80-194fceb9baf7" path="/var/lib/kubelet/pods/7f5a6042-527f-4ff9-bb80-194fceb9baf7/volumes" Oct 09 08:45:22 crc kubenswrapper[4715]: I1009 08:45:22.422977 4715 generic.go:334] "Generic (PLEG): container finished" podID="f07a752a-d111-447d-98a3-bb52336d8779" containerID="340af6fe54530923e750ac2b4a0328d2d6a401776fb2c9f6a28d4fe905cb7d1c" exitCode=0 Oct 09 08:45:22 crc kubenswrapper[4715]: I1009 08:45:22.423035 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cp26h/crc-debug-vxwnz" event={"ID":"f07a752a-d111-447d-98a3-bb52336d8779","Type":"ContainerDied","Data":"340af6fe54530923e750ac2b4a0328d2d6a401776fb2c9f6a28d4fe905cb7d1c"} Oct 09 08:45:22 crc kubenswrapper[4715]: I1009 08:45:22.466579 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-cp26h/crc-debug-vxwnz"] Oct 09 08:45:22 crc kubenswrapper[4715]: I1009 08:45:22.477799 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-cp26h/crc-debug-vxwnz"] Oct 09 08:45:23 crc kubenswrapper[4715]: I1009 08:45:23.552022 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cp26h/crc-debug-vxwnz" Oct 09 08:45:23 crc kubenswrapper[4715]: I1009 08:45:23.701913 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f07a752a-d111-447d-98a3-bb52336d8779-host\") pod \"f07a752a-d111-447d-98a3-bb52336d8779\" (UID: \"f07a752a-d111-447d-98a3-bb52336d8779\") " Oct 09 08:45:23 crc kubenswrapper[4715]: I1009 08:45:23.702243 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thpsd\" (UniqueName: \"kubernetes.io/projected/f07a752a-d111-447d-98a3-bb52336d8779-kube-api-access-thpsd\") pod \"f07a752a-d111-447d-98a3-bb52336d8779\" (UID: \"f07a752a-d111-447d-98a3-bb52336d8779\") " Oct 09 08:45:23 crc kubenswrapper[4715]: I1009 08:45:23.701995 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f07a752a-d111-447d-98a3-bb52336d8779-host" (OuterVolumeSpecName: "host") pod "f07a752a-d111-447d-98a3-bb52336d8779" (UID: "f07a752a-d111-447d-98a3-bb52336d8779"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 08:45:23 crc kubenswrapper[4715]: I1009 08:45:23.702777 4715 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f07a752a-d111-447d-98a3-bb52336d8779-host\") on node \"crc\" DevicePath \"\"" Oct 09 08:45:23 crc kubenswrapper[4715]: I1009 08:45:23.707310 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f07a752a-d111-447d-98a3-bb52336d8779-kube-api-access-thpsd" (OuterVolumeSpecName: "kube-api-access-thpsd") pod "f07a752a-d111-447d-98a3-bb52336d8779" (UID: "f07a752a-d111-447d-98a3-bb52336d8779"). InnerVolumeSpecName "kube-api-access-thpsd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:45:23 crc kubenswrapper[4715]: I1009 08:45:23.804873 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thpsd\" (UniqueName: \"kubernetes.io/projected/f07a752a-d111-447d-98a3-bb52336d8779-kube-api-access-thpsd\") on node \"crc\" DevicePath \"\"" Oct 09 08:45:24 crc kubenswrapper[4715]: I1009 08:45:24.150193 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f07a752a-d111-447d-98a3-bb52336d8779" path="/var/lib/kubelet/pods/f07a752a-d111-447d-98a3-bb52336d8779/volumes" Oct 09 08:45:24 crc kubenswrapper[4715]: I1009 08:45:24.444909 4715 scope.go:117] "RemoveContainer" containerID="340af6fe54530923e750ac2b4a0328d2d6a401776fb2c9f6a28d4fe905cb7d1c" Oct 09 08:45:24 crc kubenswrapper[4715]: I1009 08:45:24.444951 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cp26h/crc-debug-vxwnz" Oct 09 08:45:24 crc kubenswrapper[4715]: I1009 08:45:24.943674 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-67f46ddd46-qcgrk_cb469864-9053-4551-bca7-f3b67a20bf52/barbican-api/0.log" Oct 09 08:45:25 crc kubenswrapper[4715]: I1009 08:45:25.050406 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-67f46ddd46-qcgrk_cb469864-9053-4551-bca7-f3b67a20bf52/barbican-api-log/0.log" Oct 09 08:45:25 crc kubenswrapper[4715]: I1009 08:45:25.124163 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-58cfd869c4-4djjf_50cf187d-781d-49b7-840b-8dfc3366135f/barbican-keystone-listener/0.log" Oct 09 08:45:25 crc kubenswrapper[4715]: I1009 08:45:25.196077 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-58cfd869c4-4djjf_50cf187d-781d-49b7-840b-8dfc3366135f/barbican-keystone-listener-log/0.log" Oct 09 08:45:25 crc kubenswrapper[4715]: I1009 08:45:25.320145 4715 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6ff4c65c6c-97dbp_6d84057b-c735-4df6-a20d-ef88cccb44fe/barbican-worker/0.log" Oct 09 08:45:25 crc kubenswrapper[4715]: I1009 08:45:25.351614 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6ff4c65c6c-97dbp_6d84057b-c735-4df6-a20d-ef88cccb44fe/barbican-worker-log/0.log" Oct 09 08:45:25 crc kubenswrapper[4715]: I1009 08:45:25.565770 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-jphqg_21e35629-c64d-4ef6-a570-7603aa8358fb/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 08:45:25 crc kubenswrapper[4715]: I1009 08:45:25.593681 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7574fa02-4d40-4d5f-8d52-3118db1c2e05/ceilometer-central-agent/0.log" Oct 09 08:45:25 crc kubenswrapper[4715]: I1009 08:45:25.650066 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7574fa02-4d40-4d5f-8d52-3118db1c2e05/ceilometer-notification-agent/0.log" Oct 09 08:45:25 crc kubenswrapper[4715]: I1009 08:45:25.747132 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7574fa02-4d40-4d5f-8d52-3118db1c2e05/sg-core/0.log" Oct 09 08:45:25 crc kubenswrapper[4715]: I1009 08:45:25.758843 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7574fa02-4d40-4d5f-8d52-3118db1c2e05/proxy-httpd/0.log" Oct 09 08:45:25 crc kubenswrapper[4715]: I1009 08:45:25.859660 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_50651334-64c9-4214-9c7b-c10c4152d053/cinder-api/0.log" Oct 09 08:45:25 crc kubenswrapper[4715]: I1009 08:45:25.955884 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_50651334-64c9-4214-9c7b-c10c4152d053/cinder-api-log/0.log" Oct 09 08:45:26 crc kubenswrapper[4715]: I1009 
08:45:26.019751 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_48296f2f-dddc-4549-9f78-640128d54d46/cinder-scheduler/0.log" Oct 09 08:45:26 crc kubenswrapper[4715]: I1009 08:45:26.073305 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_48296f2f-dddc-4549-9f78-640128d54d46/probe/0.log" Oct 09 08:45:26 crc kubenswrapper[4715]: I1009 08:45:26.225316 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-j7tdx_69340195-a6f5-4e04-823d-9d61548a14b9/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 08:45:26 crc kubenswrapper[4715]: I1009 08:45:26.283928 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-5rwzv_4220a571-fb05-4098-901b-a00a6c79efe8/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 08:45:26 crc kubenswrapper[4715]: I1009 08:45:26.443234 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-6clgv_9d37f947-6e34-45b8-96a5-a18465d3f3fd/init/0.log" Oct 09 08:45:26 crc kubenswrapper[4715]: I1009 08:45:26.458508 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-jtk8t_6d2dbb06-3154-428b-991c-09b567c9136e/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 08:45:26 crc kubenswrapper[4715]: I1009 08:45:26.636680 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-6clgv_9d37f947-6e34-45b8-96a5-a18465d3f3fd/init/0.log" Oct 09 08:45:26 crc kubenswrapper[4715]: I1009 08:45:26.690450 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-6clgv_9d37f947-6e34-45b8-96a5-a18465d3f3fd/dnsmasq-dns/0.log" Oct 09 08:45:26 crc kubenswrapper[4715]: I1009 08:45:26.719379 4715 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-hngw7_a796e5b4-3af9-4286-8e0f-44f5a026dc47/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 08:45:26 crc kubenswrapper[4715]: I1009 08:45:26.888770 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_351123b4-0e0c-413a-bc50-56f397c1b592/glance-httpd/0.log" Oct 09 08:45:26 crc kubenswrapper[4715]: I1009 08:45:26.901905 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_351123b4-0e0c-413a-bc50-56f397c1b592/glance-log/0.log" Oct 09 08:45:27 crc kubenswrapper[4715]: I1009 08:45:27.049945 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_8be2de3b-f820-4674-992a-9bf1a1735d6b/glance-httpd/0.log" Oct 09 08:45:27 crc kubenswrapper[4715]: I1009 08:45:27.089582 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_8be2de3b-f820-4674-992a-9bf1a1735d6b/glance-log/0.log" Oct 09 08:45:27 crc kubenswrapper[4715]: I1009 08:45:27.218278 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-d788d6d48-5nczq_7b8b0665-2ab8-4fb9-93ff-6405324f24d5/horizon/0.log" Oct 09 08:45:27 crc kubenswrapper[4715]: I1009 08:45:27.387049 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz_34e4a21c-b0d6-448f-9fb9-42f65187fad8/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 08:45:27 crc kubenswrapper[4715]: I1009 08:45:27.538877 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-7ps4g_bf57a3ae-e445-4a20-9bc1-c5c8480f9158/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 08:45:27 crc kubenswrapper[4715]: I1009 08:45:27.555207 4715 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_horizon-d788d6d48-5nczq_7b8b0665-2ab8-4fb9-93ff-6405324f24d5/horizon-log/0.log" Oct 09 08:45:27 crc kubenswrapper[4715]: I1009 08:45:27.784208 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_f8c92457-ab26-4a48-b7e1-094eac8532c7/kube-state-metrics/0.log" Oct 09 08:45:27 crc kubenswrapper[4715]: I1009 08:45:27.847385 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-76cc5dfd5-dcg58_914f4753-0cbe-4496-b703-8dd106c06db2/keystone-api/0.log" Oct 09 08:45:27 crc kubenswrapper[4715]: I1009 08:45:27.999080 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-whz64_767ac586-f48e-410c-a5bb-589eccbef2c8/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 08:45:28 crc kubenswrapper[4715]: I1009 08:45:28.377836 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-85df7c4d7c-7ktz2_03dac8b3-a92c-49b7-94cd-f7ab774b7e65/neutron-api/0.log" Oct 09 08:45:28 crc kubenswrapper[4715]: I1009 08:45:28.408017 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-85df7c4d7c-7ktz2_03dac8b3-a92c-49b7-94cd-f7ab774b7e65/neutron-httpd/0.log" Oct 09 08:45:28 crc kubenswrapper[4715]: I1009 08:45:28.502385 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq_1b4c9589-4c0f-4f07-86d8-4573a0e80292/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 08:45:29 crc kubenswrapper[4715]: I1009 08:45:29.012450 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_b0f42f3a-98f5-442f-a169-3d7080e5fea3/nova-cell0-conductor-conductor/0.log" Oct 09 08:45:29 crc kubenswrapper[4715]: I1009 08:45:29.030314 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_0c807bba-12d5-4e33-894d-7f1ae6faa077/nova-api-log/0.log" Oct 
09 08:45:29 crc kubenswrapper[4715]: I1009 08:45:29.197764 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_0c807bba-12d5-4e33-894d-7f1ae6faa077/nova-api-api/0.log" Oct 09 08:45:29 crc kubenswrapper[4715]: I1009 08:45:29.332252 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_f2c8b523-67fd-40d3-9e2b-eb68619f60bc/nova-cell1-conductor-conductor/0.log" Oct 09 08:45:29 crc kubenswrapper[4715]: I1009 08:45:29.353453 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_84ca6d97-8374-4d3e-a5e7-e475bd7f89ce/nova-cell1-novncproxy-novncproxy/0.log" Oct 09 08:45:29 crc kubenswrapper[4715]: I1009 08:45:29.500659 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-sb8x7_7de82685-bfc0-41e5-81db-cadda0dc8d65/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 08:45:29 crc kubenswrapper[4715]: I1009 08:45:29.689729 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_76afd031-2e1d-412c-a21b-08e597e8eb83/nova-metadata-log/0.log" Oct 09 08:45:29 crc kubenswrapper[4715]: I1009 08:45:29.926217 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_e9fdd59d-3bb3-4f3f-85af-316ddc7de166/nova-scheduler-scheduler/0.log" Oct 09 08:45:29 crc kubenswrapper[4715]: I1009 08:45:29.983566 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_72f969cd-b504-4db1-832a-1e0c7f0a3b7b/mysql-bootstrap/0.log" Oct 09 08:45:30 crc kubenswrapper[4715]: I1009 08:45:30.128989 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_72f969cd-b504-4db1-832a-1e0c7f0a3b7b/mysql-bootstrap/0.log" Oct 09 08:45:30 crc kubenswrapper[4715]: I1009 08:45:30.160065 4715 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-cell1-galera-0_72f969cd-b504-4db1-832a-1e0c7f0a3b7b/galera/0.log" Oct 09 08:45:30 crc kubenswrapper[4715]: I1009 08:45:30.330825 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_895dedfd-5a74-43a4-81d1-6365aa67ed6a/mysql-bootstrap/0.log" Oct 09 08:45:30 crc kubenswrapper[4715]: I1009 08:45:30.463812 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_895dedfd-5a74-43a4-81d1-6365aa67ed6a/mysql-bootstrap/0.log" Oct 09 08:45:30 crc kubenswrapper[4715]: I1009 08:45:30.530094 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_895dedfd-5a74-43a4-81d1-6365aa67ed6a/galera/0.log" Oct 09 08:45:30 crc kubenswrapper[4715]: I1009 08:45:30.694744 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_ea37012b-c593-4cd0-8501-121c791b2741/openstackclient/0.log" Oct 09 08:45:30 crc kubenswrapper[4715]: I1009 08:45:30.778518 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-ptkjm_f16a1c8a-dd0e-4800-8b4f-8ae0dd86dbd2/openstack-network-exporter/0.log" Oct 09 08:45:30 crc kubenswrapper[4715]: I1009 08:45:30.815117 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_76afd031-2e1d-412c-a21b-08e597e8eb83/nova-metadata-metadata/0.log" Oct 09 08:45:30 crc kubenswrapper[4715]: I1009 08:45:30.971671 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gckmt_2db0914e-e011-4b76-a07d-57ce73faceaa/ovsdb-server-init/0.log" Oct 09 08:45:31 crc kubenswrapper[4715]: I1009 08:45:31.170757 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gckmt_2db0914e-e011-4b76-a07d-57ce73faceaa/ovs-vswitchd/0.log" Oct 09 08:45:31 crc kubenswrapper[4715]: I1009 08:45:31.183102 4715 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-gckmt_2db0914e-e011-4b76-a07d-57ce73faceaa/ovsdb-server-init/0.log" Oct 09 08:45:31 crc kubenswrapper[4715]: I1009 08:45:31.210363 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gckmt_2db0914e-e011-4b76-a07d-57ce73faceaa/ovsdb-server/0.log" Oct 09 08:45:31 crc kubenswrapper[4715]: I1009 08:45:31.407282 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-xfr2w_2a1f06ac-c4c8-4884-bb1a-360fbaf03adf/ovn-controller/0.log" Oct 09 08:45:31 crc kubenswrapper[4715]: I1009 08:45:31.515140 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-czrvv_a6388a30-80c6-412d-9c3d-9b555b215d76/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 08:45:31 crc kubenswrapper[4715]: I1009 08:45:31.622891 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_2d08550c-3489-4838-9aaf-49ecbcac005b/openstack-network-exporter/0.log" Oct 09 08:45:31 crc kubenswrapper[4715]: I1009 08:45:31.653449 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_2d08550c-3489-4838-9aaf-49ecbcac005b/ovn-northd/0.log" Oct 09 08:45:31 crc kubenswrapper[4715]: I1009 08:45:31.751740 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ec6c7ac4-3535-4d45-9be1-8f6b4de9670f/openstack-network-exporter/0.log" Oct 09 08:45:31 crc kubenswrapper[4715]: I1009 08:45:31.796556 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ec6c7ac4-3535-4d45-9be1-8f6b4de9670f/ovsdbserver-nb/0.log" Oct 09 08:45:31 crc kubenswrapper[4715]: I1009 08:45:31.997712 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_dc229de4-d184-450e-805b-a8b616c8a60b/ovsdbserver-sb/0.log" Oct 09 08:45:32 crc kubenswrapper[4715]: I1009 08:45:32.002239 4715 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_ovsdbserver-sb-0_dc229de4-d184-450e-805b-a8b616c8a60b/openstack-network-exporter/0.log" Oct 09 08:45:32 crc kubenswrapper[4715]: I1009 08:45:32.190819 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6985f6958d-wwgg5_45254b90-e09e-425e-b7c2-123813b82b37/placement-api/0.log" Oct 09 08:45:32 crc kubenswrapper[4715]: I1009 08:45:32.274950 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a2bc3ad0-34e4-4ccc-9abd-7e998940780c/setup-container/0.log" Oct 09 08:45:32 crc kubenswrapper[4715]: I1009 08:45:32.276767 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6985f6958d-wwgg5_45254b90-e09e-425e-b7c2-123813b82b37/placement-log/0.log" Oct 09 08:45:32 crc kubenswrapper[4715]: I1009 08:45:32.475676 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a2bc3ad0-34e4-4ccc-9abd-7e998940780c/rabbitmq/0.log" Oct 09 08:45:32 crc kubenswrapper[4715]: I1009 08:45:32.522011 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a2bc3ad0-34e4-4ccc-9abd-7e998940780c/setup-container/0.log" Oct 09 08:45:32 crc kubenswrapper[4715]: I1009 08:45:32.536239 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c87001f3-a098-449a-b8ec-cccb2a313d5f/setup-container/0.log" Oct 09 08:45:32 crc kubenswrapper[4715]: I1009 08:45:32.719648 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c87001f3-a098-449a-b8ec-cccb2a313d5f/setup-container/0.log" Oct 09 08:45:32 crc kubenswrapper[4715]: I1009 08:45:32.762732 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c87001f3-a098-449a-b8ec-cccb2a313d5f/rabbitmq/0.log" Oct 09 08:45:32 crc kubenswrapper[4715]: I1009 08:45:32.764444 4715 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-2r9hc_f0ea0eb1-5091-4178-8ae8-a39cf494915d/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 08:45:32 crc kubenswrapper[4715]: I1009 08:45:32.950941 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-bzxls_aa76af72-003a-4682-bbee-0ef470ecef9a/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 08:45:33 crc kubenswrapper[4715]: I1009 08:45:33.003566 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-t6ccc_1058ab3e-4f39-48ac-9f7e-81e40f041264/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 08:45:33 crc kubenswrapper[4715]: I1009 08:45:33.122166 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-qhl65_806359bf-f132-4db6-9795-4e180be1895a/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 08:45:33 crc kubenswrapper[4715]: I1009 08:45:33.252473 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-dpkqg_0986c120-0684-4916-b753-677c2d3e6798/ssh-known-hosts-edpm-deployment/0.log" Oct 09 08:45:33 crc kubenswrapper[4715]: I1009 08:45:33.491068 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-9d9d4b647-cfdjf_6f4b9cb5-f128-44e3-9142-2d39d79cb0b8/proxy-server/0.log" Oct 09 08:45:33 crc kubenswrapper[4715]: I1009 08:45:33.587140 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-9d9d4b647-cfdjf_6f4b9cb5-f128-44e3-9142-2d39d79cb0b8/proxy-httpd/0.log" Oct 09 08:45:33 crc kubenswrapper[4715]: I1009 08:45:33.619634 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-gjkhs_377b8455-bb97-4be8-977a-191578be267c/swift-ring-rebalance/0.log" Oct 09 08:45:33 crc kubenswrapper[4715]: I1009 08:45:33.706620 4715 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d6a5f2b-d77d-41c9-8b7d-e2e62c157577/account-auditor/0.log" Oct 09 08:45:33 crc kubenswrapper[4715]: I1009 08:45:33.795131 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d6a5f2b-d77d-41c9-8b7d-e2e62c157577/account-reaper/0.log" Oct 09 08:45:33 crc kubenswrapper[4715]: I1009 08:45:33.837174 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d6a5f2b-d77d-41c9-8b7d-e2e62c157577/account-replicator/0.log" Oct 09 08:45:33 crc kubenswrapper[4715]: I1009 08:45:33.961526 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d6a5f2b-d77d-41c9-8b7d-e2e62c157577/container-auditor/0.log" Oct 09 08:45:33 crc kubenswrapper[4715]: I1009 08:45:33.974316 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d6a5f2b-d77d-41c9-8b7d-e2e62c157577/account-server/0.log" Oct 09 08:45:34 crc kubenswrapper[4715]: I1009 08:45:34.027496 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d6a5f2b-d77d-41c9-8b7d-e2e62c157577/container-replicator/0.log" Oct 09 08:45:34 crc kubenswrapper[4715]: I1009 08:45:34.080744 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d6a5f2b-d77d-41c9-8b7d-e2e62c157577/container-server/0.log" Oct 09 08:45:34 crc kubenswrapper[4715]: I1009 08:45:34.159028 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d6a5f2b-d77d-41c9-8b7d-e2e62c157577/container-updater/0.log" Oct 09 08:45:34 crc kubenswrapper[4715]: I1009 08:45:34.178736 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d6a5f2b-d77d-41c9-8b7d-e2e62c157577/object-auditor/0.log" Oct 09 08:45:34 crc kubenswrapper[4715]: I1009 08:45:34.283781 4715 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_4d6a5f2b-d77d-41c9-8b7d-e2e62c157577/object-replicator/0.log" Oct 09 08:45:34 crc kubenswrapper[4715]: I1009 08:45:34.313159 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d6a5f2b-d77d-41c9-8b7d-e2e62c157577/object-expirer/0.log" Oct 09 08:45:34 crc kubenswrapper[4715]: I1009 08:45:34.326558 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d6a5f2b-d77d-41c9-8b7d-e2e62c157577/object-server/0.log" Oct 09 08:45:34 crc kubenswrapper[4715]: I1009 08:45:34.387997 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d6a5f2b-d77d-41c9-8b7d-e2e62c157577/object-updater/0.log" Oct 09 08:45:34 crc kubenswrapper[4715]: I1009 08:45:34.469795 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d6a5f2b-d77d-41c9-8b7d-e2e62c157577/rsync/0.log" Oct 09 08:45:34 crc kubenswrapper[4715]: I1009 08:45:34.500930 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d6a5f2b-d77d-41c9-8b7d-e2e62c157577/swift-recon-cron/0.log" Oct 09 08:45:34 crc kubenswrapper[4715]: I1009 08:45:34.675151 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4_943912d2-23f1-4cc8-92ab-42288a195416/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 08:45:34 crc kubenswrapper[4715]: I1009 08:45:34.799440 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_c272fa72-6434-4af1-8e2b-433cc9f619ea/tempest-tests-tempest-tests-runner/0.log" Oct 09 08:45:34 crc kubenswrapper[4715]: I1009 08:45:34.878829 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_e6af6cd1-b4d1-4521-a954-460e613c51e1/test-operator-logs-container/0.log" Oct 09 08:45:35 crc kubenswrapper[4715]: I1009 
08:45:35.159863 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-ddnss_3487ef30-efc9-46c5-8ed3-8146c9498ff0/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 08:45:46 crc kubenswrapper[4715]: I1009 08:45:46.753126 4715 patch_prober.go:28] interesting pod/machine-config-daemon-k7vwx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 08:45:46 crc kubenswrapper[4715]: I1009 08:45:46.753644 4715 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 08:45:47 crc kubenswrapper[4715]: I1009 08:45:47.046970 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_5d9f0338-4450-49ca-ad02-67cdda5d323f/memcached/0.log" Oct 09 08:45:57 crc kubenswrapper[4715]: I1009 08:45:57.691853 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl_3f089e5d-d22d-45bd-8525-ff337f7db321/util/0.log" Oct 09 08:45:57 crc kubenswrapper[4715]: I1009 08:45:57.833834 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl_3f089e5d-d22d-45bd-8525-ff337f7db321/pull/0.log" Oct 09 08:45:57 crc kubenswrapper[4715]: I1009 08:45:57.842296 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl_3f089e5d-d22d-45bd-8525-ff337f7db321/pull/0.log" Oct 09 08:45:57 crc 
kubenswrapper[4715]: I1009 08:45:57.845303 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl_3f089e5d-d22d-45bd-8525-ff337f7db321/util/0.log" Oct 09 08:45:57 crc kubenswrapper[4715]: I1009 08:45:57.987514 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl_3f089e5d-d22d-45bd-8525-ff337f7db321/pull/0.log" Oct 09 08:45:58 crc kubenswrapper[4715]: I1009 08:45:58.000309 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl_3f089e5d-d22d-45bd-8525-ff337f7db321/extract/0.log" Oct 09 08:45:58 crc kubenswrapper[4715]: I1009 08:45:58.005873 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl_3f089e5d-d22d-45bd-8525-ff337f7db321/util/0.log" Oct 09 08:45:58 crc kubenswrapper[4715]: I1009 08:45:58.172658 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f84fcdbb-rsthg_e4603d13-cf9d-4d8d-82db-3b182aa42e74/kube-rbac-proxy/0.log" Oct 09 08:45:58 crc kubenswrapper[4715]: I1009 08:45:58.214870 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-59cdc64769-gqdw4_68110204-494d-4a10-b25d-0996c9dd1c6f/kube-rbac-proxy/0.log" Oct 09 08:45:58 crc kubenswrapper[4715]: I1009 08:45:58.224829 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f84fcdbb-rsthg_e4603d13-cf9d-4d8d-82db-3b182aa42e74/manager/0.log" Oct 09 08:45:58 crc kubenswrapper[4715]: I1009 08:45:58.375542 4715 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-59cdc64769-gqdw4_68110204-494d-4a10-b25d-0996c9dd1c6f/manager/0.log" Oct 09 08:45:58 crc kubenswrapper[4715]: I1009 08:45:58.397091 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-687df44cdb-v8zt5_675a8b37-dcfc-414e-9218-7741ce9ec2d5/kube-rbac-proxy/0.log" Oct 09 08:45:58 crc kubenswrapper[4715]: I1009 08:45:58.415732 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-687df44cdb-v8zt5_675a8b37-dcfc-414e-9218-7741ce9ec2d5/manager/0.log" Oct 09 08:45:58 crc kubenswrapper[4715]: I1009 08:45:58.600686 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7bb46cd7d-cfkg2_e11fc796-233e-4c17-b953-1c6211f0c679/kube-rbac-proxy/0.log" Oct 09 08:45:58 crc kubenswrapper[4715]: I1009 08:45:58.625627 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7bb46cd7d-cfkg2_e11fc796-233e-4c17-b953-1c6211f0c679/manager/0.log" Oct 09 08:45:58 crc kubenswrapper[4715]: I1009 08:45:58.735823 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-6d9967f8dd-qjwrk_32b6325f-e041-492d-a113-638dcef15310/kube-rbac-proxy/0.log" Oct 09 08:45:58 crc kubenswrapper[4715]: I1009 08:45:58.757708 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-6d9967f8dd-qjwrk_32b6325f-e041-492d-a113-638dcef15310/manager/0.log" Oct 09 08:45:58 crc kubenswrapper[4715]: I1009 08:45:58.823068 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d74794d9b-jps2w_2730bf5c-42b9-4739-a2bc-6250bfcb997a/kube-rbac-proxy/0.log" Oct 09 08:45:58 crc kubenswrapper[4715]: I1009 08:45:58.900087 
4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d74794d9b-jps2w_2730bf5c-42b9-4739-a2bc-6250bfcb997a/manager/0.log" Oct 09 08:45:58 crc kubenswrapper[4715]: I1009 08:45:58.991352 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-585fc5b659-ckx8h_b39f3e52-f97a-4bf4-934d-88267bddae91/kube-rbac-proxy/0.log" Oct 09 08:45:59 crc kubenswrapper[4715]: I1009 08:45:59.167024 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-585fc5b659-ckx8h_b39f3e52-f97a-4bf4-934d-88267bddae91/manager/0.log" Oct 09 08:45:59 crc kubenswrapper[4715]: I1009 08:45:59.195684 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-74cb5cbc49-pxmc4_c990a4aa-4a8e-499b-bf58-99c469af523e/kube-rbac-proxy/0.log" Oct 09 08:45:59 crc kubenswrapper[4715]: I1009 08:45:59.203819 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-74cb5cbc49-pxmc4_c990a4aa-4a8e-499b-bf58-99c469af523e/manager/0.log" Oct 09 08:45:59 crc kubenswrapper[4715]: I1009 08:45:59.357356 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-ddb98f99b-7zmwl_9657b932-fe63-4417-8463-8af21e9c9790/kube-rbac-proxy/0.log" Oct 09 08:45:59 crc kubenswrapper[4715]: I1009 08:45:59.433655 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-ddb98f99b-7zmwl_9657b932-fe63-4417-8463-8af21e9c9790/manager/0.log" Oct 09 08:45:59 crc kubenswrapper[4715]: I1009 08:45:59.534365 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-59578bc799-6zczp_9cae911a-5b69-4cf4-aa26-4adb4457eec4/kube-rbac-proxy/0.log" Oct 09 08:45:59 crc 
kubenswrapper[4715]: I1009 08:45:59.590330 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5777b4f897-ggwkb_ad178d55-a5d5-40b5-9364-0a9af0718f46/kube-rbac-proxy/0.log" Oct 09 08:45:59 crc kubenswrapper[4715]: I1009 08:45:59.592548 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-59578bc799-6zczp_9cae911a-5b69-4cf4-aa26-4adb4457eec4/manager/0.log" Oct 09 08:45:59 crc kubenswrapper[4715]: I1009 08:45:59.726060 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5777b4f897-ggwkb_ad178d55-a5d5-40b5-9364-0a9af0718f46/manager/0.log" Oct 09 08:45:59 crc kubenswrapper[4715]: I1009 08:45:59.781292 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-797d478b46-kqhg2_ecf88dec-957f-4221-8ded-d779392c2793/kube-rbac-proxy/0.log" Oct 09 08:45:59 crc kubenswrapper[4715]: I1009 08:45:59.827079 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-797d478b46-kqhg2_ecf88dec-957f-4221-8ded-d779392c2793/manager/0.log" Oct 09 08:45:59 crc kubenswrapper[4715]: I1009 08:45:59.967976 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-57bb74c7bf-6xd27_6d11d372-6981-432f-a2b0-364cb9b24f63/kube-rbac-proxy/0.log" Oct 09 08:46:00 crc kubenswrapper[4715]: I1009 08:46:00.046068 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-57bb74c7bf-6xd27_6d11d372-6981-432f-a2b0-364cb9b24f63/manager/0.log" Oct 09 08:46:00 crc kubenswrapper[4715]: I1009 08:46:00.098736 4715 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6d7c7ddf95-xmq4r_619ad411-d5d7-431b-9bb6-6cf084134aaf/kube-rbac-proxy/0.log" Oct 09 08:46:00 crc kubenswrapper[4715]: I1009 08:46:00.177038 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6d7c7ddf95-xmq4r_619ad411-d5d7-431b-9bb6-6cf084134aaf/manager/0.log" Oct 09 08:46:00 crc kubenswrapper[4715]: I1009 08:46:00.236567 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cc7fb757dl2tll_e9335457-1cad-453a-9539-d73dc2c77021/kube-rbac-proxy/0.log" Oct 09 08:46:00 crc kubenswrapper[4715]: I1009 08:46:00.316787 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cc7fb757dl2tll_e9335457-1cad-453a-9539-d73dc2c77021/manager/0.log" Oct 09 08:46:00 crc kubenswrapper[4715]: I1009 08:46:00.440505 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5b6844c9b7-z7vpp_fd59cd6f-8b57-4377-80ae-a1873494f103/kube-rbac-proxy/0.log" Oct 09 08:46:00 crc kubenswrapper[4715]: I1009 08:46:00.641380 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-75c7986888-vmsr4_2f70ba87-a4dd-4a97-a005-f63fec497e9f/kube-rbac-proxy/0.log" Oct 09 08:46:00 crc kubenswrapper[4715]: I1009 08:46:00.883806 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-fgdkg_9809a400-b7e4-4700-bfb6-3500d2f61c96/registry-server/0.log" Oct 09 08:46:00 crc kubenswrapper[4715]: I1009 08:46:00.919151 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-75c7986888-vmsr4_2f70ba87-a4dd-4a97-a005-f63fec497e9f/operator/0.log" Oct 09 08:46:01 crc kubenswrapper[4715]: 
I1009 08:46:01.111190 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6f96f8c84-5plxn_6d1ea812-36f3-4478-9b78-aed194390313/kube-rbac-proxy/0.log" Oct 09 08:46:01 crc kubenswrapper[4715]: I1009 08:46:01.180102 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6f96f8c84-5plxn_6d1ea812-36f3-4478-9b78-aed194390313/manager/0.log" Oct 09 08:46:01 crc kubenswrapper[4715]: I1009 08:46:01.308364 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-664664cb68-xwt7q_fc37f3a9-94a5-4957-939a-a0b0a7a567bb/kube-rbac-proxy/0.log" Oct 09 08:46:01 crc kubenswrapper[4715]: I1009 08:46:01.343467 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-664664cb68-xwt7q_fc37f3a9-94a5-4957-939a-a0b0a7a567bb/manager/0.log" Oct 09 08:46:01 crc kubenswrapper[4715]: I1009 08:46:01.508285 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-vqb9s_48619024-da5f-4b28-8724-3707961de8ce/operator/0.log" Oct 09 08:46:01 crc kubenswrapper[4715]: I1009 08:46:01.582074 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f4d5dfdc6-rcm5h_4b8010cb-d8af-4b7c-9530-fe143bbf1ddb/kube-rbac-proxy/0.log" Oct 09 08:46:01 crc kubenswrapper[4715]: I1009 08:46:01.626260 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f4d5dfdc6-rcm5h_4b8010cb-d8af-4b7c-9530-fe143bbf1ddb/manager/0.log" Oct 09 08:46:01 crc kubenswrapper[4715]: I1009 08:46:01.677920 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5b6844c9b7-z7vpp_fd59cd6f-8b57-4377-80ae-a1873494f103/manager/0.log" Oct 09 
08:46:01 crc kubenswrapper[4715]: I1009 08:46:01.806000 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6648b66598-cvsqm_9b402da9-cbb2-473b-beee-7064e06acb73/kube-rbac-proxy/0.log" Oct 09 08:46:01 crc kubenswrapper[4715]: I1009 08:46:01.832544 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6648b66598-cvsqm_9b402da9-cbb2-473b-beee-7064e06acb73/manager/0.log" Oct 09 08:46:01 crc kubenswrapper[4715]: I1009 08:46:01.886435 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-ffcdd6c94-rt6nt_ba259ec1-9157-4cd9-8c21-11915efe5dde/kube-rbac-proxy/0.log" Oct 09 08:46:01 crc kubenswrapper[4715]: I1009 08:46:01.933561 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-ffcdd6c94-rt6nt_ba259ec1-9157-4cd9-8c21-11915efe5dde/manager/0.log" Oct 09 08:46:02 crc kubenswrapper[4715]: I1009 08:46:02.007378 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-646675d848-bgwc8_666e7073-bf77-46f3-99da-5ad2013835a9/kube-rbac-proxy/0.log" Oct 09 08:46:02 crc kubenswrapper[4715]: I1009 08:46:02.031410 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-646675d848-bgwc8_666e7073-bf77-46f3-99da-5ad2013835a9/manager/0.log" Oct 09 08:46:15 crc kubenswrapper[4715]: I1009 08:46:15.851983 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-prhvm_ad01aea7-211a-4ff5-b15b-fb696917dc52/control-plane-machine-set-operator/0.log" Oct 09 08:46:16 crc kubenswrapper[4715]: I1009 08:46:16.011899 4715 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-5tfh2_93259cc2-6847-41dc-a61d-83e7b9e67f3a/kube-rbac-proxy/0.log" Oct 09 08:46:16 crc kubenswrapper[4715]: I1009 08:46:16.022553 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-5tfh2_93259cc2-6847-41dc-a61d-83e7b9e67f3a/machine-api-operator/0.log" Oct 09 08:46:16 crc kubenswrapper[4715]: I1009 08:46:16.753670 4715 patch_prober.go:28] interesting pod/machine-config-daemon-k7vwx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 08:46:16 crc kubenswrapper[4715]: I1009 08:46:16.753724 4715 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 08:46:16 crc kubenswrapper[4715]: I1009 08:46:16.753766 4715 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" Oct 09 08:46:16 crc kubenswrapper[4715]: I1009 08:46:16.754497 4715 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0987e6a342e02824b43b9834c2fc79356564b0a0d9274ff128b433f6ae2be4e5"} pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 08:46:16 crc kubenswrapper[4715]: I1009 08:46:16.754550 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" 
podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" containerID="cri-o://0987e6a342e02824b43b9834c2fc79356564b0a0d9274ff128b433f6ae2be4e5" gracePeriod=600 Oct 09 08:46:17 crc kubenswrapper[4715]: I1009 08:46:17.896563 4715 generic.go:334] "Generic (PLEG): container finished" podID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerID="0987e6a342e02824b43b9834c2fc79356564b0a0d9274ff128b433f6ae2be4e5" exitCode=0 Oct 09 08:46:17 crc kubenswrapper[4715]: I1009 08:46:17.896626 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" event={"ID":"acafd807-8875-4b4f-aba9-4f807ca336e7","Type":"ContainerDied","Data":"0987e6a342e02824b43b9834c2fc79356564b0a0d9274ff128b433f6ae2be4e5"} Oct 09 08:46:17 crc kubenswrapper[4715]: I1009 08:46:17.897104 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" event={"ID":"acafd807-8875-4b4f-aba9-4f807ca336e7","Type":"ContainerStarted","Data":"3cecc05ede7162882b376373801e9d44deb7deec541f11babf1e9925da0c5cf7"} Oct 09 08:46:17 crc kubenswrapper[4715]: I1009 08:46:17.897126 4715 scope.go:117] "RemoveContainer" containerID="e60ed8f42d1d41a28d869e553800a49419e0aee6f8579a58bcefd5162cfb674c" Oct 09 08:46:27 crc kubenswrapper[4715]: I1009 08:46:27.584566 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-vxwbp_b46c8c22-ef65-4617-90a9-bcef0954a010/cert-manager-controller/0.log" Oct 09 08:46:27 crc kubenswrapper[4715]: I1009 08:46:27.750046 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-4sgtl_9c7c75d0-8444-4edb-b653-5bc079b11d51/cert-manager-cainjector/0.log" Oct 09 08:46:27 crc kubenswrapper[4715]: I1009 08:46:27.752910 4715 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-27vx6_acf4388b-06e6-4576-ae4d-b67ccda0c1ac/cert-manager-webhook/0.log" Oct 09 08:46:38 crc kubenswrapper[4715]: I1009 08:46:38.852804 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-4pcfg_02f7ed0f-b6c8-412c-b89a-a0a42d82a72d/nmstate-console-plugin/0.log" Oct 09 08:46:39 crc kubenswrapper[4715]: I1009 08:46:39.028329 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-cvrwm_264aef9d-e55d-41e9-b2e4-055db900d371/kube-rbac-proxy/0.log" Oct 09 08:46:39 crc kubenswrapper[4715]: I1009 08:46:39.037762 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-x6h8j_4d966a8c-9962-4542-9e72-fbd4959508e6/nmstate-handler/0.log" Oct 09 08:46:39 crc kubenswrapper[4715]: I1009 08:46:39.040943 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-cvrwm_264aef9d-e55d-41e9-b2e4-055db900d371/nmstate-metrics/0.log" Oct 09 08:46:39 crc kubenswrapper[4715]: I1009 08:46:39.266820 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-g94hr_c66a28bc-b6c9-426c-bc4d-b8748836b175/nmstate-operator/0.log" Oct 09 08:46:39 crc kubenswrapper[4715]: I1009 08:46:39.277041 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-bk7lw_f8f2e4a6-72f9-4eb0-9654-e07d0e3b3bf6/nmstate-webhook/0.log" Oct 09 08:46:52 crc kubenswrapper[4715]: I1009 08:46:52.979155 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-plvwr_dc22841d-c047-4e47-a235-9025efe5d30e/kube-rbac-proxy/0.log" Oct 09 08:46:53 crc kubenswrapper[4715]: I1009 08:46:53.063468 4715 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-68d546b9d8-plvwr_dc22841d-c047-4e47-a235-9025efe5d30e/controller/0.log" Oct 09 08:46:53 crc kubenswrapper[4715]: I1009 08:46:53.222433 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vznzc_4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7/cp-frr-files/0.log" Oct 09 08:46:53 crc kubenswrapper[4715]: I1009 08:46:53.395175 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vznzc_4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7/cp-frr-files/0.log" Oct 09 08:46:53 crc kubenswrapper[4715]: I1009 08:46:53.446069 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vznzc_4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7/cp-metrics/0.log" Oct 09 08:46:53 crc kubenswrapper[4715]: I1009 08:46:53.472937 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vznzc_4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7/cp-reloader/0.log" Oct 09 08:46:53 crc kubenswrapper[4715]: I1009 08:46:53.486168 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vznzc_4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7/cp-reloader/0.log" Oct 09 08:46:53 crc kubenswrapper[4715]: I1009 08:46:53.635038 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vznzc_4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7/cp-frr-files/0.log" Oct 09 08:46:53 crc kubenswrapper[4715]: I1009 08:46:53.646599 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vznzc_4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7/cp-reloader/0.log" Oct 09 08:46:53 crc kubenswrapper[4715]: I1009 08:46:53.696662 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vznzc_4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7/cp-metrics/0.log" Oct 09 08:46:53 crc kubenswrapper[4715]: I1009 08:46:53.702291 4715 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-vznzc_4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7/cp-metrics/0.log" Oct 09 08:46:53 crc kubenswrapper[4715]: I1009 08:46:53.876855 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vznzc_4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7/cp-metrics/0.log" Oct 09 08:46:53 crc kubenswrapper[4715]: I1009 08:46:53.881814 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vznzc_4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7/cp-frr-files/0.log" Oct 09 08:46:53 crc kubenswrapper[4715]: I1009 08:46:53.952812 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vznzc_4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7/cp-reloader/0.log" Oct 09 08:46:53 crc kubenswrapper[4715]: I1009 08:46:53.954898 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vznzc_4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7/controller/0.log" Oct 09 08:46:54 crc kubenswrapper[4715]: I1009 08:46:54.091938 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vznzc_4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7/frr-metrics/0.log" Oct 09 08:46:54 crc kubenswrapper[4715]: I1009 08:46:54.139360 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vznzc_4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7/kube-rbac-proxy-frr/0.log" Oct 09 08:46:54 crc kubenswrapper[4715]: I1009 08:46:54.165111 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vznzc_4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7/kube-rbac-proxy/0.log" Oct 09 08:46:54 crc kubenswrapper[4715]: I1009 08:46:54.369681 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vznzc_4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7/reloader/0.log" Oct 09 08:46:54 crc kubenswrapper[4715]: I1009 08:46:54.400646 4715 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-fs56x_01a59281-8feb-446a-b861-fba9e4e8df7d/frr-k8s-webhook-server/0.log" Oct 09 08:46:54 crc kubenswrapper[4715]: I1009 08:46:54.732547 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-798678874c-cjtmr_6401cdb7-5b3a-4ae5-8944-fc923060aa09/manager/0.log" Oct 09 08:46:55 crc kubenswrapper[4715]: I1009 08:46:55.016645 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-865475978c-t8h2f_2b80db73-440c-49ef-8b24-187a67aab5db/webhook-server/0.log" Oct 09 08:46:55 crc kubenswrapper[4715]: I1009 08:46:55.019051 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-js45j_a5b391f0-e6c7-412a-8333-530a9ad5bab3/kube-rbac-proxy/0.log" Oct 09 08:46:55 crc kubenswrapper[4715]: I1009 08:46:55.198534 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vznzc_4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7/frr/0.log" Oct 09 08:46:55 crc kubenswrapper[4715]: I1009 08:46:55.656349 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-js45j_a5b391f0-e6c7-412a-8333-530a9ad5bab3/speaker/0.log" Oct 09 08:47:07 crc kubenswrapper[4715]: I1009 08:47:07.685117 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr_f0cfb14d-8aa5-4841-9f91-dd632d372e18/util/0.log" Oct 09 08:47:07 crc kubenswrapper[4715]: I1009 08:47:07.865917 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr_f0cfb14d-8aa5-4841-9f91-dd632d372e18/pull/0.log" Oct 09 08:47:07 crc kubenswrapper[4715]: I1009 08:47:07.893958 4715 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr_f0cfb14d-8aa5-4841-9f91-dd632d372e18/pull/0.log" Oct 09 08:47:07 crc kubenswrapper[4715]: I1009 08:47:07.925237 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr_f0cfb14d-8aa5-4841-9f91-dd632d372e18/util/0.log" Oct 09 08:47:08 crc kubenswrapper[4715]: I1009 08:47:08.043841 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr_f0cfb14d-8aa5-4841-9f91-dd632d372e18/util/0.log" Oct 09 08:47:08 crc kubenswrapper[4715]: I1009 08:47:08.058361 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr_f0cfb14d-8aa5-4841-9f91-dd632d372e18/extract/0.log" Oct 09 08:47:08 crc kubenswrapper[4715]: I1009 08:47:08.096285 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr_f0cfb14d-8aa5-4841-9f91-dd632d372e18/pull/0.log" Oct 09 08:47:08 crc kubenswrapper[4715]: I1009 08:47:08.200839 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-chtln_af6becdc-9c8d-49f2-ae9c-a58edc359d1f/extract-utilities/0.log" Oct 09 08:47:08 crc kubenswrapper[4715]: I1009 08:47:08.351717 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-chtln_af6becdc-9c8d-49f2-ae9c-a58edc359d1f/extract-utilities/0.log" Oct 09 08:47:08 crc kubenswrapper[4715]: I1009 08:47:08.388202 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-chtln_af6becdc-9c8d-49f2-ae9c-a58edc359d1f/extract-content/0.log" Oct 09 08:47:08 crc kubenswrapper[4715]: I1009 08:47:08.418966 4715 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-chtln_af6becdc-9c8d-49f2-ae9c-a58edc359d1f/extract-content/0.log" Oct 09 08:47:08 crc kubenswrapper[4715]: I1009 08:47:08.593973 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-chtln_af6becdc-9c8d-49f2-ae9c-a58edc359d1f/extract-utilities/0.log" Oct 09 08:47:08 crc kubenswrapper[4715]: I1009 08:47:08.658382 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-chtln_af6becdc-9c8d-49f2-ae9c-a58edc359d1f/extract-content/0.log" Oct 09 08:47:08 crc kubenswrapper[4715]: I1009 08:47:08.790249 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-chtln_af6becdc-9c8d-49f2-ae9c-a58edc359d1f/registry-server/0.log" Oct 09 08:47:08 crc kubenswrapper[4715]: I1009 08:47:08.837209 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6j97m_1c8e4f51-54cf-4545-8a89-8ccaf52c55fc/extract-utilities/0.log" Oct 09 08:47:08 crc kubenswrapper[4715]: I1009 08:47:08.967423 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6j97m_1c8e4f51-54cf-4545-8a89-8ccaf52c55fc/extract-content/0.log" Oct 09 08:47:08 crc kubenswrapper[4715]: I1009 08:47:08.967741 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6j97m_1c8e4f51-54cf-4545-8a89-8ccaf52c55fc/extract-content/0.log" Oct 09 08:47:08 crc kubenswrapper[4715]: I1009 08:47:08.981297 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6j97m_1c8e4f51-54cf-4545-8a89-8ccaf52c55fc/extract-utilities/0.log" Oct 09 08:47:09 crc kubenswrapper[4715]: I1009 08:47:09.162116 4715 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-6j97m_1c8e4f51-54cf-4545-8a89-8ccaf52c55fc/extract-content/0.log" Oct 09 08:47:09 crc kubenswrapper[4715]: I1009 08:47:09.170970 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6j97m_1c8e4f51-54cf-4545-8a89-8ccaf52c55fc/extract-utilities/0.log" Oct 09 08:47:09 crc kubenswrapper[4715]: I1009 08:47:09.375036 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf_44e9a431-3bec-4439-9df7-a7f12d65dad2/util/0.log" Oct 09 08:47:09 crc kubenswrapper[4715]: I1009 08:47:09.594995 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf_44e9a431-3bec-4439-9df7-a7f12d65dad2/util/0.log" Oct 09 08:47:09 crc kubenswrapper[4715]: I1009 08:47:09.603376 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf_44e9a431-3bec-4439-9df7-a7f12d65dad2/pull/0.log" Oct 09 08:47:09 crc kubenswrapper[4715]: I1009 08:47:09.628492 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf_44e9a431-3bec-4439-9df7-a7f12d65dad2/pull/0.log" Oct 09 08:47:09 crc kubenswrapper[4715]: I1009 08:47:09.839355 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6j97m_1c8e4f51-54cf-4545-8a89-8ccaf52c55fc/registry-server/0.log" Oct 09 08:47:09 crc kubenswrapper[4715]: I1009 08:47:09.859820 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf_44e9a431-3bec-4439-9df7-a7f12d65dad2/pull/0.log" Oct 09 08:47:09 crc kubenswrapper[4715]: I1009 08:47:09.863502 4715 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf_44e9a431-3bec-4439-9df7-a7f12d65dad2/util/0.log" Oct 09 08:47:09 crc kubenswrapper[4715]: I1009 08:47:09.868648 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf_44e9a431-3bec-4439-9df7-a7f12d65dad2/extract/0.log" Oct 09 08:47:10 crc kubenswrapper[4715]: I1009 08:47:10.042644 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-rpvsg_06c9829f-1dca-4ef6-a34f-a5380dfd729c/marketplace-operator/0.log" Oct 09 08:47:10 crc kubenswrapper[4715]: I1009 08:47:10.055838 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9vpkg_26a05949-7b09-4412-a6ae-004009c0c4bf/extract-utilities/0.log" Oct 09 08:47:10 crc kubenswrapper[4715]: I1009 08:47:10.219464 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9vpkg_26a05949-7b09-4412-a6ae-004009c0c4bf/extract-content/0.log" Oct 09 08:47:10 crc kubenswrapper[4715]: I1009 08:47:10.245517 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9vpkg_26a05949-7b09-4412-a6ae-004009c0c4bf/extract-utilities/0.log" Oct 09 08:47:10 crc kubenswrapper[4715]: I1009 08:47:10.251626 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9vpkg_26a05949-7b09-4412-a6ae-004009c0c4bf/extract-content/0.log" Oct 09 08:47:10 crc kubenswrapper[4715]: I1009 08:47:10.375749 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9vpkg_26a05949-7b09-4412-a6ae-004009c0c4bf/extract-utilities/0.log" Oct 09 08:47:10 crc kubenswrapper[4715]: I1009 08:47:10.400298 4715 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-9vpkg_26a05949-7b09-4412-a6ae-004009c0c4bf/extract-content/0.log" Oct 09 08:47:10 crc kubenswrapper[4715]: I1009 08:47:10.546639 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9vpkg_26a05949-7b09-4412-a6ae-004009c0c4bf/registry-server/0.log" Oct 09 08:47:10 crc kubenswrapper[4715]: I1009 08:47:10.593980 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g5kq2_efd04f5e-635d-422b-ae2a-38096e0ecc44/extract-utilities/0.log" Oct 09 08:47:10 crc kubenswrapper[4715]: I1009 08:47:10.775804 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g5kq2_efd04f5e-635d-422b-ae2a-38096e0ecc44/extract-utilities/0.log" Oct 09 08:47:10 crc kubenswrapper[4715]: I1009 08:47:10.811068 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g5kq2_efd04f5e-635d-422b-ae2a-38096e0ecc44/extract-content/0.log" Oct 09 08:47:10 crc kubenswrapper[4715]: I1009 08:47:10.812409 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g5kq2_efd04f5e-635d-422b-ae2a-38096e0ecc44/extract-content/0.log" Oct 09 08:47:10 crc kubenswrapper[4715]: I1009 08:47:10.995993 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g5kq2_efd04f5e-635d-422b-ae2a-38096e0ecc44/extract-utilities/0.log" Oct 09 08:47:11 crc kubenswrapper[4715]: I1009 08:47:11.001031 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g5kq2_efd04f5e-635d-422b-ae2a-38096e0ecc44/extract-content/0.log" Oct 09 08:47:11 crc kubenswrapper[4715]: I1009 08:47:11.503986 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g5kq2_efd04f5e-635d-422b-ae2a-38096e0ecc44/registry-server/0.log" Oct 09 
08:47:28 crc kubenswrapper[4715]: E1009 08:47:28.547022 4715 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.158:46392->38.102.83.158:33265: write tcp 38.102.83.158:46392->38.102.83.158:33265: write: broken pipe Oct 09 08:48:46 crc kubenswrapper[4715]: I1009 08:48:46.753150 4715 patch_prober.go:28] interesting pod/machine-config-daemon-k7vwx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 08:48:46 crc kubenswrapper[4715]: I1009 08:48:46.754090 4715 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 08:48:49 crc kubenswrapper[4715]: I1009 08:48:49.389909 4715 generic.go:334] "Generic (PLEG): container finished" podID="8e4bcd81-50d9-4f8c-90e6-fa6cba352454" containerID="8af00bf2a950f5533dc93fd0cd37d6f99600af8b94e7acc36dd85da6c2984268" exitCode=0 Oct 09 08:48:49 crc kubenswrapper[4715]: I1009 08:48:49.390237 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cp26h/must-gather-hqxds" event={"ID":"8e4bcd81-50d9-4f8c-90e6-fa6cba352454","Type":"ContainerDied","Data":"8af00bf2a950f5533dc93fd0cd37d6f99600af8b94e7acc36dd85da6c2984268"} Oct 09 08:48:49 crc kubenswrapper[4715]: I1009 08:48:49.391164 4715 scope.go:117] "RemoveContainer" containerID="8af00bf2a950f5533dc93fd0cd37d6f99600af8b94e7acc36dd85da6c2984268" Oct 09 08:48:49 crc kubenswrapper[4715]: I1009 08:48:49.476757 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-cp26h_must-gather-hqxds_8e4bcd81-50d9-4f8c-90e6-fa6cba352454/gather/0.log" Oct 09 08:48:57 crc 
kubenswrapper[4715]: I1009 08:48:57.106447 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-cp26h/must-gather-hqxds"] Oct 09 08:48:57 crc kubenswrapper[4715]: I1009 08:48:57.108159 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-cp26h/must-gather-hqxds" podUID="8e4bcd81-50d9-4f8c-90e6-fa6cba352454" containerName="copy" containerID="cri-o://b1eddd0e29e41dfafdb41e18d0ef3e133ed6f18795dc0edaafeb72b978305573" gracePeriod=2 Oct 09 08:48:57 crc kubenswrapper[4715]: I1009 08:48:57.113088 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-cp26h/must-gather-hqxds"] Oct 09 08:48:57 crc kubenswrapper[4715]: I1009 08:48:57.489151 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-cp26h_must-gather-hqxds_8e4bcd81-50d9-4f8c-90e6-fa6cba352454/copy/0.log" Oct 09 08:48:57 crc kubenswrapper[4715]: I1009 08:48:57.493138 4715 generic.go:334] "Generic (PLEG): container finished" podID="8e4bcd81-50d9-4f8c-90e6-fa6cba352454" containerID="b1eddd0e29e41dfafdb41e18d0ef3e133ed6f18795dc0edaafeb72b978305573" exitCode=143 Oct 09 08:48:58 crc kubenswrapper[4715]: I1009 08:48:58.097206 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-cp26h_must-gather-hqxds_8e4bcd81-50d9-4f8c-90e6-fa6cba352454/copy/0.log" Oct 09 08:48:58 crc kubenswrapper[4715]: I1009 08:48:58.097946 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cp26h/must-gather-hqxds" Oct 09 08:48:58 crc kubenswrapper[4715]: I1009 08:48:58.269575 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5tl7\" (UniqueName: \"kubernetes.io/projected/8e4bcd81-50d9-4f8c-90e6-fa6cba352454-kube-api-access-c5tl7\") pod \"8e4bcd81-50d9-4f8c-90e6-fa6cba352454\" (UID: \"8e4bcd81-50d9-4f8c-90e6-fa6cba352454\") " Oct 09 08:48:58 crc kubenswrapper[4715]: I1009 08:48:58.269657 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8e4bcd81-50d9-4f8c-90e6-fa6cba352454-must-gather-output\") pod \"8e4bcd81-50d9-4f8c-90e6-fa6cba352454\" (UID: \"8e4bcd81-50d9-4f8c-90e6-fa6cba352454\") " Oct 09 08:48:58 crc kubenswrapper[4715]: I1009 08:48:58.295080 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e4bcd81-50d9-4f8c-90e6-fa6cba352454-kube-api-access-c5tl7" (OuterVolumeSpecName: "kube-api-access-c5tl7") pod "8e4bcd81-50d9-4f8c-90e6-fa6cba352454" (UID: "8e4bcd81-50d9-4f8c-90e6-fa6cba352454"). InnerVolumeSpecName "kube-api-access-c5tl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:48:58 crc kubenswrapper[4715]: I1009 08:48:58.372268 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5tl7\" (UniqueName: \"kubernetes.io/projected/8e4bcd81-50d9-4f8c-90e6-fa6cba352454-kube-api-access-c5tl7\") on node \"crc\" DevicePath \"\"" Oct 09 08:48:58 crc kubenswrapper[4715]: I1009 08:48:58.416530 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e4bcd81-50d9-4f8c-90e6-fa6cba352454-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "8e4bcd81-50d9-4f8c-90e6-fa6cba352454" (UID: "8e4bcd81-50d9-4f8c-90e6-fa6cba352454"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:48:58 crc kubenswrapper[4715]: I1009 08:48:58.474147 4715 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8e4bcd81-50d9-4f8c-90e6-fa6cba352454-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 09 08:48:58 crc kubenswrapper[4715]: I1009 08:48:58.503465 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-cp26h_must-gather-hqxds_8e4bcd81-50d9-4f8c-90e6-fa6cba352454/copy/0.log" Oct 09 08:48:58 crc kubenswrapper[4715]: I1009 08:48:58.503924 4715 scope.go:117] "RemoveContainer" containerID="b1eddd0e29e41dfafdb41e18d0ef3e133ed6f18795dc0edaafeb72b978305573" Oct 09 08:48:58 crc kubenswrapper[4715]: I1009 08:48:58.503971 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cp26h/must-gather-hqxds" Oct 09 08:48:58 crc kubenswrapper[4715]: I1009 08:48:58.536644 4715 scope.go:117] "RemoveContainer" containerID="8af00bf2a950f5533dc93fd0cd37d6f99600af8b94e7acc36dd85da6c2984268" Oct 09 08:49:00 crc kubenswrapper[4715]: I1009 08:49:00.152340 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e4bcd81-50d9-4f8c-90e6-fa6cba352454" path="/var/lib/kubelet/pods/8e4bcd81-50d9-4f8c-90e6-fa6cba352454/volumes" Oct 09 08:49:16 crc kubenswrapper[4715]: I1009 08:49:16.754074 4715 patch_prober.go:28] interesting pod/machine-config-daemon-k7vwx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 08:49:16 crc kubenswrapper[4715]: I1009 08:49:16.754905 4715 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 08:49:37 crc kubenswrapper[4715]: I1009 08:49:37.228268 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-49qll/must-gather-9bc7r"] Oct 09 08:49:37 crc kubenswrapper[4715]: E1009 08:49:37.229223 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f07a752a-d111-447d-98a3-bb52336d8779" containerName="container-00" Oct 09 08:49:37 crc kubenswrapper[4715]: I1009 08:49:37.229237 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="f07a752a-d111-447d-98a3-bb52336d8779" containerName="container-00" Oct 09 08:49:37 crc kubenswrapper[4715]: E1009 08:49:37.229251 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e4bcd81-50d9-4f8c-90e6-fa6cba352454" containerName="gather" Oct 09 08:49:37 crc kubenswrapper[4715]: I1009 08:49:37.229258 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e4bcd81-50d9-4f8c-90e6-fa6cba352454" containerName="gather" Oct 09 08:49:37 crc kubenswrapper[4715]: E1009 08:49:37.229286 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e4bcd81-50d9-4f8c-90e6-fa6cba352454" containerName="copy" Oct 09 08:49:37 crc kubenswrapper[4715]: I1009 08:49:37.229294 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e4bcd81-50d9-4f8c-90e6-fa6cba352454" containerName="copy" Oct 09 08:49:37 crc kubenswrapper[4715]: I1009 08:49:37.229595 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e4bcd81-50d9-4f8c-90e6-fa6cba352454" containerName="copy" Oct 09 08:49:37 crc kubenswrapper[4715]: I1009 08:49:37.229611 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e4bcd81-50d9-4f8c-90e6-fa6cba352454" containerName="gather" Oct 09 08:49:37 crc kubenswrapper[4715]: I1009 08:49:37.229624 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="f07a752a-d111-447d-98a3-bb52336d8779" 
containerName="container-00" Oct 09 08:49:37 crc kubenswrapper[4715]: I1009 08:49:37.230743 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-49qll/must-gather-9bc7r" Oct 09 08:49:37 crc kubenswrapper[4715]: I1009 08:49:37.235724 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-49qll"/"openshift-service-ca.crt" Oct 09 08:49:37 crc kubenswrapper[4715]: I1009 08:49:37.235930 4715 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-49qll"/"kube-root-ca.crt" Oct 09 08:49:37 crc kubenswrapper[4715]: I1009 08:49:37.245129 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-49qll/must-gather-9bc7r"] Oct 09 08:49:37 crc kubenswrapper[4715]: I1009 08:49:37.334738 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6cd0920b-496d-4531-aeaa-ea492e0cdbb4-must-gather-output\") pod \"must-gather-9bc7r\" (UID: \"6cd0920b-496d-4531-aeaa-ea492e0cdbb4\") " pod="openshift-must-gather-49qll/must-gather-9bc7r" Oct 09 08:49:37 crc kubenswrapper[4715]: I1009 08:49:37.334806 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc754\" (UniqueName: \"kubernetes.io/projected/6cd0920b-496d-4531-aeaa-ea492e0cdbb4-kube-api-access-dc754\") pod \"must-gather-9bc7r\" (UID: \"6cd0920b-496d-4531-aeaa-ea492e0cdbb4\") " pod="openshift-must-gather-49qll/must-gather-9bc7r" Oct 09 08:49:37 crc kubenswrapper[4715]: I1009 08:49:37.436314 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6cd0920b-496d-4531-aeaa-ea492e0cdbb4-must-gather-output\") pod \"must-gather-9bc7r\" (UID: \"6cd0920b-496d-4531-aeaa-ea492e0cdbb4\") " pod="openshift-must-gather-49qll/must-gather-9bc7r" Oct 09 
08:49:37 crc kubenswrapper[4715]: I1009 08:49:37.436372 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc754\" (UniqueName: \"kubernetes.io/projected/6cd0920b-496d-4531-aeaa-ea492e0cdbb4-kube-api-access-dc754\") pod \"must-gather-9bc7r\" (UID: \"6cd0920b-496d-4531-aeaa-ea492e0cdbb4\") " pod="openshift-must-gather-49qll/must-gather-9bc7r" Oct 09 08:49:37 crc kubenswrapper[4715]: I1009 08:49:37.436862 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6cd0920b-496d-4531-aeaa-ea492e0cdbb4-must-gather-output\") pod \"must-gather-9bc7r\" (UID: \"6cd0920b-496d-4531-aeaa-ea492e0cdbb4\") " pod="openshift-must-gather-49qll/must-gather-9bc7r" Oct 09 08:49:37 crc kubenswrapper[4715]: I1009 08:49:37.462025 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc754\" (UniqueName: \"kubernetes.io/projected/6cd0920b-496d-4531-aeaa-ea492e0cdbb4-kube-api-access-dc754\") pod \"must-gather-9bc7r\" (UID: \"6cd0920b-496d-4531-aeaa-ea492e0cdbb4\") " pod="openshift-must-gather-49qll/must-gather-9bc7r" Oct 09 08:49:37 crc kubenswrapper[4715]: I1009 08:49:37.558700 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-49qll/must-gather-9bc7r" Oct 09 08:49:38 crc kubenswrapper[4715]: I1009 08:49:38.020394 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-49qll/must-gather-9bc7r"] Oct 09 08:49:38 crc kubenswrapper[4715]: I1009 08:49:38.899831 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-49qll/must-gather-9bc7r" event={"ID":"6cd0920b-496d-4531-aeaa-ea492e0cdbb4","Type":"ContainerStarted","Data":"aac398f92dd8f424b0c12d756d6385a67ee8f3c054e25091272b4d83f5058375"} Oct 09 08:49:38 crc kubenswrapper[4715]: I1009 08:49:38.900330 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-49qll/must-gather-9bc7r" event={"ID":"6cd0920b-496d-4531-aeaa-ea492e0cdbb4","Type":"ContainerStarted","Data":"67b7fb61ac0127a891f825744cc58d7904a39efc631950b640c773eb30b1cf6c"} Oct 09 08:49:38 crc kubenswrapper[4715]: I1009 08:49:38.900343 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-49qll/must-gather-9bc7r" event={"ID":"6cd0920b-496d-4531-aeaa-ea492e0cdbb4","Type":"ContainerStarted","Data":"ddf09579214e86a7e03510541936e3d8f8e763c60b82c29de6f2b91277c3f248"} Oct 09 08:49:38 crc kubenswrapper[4715]: I1009 08:49:38.919566 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-49qll/must-gather-9bc7r" podStartSLOduration=1.919548523 podStartE2EDuration="1.919548523s" podCreationTimestamp="2025-10-09 08:49:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:49:38.913679945 +0000 UTC m=+3809.606483953" watchObservedRunningTime="2025-10-09 08:49:38.919548523 +0000 UTC m=+3809.612352541" Oct 09 08:49:41 crc kubenswrapper[4715]: I1009 08:49:41.752895 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-49qll/crc-debug-s49w9"] Oct 09 08:49:41 crc kubenswrapper[4715]: 
I1009 08:49:41.755177 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-49qll/crc-debug-s49w9" Oct 09 08:49:41 crc kubenswrapper[4715]: I1009 08:49:41.757588 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-49qll"/"default-dockercfg-kwhvg" Oct 09 08:49:41 crc kubenswrapper[4715]: I1009 08:49:41.930322 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jq25\" (UniqueName: \"kubernetes.io/projected/a59ee6ac-dc4f-4d5b-96d8-b065a5fc1c05-kube-api-access-4jq25\") pod \"crc-debug-s49w9\" (UID: \"a59ee6ac-dc4f-4d5b-96d8-b065a5fc1c05\") " pod="openshift-must-gather-49qll/crc-debug-s49w9" Oct 09 08:49:41 crc kubenswrapper[4715]: I1009 08:49:41.930413 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a59ee6ac-dc4f-4d5b-96d8-b065a5fc1c05-host\") pod \"crc-debug-s49w9\" (UID: \"a59ee6ac-dc4f-4d5b-96d8-b065a5fc1c05\") " pod="openshift-must-gather-49qll/crc-debug-s49w9" Oct 09 08:49:42 crc kubenswrapper[4715]: I1009 08:49:42.031616 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jq25\" (UniqueName: \"kubernetes.io/projected/a59ee6ac-dc4f-4d5b-96d8-b065a5fc1c05-kube-api-access-4jq25\") pod \"crc-debug-s49w9\" (UID: \"a59ee6ac-dc4f-4d5b-96d8-b065a5fc1c05\") " pod="openshift-must-gather-49qll/crc-debug-s49w9" Oct 09 08:49:42 crc kubenswrapper[4715]: I1009 08:49:42.031736 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a59ee6ac-dc4f-4d5b-96d8-b065a5fc1c05-host\") pod \"crc-debug-s49w9\" (UID: \"a59ee6ac-dc4f-4d5b-96d8-b065a5fc1c05\") " pod="openshift-must-gather-49qll/crc-debug-s49w9" Oct 09 08:49:42 crc kubenswrapper[4715]: I1009 08:49:42.031915 4715 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a59ee6ac-dc4f-4d5b-96d8-b065a5fc1c05-host\") pod \"crc-debug-s49w9\" (UID: \"a59ee6ac-dc4f-4d5b-96d8-b065a5fc1c05\") " pod="openshift-must-gather-49qll/crc-debug-s49w9" Oct 09 08:49:42 crc kubenswrapper[4715]: I1009 08:49:42.059071 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jq25\" (UniqueName: \"kubernetes.io/projected/a59ee6ac-dc4f-4d5b-96d8-b065a5fc1c05-kube-api-access-4jq25\") pod \"crc-debug-s49w9\" (UID: \"a59ee6ac-dc4f-4d5b-96d8-b065a5fc1c05\") " pod="openshift-must-gather-49qll/crc-debug-s49w9" Oct 09 08:49:42 crc kubenswrapper[4715]: I1009 08:49:42.084513 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-49qll/crc-debug-s49w9" Oct 09 08:49:42 crc kubenswrapper[4715]: W1009 08:49:42.121926 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda59ee6ac_dc4f_4d5b_96d8_b065a5fc1c05.slice/crio-7b981fc7c250c1019be49d2026289e048012f6a19421579707c29340c80bb447 WatchSource:0}: Error finding container 7b981fc7c250c1019be49d2026289e048012f6a19421579707c29340c80bb447: Status 404 returned error can't find the container with id 7b981fc7c250c1019be49d2026289e048012f6a19421579707c29340c80bb447 Oct 09 08:49:42 crc kubenswrapper[4715]: I1009 08:49:42.938824 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-49qll/crc-debug-s49w9" event={"ID":"a59ee6ac-dc4f-4d5b-96d8-b065a5fc1c05","Type":"ContainerStarted","Data":"c0107ed515997f9e55b335002f005d0ed6427dcca55541b552e9a73ac993b5b8"} Oct 09 08:49:42 crc kubenswrapper[4715]: I1009 08:49:42.939410 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-49qll/crc-debug-s49w9" event={"ID":"a59ee6ac-dc4f-4d5b-96d8-b065a5fc1c05","Type":"ContainerStarted","Data":"7b981fc7c250c1019be49d2026289e048012f6a19421579707c29340c80bb447"} Oct 
09 08:49:46 crc kubenswrapper[4715]: I1009 08:49:46.753439 4715 patch_prober.go:28] interesting pod/machine-config-daemon-k7vwx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 09 08:49:46 crc kubenswrapper[4715]: I1009 08:49:46.753990 4715 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 09 08:49:46 crc kubenswrapper[4715]: I1009 08:49:46.754056 4715 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" Oct 09 08:49:46 crc kubenswrapper[4715]: I1009 08:49:46.754976 4715 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3cecc05ede7162882b376373801e9d44deb7deec541f11babf1e9925da0c5cf7"} pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 09 08:49:46 crc kubenswrapper[4715]: I1009 08:49:46.755031 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerName="machine-config-daemon" containerID="cri-o://3cecc05ede7162882b376373801e9d44deb7deec541f11babf1e9925da0c5cf7" gracePeriod=600 Oct 09 08:49:46 crc kubenswrapper[4715]: E1009 08:49:46.876282 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:49:46 crc kubenswrapper[4715]: I1009 08:49:46.990868 4715 generic.go:334] "Generic (PLEG): container finished" podID="acafd807-8875-4b4f-aba9-4f807ca336e7" containerID="3cecc05ede7162882b376373801e9d44deb7deec541f11babf1e9925da0c5cf7" exitCode=0 Oct 09 08:49:46 crc kubenswrapper[4715]: I1009 08:49:46.990950 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" event={"ID":"acafd807-8875-4b4f-aba9-4f807ca336e7","Type":"ContainerDied","Data":"3cecc05ede7162882b376373801e9d44deb7deec541f11babf1e9925da0c5cf7"} Oct 09 08:49:46 crc kubenswrapper[4715]: I1009 08:49:46.990988 4715 scope.go:117] "RemoveContainer" containerID="0987e6a342e02824b43b9834c2fc79356564b0a0d9274ff128b433f6ae2be4e5" Oct 09 08:49:46 crc kubenswrapper[4715]: I1009 08:49:46.991690 4715 scope.go:117] "RemoveContainer" containerID="3cecc05ede7162882b376373801e9d44deb7deec541f11babf1e9925da0c5cf7" Oct 09 08:49:46 crc kubenswrapper[4715]: E1009 08:49:46.991969 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:49:47 crc kubenswrapper[4715]: I1009 08:49:47.021022 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-49qll/crc-debug-s49w9" podStartSLOduration=6.020998778 podStartE2EDuration="6.020998778s" podCreationTimestamp="2025-10-09 
08:49:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-09 08:49:42.954919906 +0000 UTC m=+3813.647723914" watchObservedRunningTime="2025-10-09 08:49:47.020998778 +0000 UTC m=+3817.713802786" Oct 09 08:49:57 crc kubenswrapper[4715]: I1009 08:49:57.137622 4715 scope.go:117] "RemoveContainer" containerID="3cecc05ede7162882b376373801e9d44deb7deec541f11babf1e9925da0c5cf7" Oct 09 08:49:57 crc kubenswrapper[4715]: E1009 08:49:57.139600 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:50:12 crc kubenswrapper[4715]: I1009 08:50:12.136866 4715 scope.go:117] "RemoveContainer" containerID="3cecc05ede7162882b376373801e9d44deb7deec541f11babf1e9925da0c5cf7" Oct 09 08:50:12 crc kubenswrapper[4715]: E1009 08:50:12.137747 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:50:19 crc kubenswrapper[4715]: I1009 08:50:19.285021 4715 generic.go:334] "Generic (PLEG): container finished" podID="a59ee6ac-dc4f-4d5b-96d8-b065a5fc1c05" containerID="c0107ed515997f9e55b335002f005d0ed6427dcca55541b552e9a73ac993b5b8" exitCode=0 Oct 09 08:50:19 crc kubenswrapper[4715]: I1009 08:50:19.285098 4715 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-must-gather-49qll/crc-debug-s49w9" event={"ID":"a59ee6ac-dc4f-4d5b-96d8-b065a5fc1c05","Type":"ContainerDied","Data":"c0107ed515997f9e55b335002f005d0ed6427dcca55541b552e9a73ac993b5b8"} Oct 09 08:50:20 crc kubenswrapper[4715]: I1009 08:50:20.401288 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-49qll/crc-debug-s49w9" Oct 09 08:50:20 crc kubenswrapper[4715]: I1009 08:50:20.435261 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-49qll/crc-debug-s49w9"] Oct 09 08:50:20 crc kubenswrapper[4715]: I1009 08:50:20.444948 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-49qll/crc-debug-s49w9"] Oct 09 08:50:20 crc kubenswrapper[4715]: I1009 08:50:20.488225 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a59ee6ac-dc4f-4d5b-96d8-b065a5fc1c05-host\") pod \"a59ee6ac-dc4f-4d5b-96d8-b065a5fc1c05\" (UID: \"a59ee6ac-dc4f-4d5b-96d8-b065a5fc1c05\") " Oct 09 08:50:20 crc kubenswrapper[4715]: I1009 08:50:20.488296 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a59ee6ac-dc4f-4d5b-96d8-b065a5fc1c05-host" (OuterVolumeSpecName: "host") pod "a59ee6ac-dc4f-4d5b-96d8-b065a5fc1c05" (UID: "a59ee6ac-dc4f-4d5b-96d8-b065a5fc1c05"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 08:50:20 crc kubenswrapper[4715]: I1009 08:50:20.488376 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jq25\" (UniqueName: \"kubernetes.io/projected/a59ee6ac-dc4f-4d5b-96d8-b065a5fc1c05-kube-api-access-4jq25\") pod \"a59ee6ac-dc4f-4d5b-96d8-b065a5fc1c05\" (UID: \"a59ee6ac-dc4f-4d5b-96d8-b065a5fc1c05\") " Oct 09 08:50:20 crc kubenswrapper[4715]: I1009 08:50:20.488844 4715 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a59ee6ac-dc4f-4d5b-96d8-b065a5fc1c05-host\") on node \"crc\" DevicePath \"\"" Oct 09 08:50:20 crc kubenswrapper[4715]: I1009 08:50:20.496236 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a59ee6ac-dc4f-4d5b-96d8-b065a5fc1c05-kube-api-access-4jq25" (OuterVolumeSpecName: "kube-api-access-4jq25") pod "a59ee6ac-dc4f-4d5b-96d8-b065a5fc1c05" (UID: "a59ee6ac-dc4f-4d5b-96d8-b065a5fc1c05"). InnerVolumeSpecName "kube-api-access-4jq25". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:50:20 crc kubenswrapper[4715]: I1009 08:50:20.590052 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jq25\" (UniqueName: \"kubernetes.io/projected/a59ee6ac-dc4f-4d5b-96d8-b065a5fc1c05-kube-api-access-4jq25\") on node \"crc\" DevicePath \"\"" Oct 09 08:50:21 crc kubenswrapper[4715]: I1009 08:50:21.308287 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b981fc7c250c1019be49d2026289e048012f6a19421579707c29340c80bb447" Oct 09 08:50:21 crc kubenswrapper[4715]: I1009 08:50:21.308507 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-49qll/crc-debug-s49w9" Oct 09 08:50:21 crc kubenswrapper[4715]: I1009 08:50:21.647991 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-49qll/crc-debug-czx6g"] Oct 09 08:50:21 crc kubenswrapper[4715]: E1009 08:50:21.648380 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a59ee6ac-dc4f-4d5b-96d8-b065a5fc1c05" containerName="container-00" Oct 09 08:50:21 crc kubenswrapper[4715]: I1009 08:50:21.648392 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="a59ee6ac-dc4f-4d5b-96d8-b065a5fc1c05" containerName="container-00" Oct 09 08:50:21 crc kubenswrapper[4715]: I1009 08:50:21.648631 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="a59ee6ac-dc4f-4d5b-96d8-b065a5fc1c05" containerName="container-00" Oct 09 08:50:21 crc kubenswrapper[4715]: I1009 08:50:21.649301 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-49qll/crc-debug-czx6g" Oct 09 08:50:21 crc kubenswrapper[4715]: I1009 08:50:21.651474 4715 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-49qll"/"default-dockercfg-kwhvg" Oct 09 08:50:21 crc kubenswrapper[4715]: I1009 08:50:21.709397 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/75d0c644-4d76-4e6b-a398-d087283305e0-host\") pod \"crc-debug-czx6g\" (UID: \"75d0c644-4d76-4e6b-a398-d087283305e0\") " pod="openshift-must-gather-49qll/crc-debug-czx6g" Oct 09 08:50:21 crc kubenswrapper[4715]: I1009 08:50:21.709607 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txl6j\" (UniqueName: \"kubernetes.io/projected/75d0c644-4d76-4e6b-a398-d087283305e0-kube-api-access-txl6j\") pod \"crc-debug-czx6g\" (UID: \"75d0c644-4d76-4e6b-a398-d087283305e0\") " 
pod="openshift-must-gather-49qll/crc-debug-czx6g" Oct 09 08:50:21 crc kubenswrapper[4715]: I1009 08:50:21.811233 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/75d0c644-4d76-4e6b-a398-d087283305e0-host\") pod \"crc-debug-czx6g\" (UID: \"75d0c644-4d76-4e6b-a398-d087283305e0\") " pod="openshift-must-gather-49qll/crc-debug-czx6g" Oct 09 08:50:21 crc kubenswrapper[4715]: I1009 08:50:21.811312 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txl6j\" (UniqueName: \"kubernetes.io/projected/75d0c644-4d76-4e6b-a398-d087283305e0-kube-api-access-txl6j\") pod \"crc-debug-czx6g\" (UID: \"75d0c644-4d76-4e6b-a398-d087283305e0\") " pod="openshift-must-gather-49qll/crc-debug-czx6g" Oct 09 08:50:21 crc kubenswrapper[4715]: I1009 08:50:21.811445 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/75d0c644-4d76-4e6b-a398-d087283305e0-host\") pod \"crc-debug-czx6g\" (UID: \"75d0c644-4d76-4e6b-a398-d087283305e0\") " pod="openshift-must-gather-49qll/crc-debug-czx6g" Oct 09 08:50:21 crc kubenswrapper[4715]: I1009 08:50:21.839285 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txl6j\" (UniqueName: \"kubernetes.io/projected/75d0c644-4d76-4e6b-a398-d087283305e0-kube-api-access-txl6j\") pod \"crc-debug-czx6g\" (UID: \"75d0c644-4d76-4e6b-a398-d087283305e0\") " pod="openshift-must-gather-49qll/crc-debug-czx6g" Oct 09 08:50:21 crc kubenswrapper[4715]: I1009 08:50:21.971594 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-49qll/crc-debug-czx6g" Oct 09 08:50:22 crc kubenswrapper[4715]: I1009 08:50:22.148711 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a59ee6ac-dc4f-4d5b-96d8-b065a5fc1c05" path="/var/lib/kubelet/pods/a59ee6ac-dc4f-4d5b-96d8-b065a5fc1c05/volumes" Oct 09 08:50:22 crc kubenswrapper[4715]: I1009 08:50:22.319558 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-49qll/crc-debug-czx6g" event={"ID":"75d0c644-4d76-4e6b-a398-d087283305e0","Type":"ContainerStarted","Data":"f6a3b187948d697ddfca7ab20d8c5efc9e981d6329a64e176d04c5df1ead9cd5"} Oct 09 08:50:22 crc kubenswrapper[4715]: I1009 08:50:22.319842 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-49qll/crc-debug-czx6g" event={"ID":"75d0c644-4d76-4e6b-a398-d087283305e0","Type":"ContainerStarted","Data":"809a324892d28848b9347ac8a76e342cd0c555e97cb4745fd6464f2748228a03"} Oct 09 08:50:22 crc kubenswrapper[4715]: I1009 08:50:22.772446 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-49qll/crc-debug-czx6g"] Oct 09 08:50:22 crc kubenswrapper[4715]: I1009 08:50:22.783150 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-49qll/crc-debug-czx6g"] Oct 09 08:50:23 crc kubenswrapper[4715]: I1009 08:50:23.328783 4715 generic.go:334] "Generic (PLEG): container finished" podID="75d0c644-4d76-4e6b-a398-d087283305e0" containerID="f6a3b187948d697ddfca7ab20d8c5efc9e981d6329a64e176d04c5df1ead9cd5" exitCode=0 Oct 09 08:50:23 crc kubenswrapper[4715]: I1009 08:50:23.427580 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-49qll/crc-debug-czx6g" Oct 09 08:50:23 crc kubenswrapper[4715]: I1009 08:50:23.435879 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/75d0c644-4d76-4e6b-a398-d087283305e0-host\") pod \"75d0c644-4d76-4e6b-a398-d087283305e0\" (UID: \"75d0c644-4d76-4e6b-a398-d087283305e0\") " Oct 09 08:50:23 crc kubenswrapper[4715]: I1009 08:50:23.435956 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/75d0c644-4d76-4e6b-a398-d087283305e0-host" (OuterVolumeSpecName: "host") pod "75d0c644-4d76-4e6b-a398-d087283305e0" (UID: "75d0c644-4d76-4e6b-a398-d087283305e0"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 08:50:23 crc kubenswrapper[4715]: I1009 08:50:23.436052 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txl6j\" (UniqueName: \"kubernetes.io/projected/75d0c644-4d76-4e6b-a398-d087283305e0-kube-api-access-txl6j\") pod \"75d0c644-4d76-4e6b-a398-d087283305e0\" (UID: \"75d0c644-4d76-4e6b-a398-d087283305e0\") " Oct 09 08:50:23 crc kubenswrapper[4715]: I1009 08:50:23.436354 4715 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/75d0c644-4d76-4e6b-a398-d087283305e0-host\") on node \"crc\" DevicePath \"\"" Oct 09 08:50:23 crc kubenswrapper[4715]: I1009 08:50:23.443789 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75d0c644-4d76-4e6b-a398-d087283305e0-kube-api-access-txl6j" (OuterVolumeSpecName: "kube-api-access-txl6j") pod "75d0c644-4d76-4e6b-a398-d087283305e0" (UID: "75d0c644-4d76-4e6b-a398-d087283305e0"). InnerVolumeSpecName "kube-api-access-txl6j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:50:23 crc kubenswrapper[4715]: I1009 08:50:23.537950 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txl6j\" (UniqueName: \"kubernetes.io/projected/75d0c644-4d76-4e6b-a398-d087283305e0-kube-api-access-txl6j\") on node \"crc\" DevicePath \"\"" Oct 09 08:50:23 crc kubenswrapper[4715]: I1009 08:50:23.964665 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-49qll/crc-debug-thnz8"] Oct 09 08:50:23 crc kubenswrapper[4715]: E1009 08:50:23.965008 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75d0c644-4d76-4e6b-a398-d087283305e0" containerName="container-00" Oct 09 08:50:23 crc kubenswrapper[4715]: I1009 08:50:23.965019 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="75d0c644-4d76-4e6b-a398-d087283305e0" containerName="container-00" Oct 09 08:50:23 crc kubenswrapper[4715]: I1009 08:50:23.965183 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="75d0c644-4d76-4e6b-a398-d087283305e0" containerName="container-00" Oct 09 08:50:23 crc kubenswrapper[4715]: I1009 08:50:23.965766 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-49qll/crc-debug-thnz8" Oct 09 08:50:24 crc kubenswrapper[4715]: I1009 08:50:24.047599 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkk68\" (UniqueName: \"kubernetes.io/projected/a2ff4e22-982b-45e8-944d-87f907cfcd35-kube-api-access-bkk68\") pod \"crc-debug-thnz8\" (UID: \"a2ff4e22-982b-45e8-944d-87f907cfcd35\") " pod="openshift-must-gather-49qll/crc-debug-thnz8" Oct 09 08:50:24 crc kubenswrapper[4715]: I1009 08:50:24.047673 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a2ff4e22-982b-45e8-944d-87f907cfcd35-host\") pod \"crc-debug-thnz8\" (UID: \"a2ff4e22-982b-45e8-944d-87f907cfcd35\") " pod="openshift-must-gather-49qll/crc-debug-thnz8" Oct 09 08:50:24 crc kubenswrapper[4715]: I1009 08:50:24.155018 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkk68\" (UniqueName: \"kubernetes.io/projected/a2ff4e22-982b-45e8-944d-87f907cfcd35-kube-api-access-bkk68\") pod \"crc-debug-thnz8\" (UID: \"a2ff4e22-982b-45e8-944d-87f907cfcd35\") " pod="openshift-must-gather-49qll/crc-debug-thnz8" Oct 09 08:50:24 crc kubenswrapper[4715]: I1009 08:50:24.155081 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a2ff4e22-982b-45e8-944d-87f907cfcd35-host\") pod \"crc-debug-thnz8\" (UID: \"a2ff4e22-982b-45e8-944d-87f907cfcd35\") " pod="openshift-must-gather-49qll/crc-debug-thnz8" Oct 09 08:50:24 crc kubenswrapper[4715]: I1009 08:50:24.157712 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75d0c644-4d76-4e6b-a398-d087283305e0" path="/var/lib/kubelet/pods/75d0c644-4d76-4e6b-a398-d087283305e0/volumes" Oct 09 08:50:24 crc kubenswrapper[4715]: I1009 08:50:24.164317 4715 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a2ff4e22-982b-45e8-944d-87f907cfcd35-host\") pod \"crc-debug-thnz8\" (UID: \"a2ff4e22-982b-45e8-944d-87f907cfcd35\") " pod="openshift-must-gather-49qll/crc-debug-thnz8" Oct 09 08:50:24 crc kubenswrapper[4715]: I1009 08:50:24.178998 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkk68\" (UniqueName: \"kubernetes.io/projected/a2ff4e22-982b-45e8-944d-87f907cfcd35-kube-api-access-bkk68\") pod \"crc-debug-thnz8\" (UID: \"a2ff4e22-982b-45e8-944d-87f907cfcd35\") " pod="openshift-must-gather-49qll/crc-debug-thnz8" Oct 09 08:50:24 crc kubenswrapper[4715]: I1009 08:50:24.283075 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-49qll/crc-debug-thnz8" Oct 09 08:50:24 crc kubenswrapper[4715]: I1009 08:50:24.339370 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-49qll/crc-debug-thnz8" event={"ID":"a2ff4e22-982b-45e8-944d-87f907cfcd35","Type":"ContainerStarted","Data":"a76daecabce52b5b54288f0ad355c7edc022bb993a64a9c1bed98ab9bea39c93"} Oct 09 08:50:24 crc kubenswrapper[4715]: I1009 08:50:24.341394 4715 scope.go:117] "RemoveContainer" containerID="f6a3b187948d697ddfca7ab20d8c5efc9e981d6329a64e176d04c5df1ead9cd5" Oct 09 08:50:24 crc kubenswrapper[4715]: I1009 08:50:24.341492 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-49qll/crc-debug-czx6g" Oct 09 08:50:25 crc kubenswrapper[4715]: I1009 08:50:25.136769 4715 scope.go:117] "RemoveContainer" containerID="3cecc05ede7162882b376373801e9d44deb7deec541f11babf1e9925da0c5cf7" Oct 09 08:50:25 crc kubenswrapper[4715]: E1009 08:50:25.137496 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:50:25 crc kubenswrapper[4715]: I1009 08:50:25.351825 4715 generic.go:334] "Generic (PLEG): container finished" podID="a2ff4e22-982b-45e8-944d-87f907cfcd35" containerID="1b70a4bc294c8c2c8c5c4f7f123cbfc4f33a07886a349d1a504c059512606928" exitCode=0 Oct 09 08:50:25 crc kubenswrapper[4715]: I1009 08:50:25.351868 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-49qll/crc-debug-thnz8" event={"ID":"a2ff4e22-982b-45e8-944d-87f907cfcd35","Type":"ContainerDied","Data":"1b70a4bc294c8c2c8c5c4f7f123cbfc4f33a07886a349d1a504c059512606928"} Oct 09 08:50:25 crc kubenswrapper[4715]: I1009 08:50:25.396994 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-49qll/crc-debug-thnz8"] Oct 09 08:50:25 crc kubenswrapper[4715]: I1009 08:50:25.403781 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-49qll/crc-debug-thnz8"] Oct 09 08:50:26 crc kubenswrapper[4715]: I1009 08:50:26.474870 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-49qll/crc-debug-thnz8" Oct 09 08:50:26 crc kubenswrapper[4715]: I1009 08:50:26.596086 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a2ff4e22-982b-45e8-944d-87f907cfcd35-host\") pod \"a2ff4e22-982b-45e8-944d-87f907cfcd35\" (UID: \"a2ff4e22-982b-45e8-944d-87f907cfcd35\") " Oct 09 08:50:26 crc kubenswrapper[4715]: I1009 08:50:26.596221 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2ff4e22-982b-45e8-944d-87f907cfcd35-host" (OuterVolumeSpecName: "host") pod "a2ff4e22-982b-45e8-944d-87f907cfcd35" (UID: "a2ff4e22-982b-45e8-944d-87f907cfcd35"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 09 08:50:26 crc kubenswrapper[4715]: I1009 08:50:26.596237 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkk68\" (UniqueName: \"kubernetes.io/projected/a2ff4e22-982b-45e8-944d-87f907cfcd35-kube-api-access-bkk68\") pod \"a2ff4e22-982b-45e8-944d-87f907cfcd35\" (UID: \"a2ff4e22-982b-45e8-944d-87f907cfcd35\") " Oct 09 08:50:26 crc kubenswrapper[4715]: I1009 08:50:26.597115 4715 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a2ff4e22-982b-45e8-944d-87f907cfcd35-host\") on node \"crc\" DevicePath \"\"" Oct 09 08:50:26 crc kubenswrapper[4715]: I1009 08:50:26.601779 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2ff4e22-982b-45e8-944d-87f907cfcd35-kube-api-access-bkk68" (OuterVolumeSpecName: "kube-api-access-bkk68") pod "a2ff4e22-982b-45e8-944d-87f907cfcd35" (UID: "a2ff4e22-982b-45e8-944d-87f907cfcd35"). InnerVolumeSpecName "kube-api-access-bkk68". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:50:26 crc kubenswrapper[4715]: I1009 08:50:26.698707 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkk68\" (UniqueName: \"kubernetes.io/projected/a2ff4e22-982b-45e8-944d-87f907cfcd35-kube-api-access-bkk68\") on node \"crc\" DevicePath \"\"" Oct 09 08:50:27 crc kubenswrapper[4715]: I1009 08:50:27.372238 4715 scope.go:117] "RemoveContainer" containerID="1b70a4bc294c8c2c8c5c4f7f123cbfc4f33a07886a349d1a504c059512606928" Oct 09 08:50:27 crc kubenswrapper[4715]: I1009 08:50:27.372281 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-49qll/crc-debug-thnz8" Oct 09 08:50:28 crc kubenswrapper[4715]: I1009 08:50:28.147656 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2ff4e22-982b-45e8-944d-87f907cfcd35" path="/var/lib/kubelet/pods/a2ff4e22-982b-45e8-944d-87f907cfcd35/volumes" Oct 09 08:50:37 crc kubenswrapper[4715]: I1009 08:50:37.301102 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-67f46ddd46-qcgrk_cb469864-9053-4551-bca7-f3b67a20bf52/barbican-api/0.log" Oct 09 08:50:37 crc kubenswrapper[4715]: I1009 08:50:37.467955 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-67f46ddd46-qcgrk_cb469864-9053-4551-bca7-f3b67a20bf52/barbican-api-log/0.log" Oct 09 08:50:37 crc kubenswrapper[4715]: I1009 08:50:37.573411 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-58cfd869c4-4djjf_50cf187d-781d-49b7-840b-8dfc3366135f/barbican-keystone-listener/0.log" Oct 09 08:50:37 crc kubenswrapper[4715]: I1009 08:50:37.668765 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-58cfd869c4-4djjf_50cf187d-781d-49b7-840b-8dfc3366135f/barbican-keystone-listener-log/0.log" Oct 09 08:50:37 crc kubenswrapper[4715]: I1009 08:50:37.764212 4715 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6ff4c65c6c-97dbp_6d84057b-c735-4df6-a20d-ef88cccb44fe/barbican-worker/0.log" Oct 09 08:50:37 crc kubenswrapper[4715]: I1009 08:50:37.801682 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6ff4c65c6c-97dbp_6d84057b-c735-4df6-a20d-ef88cccb44fe/barbican-worker-log/0.log" Oct 09 08:50:38 crc kubenswrapper[4715]: I1009 08:50:38.011055 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-jphqg_21e35629-c64d-4ef6-a570-7603aa8358fb/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 08:50:38 crc kubenswrapper[4715]: I1009 08:50:38.095569 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7574fa02-4d40-4d5f-8d52-3118db1c2e05/ceilometer-central-agent/0.log" Oct 09 08:50:38 crc kubenswrapper[4715]: I1009 08:50:38.220560 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7574fa02-4d40-4d5f-8d52-3118db1c2e05/ceilometer-notification-agent/0.log" Oct 09 08:50:38 crc kubenswrapper[4715]: I1009 08:50:38.235998 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7574fa02-4d40-4d5f-8d52-3118db1c2e05/sg-core/0.log" Oct 09 08:50:38 crc kubenswrapper[4715]: I1009 08:50:38.245106 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7574fa02-4d40-4d5f-8d52-3118db1c2e05/proxy-httpd/0.log" Oct 09 08:50:38 crc kubenswrapper[4715]: I1009 08:50:38.474523 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_50651334-64c9-4214-9c7b-c10c4152d053/cinder-api/0.log" Oct 09 08:50:38 crc kubenswrapper[4715]: I1009 08:50:38.507136 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_50651334-64c9-4214-9c7b-c10c4152d053/cinder-api-log/0.log" Oct 09 08:50:38 crc kubenswrapper[4715]: I1009 
08:50:38.605372 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_48296f2f-dddc-4549-9f78-640128d54d46/cinder-scheduler/0.log" Oct 09 08:50:38 crc kubenswrapper[4715]: I1009 08:50:38.729034 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_48296f2f-dddc-4549-9f78-640128d54d46/probe/0.log" Oct 09 08:50:38 crc kubenswrapper[4715]: I1009 08:50:38.810288 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-j7tdx_69340195-a6f5-4e04-823d-9d61548a14b9/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 08:50:38 crc kubenswrapper[4715]: I1009 08:50:38.987277 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-5rwzv_4220a571-fb05-4098-901b-a00a6c79efe8/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 08:50:39 crc kubenswrapper[4715]: I1009 08:50:39.122720 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-jtk8t_6d2dbb06-3154-428b-991c-09b567c9136e/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 08:50:39 crc kubenswrapper[4715]: I1009 08:50:39.216761 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-6clgv_9d37f947-6e34-45b8-96a5-a18465d3f3fd/init/0.log" Oct 09 08:50:39 crc kubenswrapper[4715]: I1009 08:50:39.456573 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-hngw7_a796e5b4-3af9-4286-8e0f-44f5a026dc47/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 08:50:39 crc kubenswrapper[4715]: I1009 08:50:39.473736 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-6clgv_9d37f947-6e34-45b8-96a5-a18465d3f3fd/init/0.log" Oct 09 08:50:39 crc kubenswrapper[4715]: 
I1009 08:50:39.535161 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-6clgv_9d37f947-6e34-45b8-96a5-a18465d3f3fd/dnsmasq-dns/0.log" Oct 09 08:50:39 crc kubenswrapper[4715]: I1009 08:50:39.718906 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_351123b4-0e0c-413a-bc50-56f397c1b592/glance-httpd/0.log" Oct 09 08:50:39 crc kubenswrapper[4715]: I1009 08:50:39.734375 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_351123b4-0e0c-413a-bc50-56f397c1b592/glance-log/0.log" Oct 09 08:50:39 crc kubenswrapper[4715]: I1009 08:50:39.884115 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_8be2de3b-f820-4674-992a-9bf1a1735d6b/glance-httpd/0.log" Oct 09 08:50:39 crc kubenswrapper[4715]: I1009 08:50:39.958973 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_8be2de3b-f820-4674-992a-9bf1a1735d6b/glance-log/0.log" Oct 09 08:50:40 crc kubenswrapper[4715]: I1009 08:50:40.070389 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-d788d6d48-5nczq_7b8b0665-2ab8-4fb9-93ff-6405324f24d5/horizon/0.log" Oct 09 08:50:40 crc kubenswrapper[4715]: I1009 08:50:40.147160 4715 scope.go:117] "RemoveContainer" containerID="3cecc05ede7162882b376373801e9d44deb7deec541f11babf1e9925da0c5cf7" Oct 09 08:50:40 crc kubenswrapper[4715]: E1009 08:50:40.147712 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:50:40 crc kubenswrapper[4715]: 
I1009 08:50:40.230223 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-mgwkz_34e4a21c-b0d6-448f-9fb9-42f65187fad8/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 08:50:40 crc kubenswrapper[4715]: I1009 08:50:40.474346 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-d788d6d48-5nczq_7b8b0665-2ab8-4fb9-93ff-6405324f24d5/horizon-log/0.log" Oct 09 08:50:40 crc kubenswrapper[4715]: I1009 08:50:40.511603 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-7ps4g_bf57a3ae-e445-4a20-9bc1-c5c8480f9158/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 08:50:40 crc kubenswrapper[4715]: I1009 08:50:40.765226 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_f8c92457-ab26-4a48-b7e1-094eac8532c7/kube-state-metrics/0.log" Oct 09 08:50:40 crc kubenswrapper[4715]: I1009 08:50:40.833131 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-76cc5dfd5-dcg58_914f4753-0cbe-4496-b703-8dd106c06db2/keystone-api/0.log" Oct 09 08:50:40 crc kubenswrapper[4715]: I1009 08:50:40.974709 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-whz64_767ac586-f48e-410c-a5bb-589eccbef2c8/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 08:50:41 crc kubenswrapper[4715]: I1009 08:50:41.268478 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-85df7c4d7c-7ktz2_03dac8b3-a92c-49b7-94cd-f7ab774b7e65/neutron-api/0.log" Oct 09 08:50:41 crc kubenswrapper[4715]: I1009 08:50:41.332182 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-85df7c4d7c-7ktz2_03dac8b3-a92c-49b7-94cd-f7ab774b7e65/neutron-httpd/0.log" Oct 09 08:50:41 crc kubenswrapper[4715]: I1009 08:50:41.349867 4715 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-fm7zq_1b4c9589-4c0f-4f07-86d8-4573a0e80292/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 08:50:41 crc kubenswrapper[4715]: I1009 08:50:41.934236 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_0c807bba-12d5-4e33-894d-7f1ae6faa077/nova-api-log/0.log" Oct 09 08:50:42 crc kubenswrapper[4715]: I1009 08:50:42.011929 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_b0f42f3a-98f5-442f-a169-3d7080e5fea3/nova-cell0-conductor-conductor/0.log" Oct 09 08:50:42 crc kubenswrapper[4715]: I1009 08:50:42.399619 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_0c807bba-12d5-4e33-894d-7f1ae6faa077/nova-api-api/0.log" Oct 09 08:50:42 crc kubenswrapper[4715]: I1009 08:50:42.413346 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_f2c8b523-67fd-40d3-9e2b-eb68619f60bc/nova-cell1-conductor-conductor/0.log" Oct 09 08:50:42 crc kubenswrapper[4715]: I1009 08:50:42.461149 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_84ca6d97-8374-4d3e-a5e7-e475bd7f89ce/nova-cell1-novncproxy-novncproxy/0.log" Oct 09 08:50:42 crc kubenswrapper[4715]: I1009 08:50:42.652695 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-sb8x7_7de82685-bfc0-41e5-81db-cadda0dc8d65/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 08:50:42 crc kubenswrapper[4715]: I1009 08:50:42.869806 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_76afd031-2e1d-412c-a21b-08e597e8eb83/nova-metadata-log/0.log" Oct 09 08:50:43 crc kubenswrapper[4715]: I1009 08:50:43.174322 4715 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-cell1-galera-0_72f969cd-b504-4db1-832a-1e0c7f0a3b7b/mysql-bootstrap/0.log" Oct 09 08:50:43 crc kubenswrapper[4715]: I1009 08:50:43.228378 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_e9fdd59d-3bb3-4f3f-85af-316ddc7de166/nova-scheduler-scheduler/0.log" Oct 09 08:50:43 crc kubenswrapper[4715]: I1009 08:50:43.369345 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_72f969cd-b504-4db1-832a-1e0c7f0a3b7b/galera/0.log" Oct 09 08:50:43 crc kubenswrapper[4715]: I1009 08:50:43.442103 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_72f969cd-b504-4db1-832a-1e0c7f0a3b7b/mysql-bootstrap/0.log" Oct 09 08:50:43 crc kubenswrapper[4715]: I1009 08:50:43.609785 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_895dedfd-5a74-43a4-81d1-6365aa67ed6a/mysql-bootstrap/0.log" Oct 09 08:50:43 crc kubenswrapper[4715]: I1009 08:50:43.910833 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_895dedfd-5a74-43a4-81d1-6365aa67ed6a/mysql-bootstrap/0.log" Oct 09 08:50:43 crc kubenswrapper[4715]: I1009 08:50:43.915361 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_895dedfd-5a74-43a4-81d1-6365aa67ed6a/galera/0.log" Oct 09 08:50:44 crc kubenswrapper[4715]: I1009 08:50:44.149220 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_ea37012b-c593-4cd0-8501-121c791b2741/openstackclient/0.log" Oct 09 08:50:44 crc kubenswrapper[4715]: I1009 08:50:44.164624 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-ptkjm_f16a1c8a-dd0e-4800-8b4f-8ae0dd86dbd2/openstack-network-exporter/0.log" Oct 09 08:50:44 crc kubenswrapper[4715]: I1009 08:50:44.310967 4715 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-metadata-0_76afd031-2e1d-412c-a21b-08e597e8eb83/nova-metadata-metadata/0.log" Oct 09 08:50:44 crc kubenswrapper[4715]: I1009 08:50:44.428210 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gckmt_2db0914e-e011-4b76-a07d-57ce73faceaa/ovsdb-server-init/0.log" Oct 09 08:50:44 crc kubenswrapper[4715]: I1009 08:50:44.746564 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gckmt_2db0914e-e011-4b76-a07d-57ce73faceaa/ovsdb-server-init/0.log" Oct 09 08:50:44 crc kubenswrapper[4715]: I1009 08:50:44.775896 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gckmt_2db0914e-e011-4b76-a07d-57ce73faceaa/ovs-vswitchd/0.log" Oct 09 08:50:44 crc kubenswrapper[4715]: I1009 08:50:44.828436 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gckmt_2db0914e-e011-4b76-a07d-57ce73faceaa/ovsdb-server/0.log" Oct 09 08:50:44 crc kubenswrapper[4715]: I1009 08:50:44.975406 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-xfr2w_2a1f06ac-c4c8-4884-bb1a-360fbaf03adf/ovn-controller/0.log" Oct 09 08:50:45 crc kubenswrapper[4715]: I1009 08:50:45.140039 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-czrvv_a6388a30-80c6-412d-9c3d-9b555b215d76/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 08:50:45 crc kubenswrapper[4715]: I1009 08:50:45.196536 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_2d08550c-3489-4838-9aaf-49ecbcac005b/openstack-network-exporter/0.log" Oct 09 08:50:45 crc kubenswrapper[4715]: I1009 08:50:45.251961 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_2d08550c-3489-4838-9aaf-49ecbcac005b/ovn-northd/0.log" Oct 09 08:50:45 crc kubenswrapper[4715]: I1009 08:50:45.362268 4715 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ec6c7ac4-3535-4d45-9be1-8f6b4de9670f/openstack-network-exporter/0.log" Oct 09 08:50:45 crc kubenswrapper[4715]: I1009 08:50:45.438433 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ec6c7ac4-3535-4d45-9be1-8f6b4de9670f/ovsdbserver-nb/0.log" Oct 09 08:50:45 crc kubenswrapper[4715]: I1009 08:50:45.632532 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_dc229de4-d184-450e-805b-a8b616c8a60b/ovsdbserver-sb/0.log" Oct 09 08:50:45 crc kubenswrapper[4715]: I1009 08:50:45.640983 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_dc229de4-d184-450e-805b-a8b616c8a60b/openstack-network-exporter/0.log" Oct 09 08:50:45 crc kubenswrapper[4715]: I1009 08:50:45.876305 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6985f6958d-wwgg5_45254b90-e09e-425e-b7c2-123813b82b37/placement-api/0.log" Oct 09 08:50:45 crc kubenswrapper[4715]: I1009 08:50:45.918379 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a2bc3ad0-34e4-4ccc-9abd-7e998940780c/setup-container/0.log" Oct 09 08:50:45 crc kubenswrapper[4715]: I1009 08:50:45.963134 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6985f6958d-wwgg5_45254b90-e09e-425e-b7c2-123813b82b37/placement-log/0.log" Oct 09 08:50:46 crc kubenswrapper[4715]: I1009 08:50:46.166669 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a2bc3ad0-34e4-4ccc-9abd-7e998940780c/rabbitmq/0.log" Oct 09 08:50:46 crc kubenswrapper[4715]: I1009 08:50:46.201888 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a2bc3ad0-34e4-4ccc-9abd-7e998940780c/setup-container/0.log" Oct 09 08:50:46 crc kubenswrapper[4715]: I1009 08:50:46.245294 4715 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-server-0_c87001f3-a098-449a-b8ec-cccb2a313d5f/setup-container/0.log" Oct 09 08:50:46 crc kubenswrapper[4715]: I1009 08:50:46.467395 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c87001f3-a098-449a-b8ec-cccb2a313d5f/setup-container/0.log" Oct 09 08:50:46 crc kubenswrapper[4715]: I1009 08:50:46.487932 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-2r9hc_f0ea0eb1-5091-4178-8ae8-a39cf494915d/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 08:50:46 crc kubenswrapper[4715]: I1009 08:50:46.599003 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c87001f3-a098-449a-b8ec-cccb2a313d5f/rabbitmq/0.log" Oct 09 08:50:46 crc kubenswrapper[4715]: I1009 08:50:46.711834 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-bzxls_aa76af72-003a-4682-bbee-0ef470ecef9a/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 08:50:46 crc kubenswrapper[4715]: I1009 08:50:46.887329 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-t6ccc_1058ab3e-4f39-48ac-9f7e-81e40f041264/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 08:50:46 crc kubenswrapper[4715]: I1009 08:50:46.940911 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-qhl65_806359bf-f132-4db6-9795-4e180be1895a/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 08:50:47 crc kubenswrapper[4715]: I1009 08:50:47.152134 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-dpkqg_0986c120-0684-4916-b753-677c2d3e6798/ssh-known-hosts-edpm-deployment/0.log" Oct 09 08:50:47 crc kubenswrapper[4715]: I1009 08:50:47.296513 4715 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-proxy-9d9d4b647-cfdjf_6f4b9cb5-f128-44e3-9142-2d39d79cb0b8/proxy-server/0.log" Oct 09 08:50:47 crc kubenswrapper[4715]: I1009 08:50:47.382959 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-9d9d4b647-cfdjf_6f4b9cb5-f128-44e3-9142-2d39d79cb0b8/proxy-httpd/0.log" Oct 09 08:50:47 crc kubenswrapper[4715]: I1009 08:50:47.419024 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-gjkhs_377b8455-bb97-4be8-977a-191578be267c/swift-ring-rebalance/0.log" Oct 09 08:50:47 crc kubenswrapper[4715]: I1009 08:50:47.657443 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d6a5f2b-d77d-41c9-8b7d-e2e62c157577/account-auditor/0.log" Oct 09 08:50:47 crc kubenswrapper[4715]: I1009 08:50:47.712770 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d6a5f2b-d77d-41c9-8b7d-e2e62c157577/account-replicator/0.log" Oct 09 08:50:47 crc kubenswrapper[4715]: I1009 08:50:47.719142 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d6a5f2b-d77d-41c9-8b7d-e2e62c157577/account-reaper/0.log" Oct 09 08:50:47 crc kubenswrapper[4715]: I1009 08:50:47.769452 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d6a5f2b-d77d-41c9-8b7d-e2e62c157577/account-server/0.log" Oct 09 08:50:47 crc kubenswrapper[4715]: I1009 08:50:47.887310 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d6a5f2b-d77d-41c9-8b7d-e2e62c157577/container-auditor/0.log" Oct 09 08:50:47 crc kubenswrapper[4715]: I1009 08:50:47.935054 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d6a5f2b-d77d-41c9-8b7d-e2e62c157577/container-server/0.log" Oct 09 08:50:47 crc kubenswrapper[4715]: I1009 08:50:47.968283 4715 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_4d6a5f2b-d77d-41c9-8b7d-e2e62c157577/container-updater/0.log" Oct 09 08:50:47 crc kubenswrapper[4715]: I1009 08:50:47.990789 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d6a5f2b-d77d-41c9-8b7d-e2e62c157577/container-replicator/0.log" Oct 09 08:50:48 crc kubenswrapper[4715]: I1009 08:50:48.086018 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d6a5f2b-d77d-41c9-8b7d-e2e62c157577/object-auditor/0.log" Oct 09 08:50:48 crc kubenswrapper[4715]: I1009 08:50:48.170745 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d6a5f2b-d77d-41c9-8b7d-e2e62c157577/object-expirer/0.log" Oct 09 08:50:48 crc kubenswrapper[4715]: I1009 08:50:48.213970 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d6a5f2b-d77d-41c9-8b7d-e2e62c157577/object-replicator/0.log" Oct 09 08:50:48 crc kubenswrapper[4715]: I1009 08:50:48.237469 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d6a5f2b-d77d-41c9-8b7d-e2e62c157577/object-server/0.log" Oct 09 08:50:48 crc kubenswrapper[4715]: I1009 08:50:48.359482 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d6a5f2b-d77d-41c9-8b7d-e2e62c157577/object-updater/0.log" Oct 09 08:50:48 crc kubenswrapper[4715]: I1009 08:50:48.412458 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d6a5f2b-d77d-41c9-8b7d-e2e62c157577/rsync/0.log" Oct 09 08:50:48 crc kubenswrapper[4715]: I1009 08:50:48.505253 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d6a5f2b-d77d-41c9-8b7d-e2e62c157577/swift-recon-cron/0.log" Oct 09 08:50:48 crc kubenswrapper[4715]: I1009 08:50:48.660296 4715 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-x5wt4_943912d2-23f1-4cc8-92ab-42288a195416/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 08:50:48 crc kubenswrapper[4715]: I1009 08:50:48.696777 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_c272fa72-6434-4af1-8e2b-433cc9f619ea/tempest-tests-tempest-tests-runner/0.log" Oct 09 08:50:48 crc kubenswrapper[4715]: I1009 08:50:48.908010 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_e6af6cd1-b4d1-4521-a954-460e613c51e1/test-operator-logs-container/0.log" Oct 09 08:50:48 crc kubenswrapper[4715]: I1009 08:50:48.967135 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-ddnss_3487ef30-efc9-46c5-8ed3-8146c9498ff0/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 09 08:50:52 crc kubenswrapper[4715]: I1009 08:50:52.136601 4715 scope.go:117] "RemoveContainer" containerID="3cecc05ede7162882b376373801e9d44deb7deec541f11babf1e9925da0c5cf7" Oct 09 08:50:52 crc kubenswrapper[4715]: E1009 08:50:52.138087 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:51:02 crc kubenswrapper[4715]: I1009 08:51:02.931721 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_5d9f0338-4450-49ca-ad02-67cdda5d323f/memcached/0.log" Oct 09 08:51:04 crc kubenswrapper[4715]: I1009 08:51:04.137390 4715 scope.go:117] "RemoveContainer" 
containerID="3cecc05ede7162882b376373801e9d44deb7deec541f11babf1e9925da0c5cf7" Oct 09 08:51:04 crc kubenswrapper[4715]: E1009 08:51:04.137673 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:51:12 crc kubenswrapper[4715]: I1009 08:51:12.749959 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl_3f089e5d-d22d-45bd-8525-ff337f7db321/util/0.log" Oct 09 08:51:12 crc kubenswrapper[4715]: I1009 08:51:12.903008 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl_3f089e5d-d22d-45bd-8525-ff337f7db321/util/0.log" Oct 09 08:51:12 crc kubenswrapper[4715]: I1009 08:51:12.903751 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl_3f089e5d-d22d-45bd-8525-ff337f7db321/pull/0.log" Oct 09 08:51:12 crc kubenswrapper[4715]: I1009 08:51:12.932642 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl_3f089e5d-d22d-45bd-8525-ff337f7db321/pull/0.log" Oct 09 08:51:13 crc kubenswrapper[4715]: I1009 08:51:13.059332 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl_3f089e5d-d22d-45bd-8525-ff337f7db321/extract/0.log" Oct 09 08:51:13 crc kubenswrapper[4715]: I1009 08:51:13.075631 4715 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl_3f089e5d-d22d-45bd-8525-ff337f7db321/util/0.log" Oct 09 08:51:13 crc kubenswrapper[4715]: I1009 08:51:13.133250 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4b02109cdc506d254484be3402cf52268c9d217c8c9f01502ef56cdb575tsrl_3f089e5d-d22d-45bd-8525-ff337f7db321/pull/0.log" Oct 09 08:51:13 crc kubenswrapper[4715]: I1009 08:51:13.222340 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f84fcdbb-rsthg_e4603d13-cf9d-4d8d-82db-3b182aa42e74/kube-rbac-proxy/0.log" Oct 09 08:51:13 crc kubenswrapper[4715]: I1009 08:51:13.320010 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f84fcdbb-rsthg_e4603d13-cf9d-4d8d-82db-3b182aa42e74/manager/0.log" Oct 09 08:51:13 crc kubenswrapper[4715]: I1009 08:51:13.340945 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-59cdc64769-gqdw4_68110204-494d-4a10-b25d-0996c9dd1c6f/kube-rbac-proxy/0.log" Oct 09 08:51:13 crc kubenswrapper[4715]: I1009 08:51:13.456992 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-59cdc64769-gqdw4_68110204-494d-4a10-b25d-0996c9dd1c6f/manager/0.log" Oct 09 08:51:13 crc kubenswrapper[4715]: I1009 08:51:13.522852 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-687df44cdb-v8zt5_675a8b37-dcfc-414e-9218-7741ce9ec2d5/kube-rbac-proxy/0.log" Oct 09 08:51:13 crc kubenswrapper[4715]: I1009 08:51:13.576911 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-687df44cdb-v8zt5_675a8b37-dcfc-414e-9218-7741ce9ec2d5/manager/0.log" Oct 09 08:51:13 crc kubenswrapper[4715]: I1009 
08:51:13.703123 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7bb46cd7d-cfkg2_e11fc796-233e-4c17-b953-1c6211f0c679/kube-rbac-proxy/0.log" Oct 09 08:51:13 crc kubenswrapper[4715]: I1009 08:51:13.793683 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7bb46cd7d-cfkg2_e11fc796-233e-4c17-b953-1c6211f0c679/manager/0.log" Oct 09 08:51:13 crc kubenswrapper[4715]: I1009 08:51:13.858838 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-6d9967f8dd-qjwrk_32b6325f-e041-492d-a113-638dcef15310/kube-rbac-proxy/0.log" Oct 09 08:51:13 crc kubenswrapper[4715]: I1009 08:51:13.884572 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-6d9967f8dd-qjwrk_32b6325f-e041-492d-a113-638dcef15310/manager/0.log" Oct 09 08:51:14 crc kubenswrapper[4715]: I1009 08:51:14.018280 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d74794d9b-jps2w_2730bf5c-42b9-4739-a2bc-6250bfcb997a/kube-rbac-proxy/0.log" Oct 09 08:51:14 crc kubenswrapper[4715]: I1009 08:51:14.058625 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d74794d9b-jps2w_2730bf5c-42b9-4739-a2bc-6250bfcb997a/manager/0.log" Oct 09 08:51:14 crc kubenswrapper[4715]: I1009 08:51:14.175546 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-585fc5b659-ckx8h_b39f3e52-f97a-4bf4-934d-88267bddae91/kube-rbac-proxy/0.log" Oct 09 08:51:14 crc kubenswrapper[4715]: I1009 08:51:14.273111 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-74cb5cbc49-pxmc4_c990a4aa-4a8e-499b-bf58-99c469af523e/kube-rbac-proxy/0.log" Oct 
09 08:51:14 crc kubenswrapper[4715]: I1009 08:51:14.371317 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-585fc5b659-ckx8h_b39f3e52-f97a-4bf4-934d-88267bddae91/manager/0.log" Oct 09 08:51:14 crc kubenswrapper[4715]: I1009 08:51:14.404042 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-74cb5cbc49-pxmc4_c990a4aa-4a8e-499b-bf58-99c469af523e/manager/0.log" Oct 09 08:51:14 crc kubenswrapper[4715]: I1009 08:51:14.497132 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-ddb98f99b-7zmwl_9657b932-fe63-4417-8463-8af21e9c9790/kube-rbac-proxy/0.log" Oct 09 08:51:14 crc kubenswrapper[4715]: I1009 08:51:14.598007 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-ddb98f99b-7zmwl_9657b932-fe63-4417-8463-8af21e9c9790/manager/0.log" Oct 09 08:51:14 crc kubenswrapper[4715]: I1009 08:51:14.650225 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-59578bc799-6zczp_9cae911a-5b69-4cf4-aa26-4adb4457eec4/kube-rbac-proxy/0.log" Oct 09 08:51:14 crc kubenswrapper[4715]: I1009 08:51:14.723531 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-59578bc799-6zczp_9cae911a-5b69-4cf4-aa26-4adb4457eec4/manager/0.log" Oct 09 08:51:14 crc kubenswrapper[4715]: I1009 08:51:14.775026 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5777b4f897-ggwkb_ad178d55-a5d5-40b5-9364-0a9af0718f46/kube-rbac-proxy/0.log" Oct 09 08:51:14 crc kubenswrapper[4715]: I1009 08:51:14.835506 4715 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5777b4f897-ggwkb_ad178d55-a5d5-40b5-9364-0a9af0718f46/manager/0.log" Oct 09 08:51:14 crc kubenswrapper[4715]: I1009 08:51:14.975262 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-797d478b46-kqhg2_ecf88dec-957f-4221-8ded-d779392c2793/kube-rbac-proxy/0.log" Oct 09 08:51:15 crc kubenswrapper[4715]: I1009 08:51:15.009823 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-797d478b46-kqhg2_ecf88dec-957f-4221-8ded-d779392c2793/manager/0.log" Oct 09 08:51:15 crc kubenswrapper[4715]: I1009 08:51:15.122162 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-57bb74c7bf-6xd27_6d11d372-6981-432f-a2b0-364cb9b24f63/kube-rbac-proxy/0.log" Oct 09 08:51:15 crc kubenswrapper[4715]: I1009 08:51:15.232034 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-57bb74c7bf-6xd27_6d11d372-6981-432f-a2b0-364cb9b24f63/manager/0.log" Oct 09 08:51:15 crc kubenswrapper[4715]: I1009 08:51:15.249630 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6d7c7ddf95-xmq4r_619ad411-d5d7-431b-9bb6-6cf084134aaf/kube-rbac-proxy/0.log" Oct 09 08:51:15 crc kubenswrapper[4715]: I1009 08:51:15.316232 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6d7c7ddf95-xmq4r_619ad411-d5d7-431b-9bb6-6cf084134aaf/manager/0.log" Oct 09 08:51:15 crc kubenswrapper[4715]: I1009 08:51:15.433785 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cc7fb757dl2tll_e9335457-1cad-453a-9539-d73dc2c77021/kube-rbac-proxy/0.log" Oct 09 08:51:15 crc kubenswrapper[4715]: I1009 
08:51:15.439615 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cc7fb757dl2tll_e9335457-1cad-453a-9539-d73dc2c77021/manager/0.log" Oct 09 08:51:15 crc kubenswrapper[4715]: I1009 08:51:15.610979 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5b6844c9b7-z7vpp_fd59cd6f-8b57-4377-80ae-a1873494f103/kube-rbac-proxy/0.log" Oct 09 08:51:15 crc kubenswrapper[4715]: I1009 08:51:15.718690 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-75c7986888-vmsr4_2f70ba87-a4dd-4a97-a005-f63fec497e9f/kube-rbac-proxy/0.log" Oct 09 08:51:15 crc kubenswrapper[4715]: I1009 08:51:15.969349 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-fgdkg_9809a400-b7e4-4700-bfb6-3500d2f61c96/registry-server/0.log" Oct 09 08:51:15 crc kubenswrapper[4715]: I1009 08:51:15.999959 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-75c7986888-vmsr4_2f70ba87-a4dd-4a97-a005-f63fec497e9f/operator/0.log" Oct 09 08:51:16 crc kubenswrapper[4715]: I1009 08:51:16.136858 4715 scope.go:117] "RemoveContainer" containerID="3cecc05ede7162882b376373801e9d44deb7deec541f11babf1e9925da0c5cf7" Oct 09 08:51:16 crc kubenswrapper[4715]: E1009 08:51:16.137068 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:51:16 crc kubenswrapper[4715]: I1009 08:51:16.181798 4715 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6f96f8c84-5plxn_6d1ea812-36f3-4478-9b78-aed194390313/kube-rbac-proxy/0.log" Oct 09 08:51:16 crc kubenswrapper[4715]: I1009 08:51:16.278974 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6f96f8c84-5plxn_6d1ea812-36f3-4478-9b78-aed194390313/manager/0.log" Oct 09 08:51:16 crc kubenswrapper[4715]: I1009 08:51:16.433038 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-664664cb68-xwt7q_fc37f3a9-94a5-4957-939a-a0b0a7a567bb/kube-rbac-proxy/0.log" Oct 09 08:51:16 crc kubenswrapper[4715]: I1009 08:51:16.499163 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-664664cb68-xwt7q_fc37f3a9-94a5-4957-939a-a0b0a7a567bb/manager/0.log" Oct 09 08:51:16 crc kubenswrapper[4715]: I1009 08:51:16.630577 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-vqb9s_48619024-da5f-4b28-8724-3707961de8ce/operator/0.log" Oct 09 08:51:16 crc kubenswrapper[4715]: I1009 08:51:16.722483 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f4d5dfdc6-rcm5h_4b8010cb-d8af-4b7c-9530-fe143bbf1ddb/kube-rbac-proxy/0.log" Oct 09 08:51:16 crc kubenswrapper[4715]: I1009 08:51:16.746487 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f4d5dfdc6-rcm5h_4b8010cb-d8af-4b7c-9530-fe143bbf1ddb/manager/0.log" Oct 09 08:51:16 crc kubenswrapper[4715]: I1009 08:51:16.908432 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6648b66598-cvsqm_9b402da9-cbb2-473b-beee-7064e06acb73/kube-rbac-proxy/0.log" Oct 09 08:51:17 crc kubenswrapper[4715]: I1009 
08:51:17.028368 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6648b66598-cvsqm_9b402da9-cbb2-473b-beee-7064e06acb73/manager/0.log" Oct 09 08:51:17 crc kubenswrapper[4715]: I1009 08:51:17.036740 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5b6844c9b7-z7vpp_fd59cd6f-8b57-4377-80ae-a1873494f103/manager/0.log" Oct 09 08:51:17 crc kubenswrapper[4715]: I1009 08:51:17.097888 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-ffcdd6c94-rt6nt_ba259ec1-9157-4cd9-8c21-11915efe5dde/kube-rbac-proxy/0.log" Oct 09 08:51:17 crc kubenswrapper[4715]: I1009 08:51:17.216177 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-ffcdd6c94-rt6nt_ba259ec1-9157-4cd9-8c21-11915efe5dde/manager/0.log" Oct 09 08:51:17 crc kubenswrapper[4715]: I1009 08:51:17.358540 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-646675d848-bgwc8_666e7073-bf77-46f3-99da-5ad2013835a9/kube-rbac-proxy/0.log" Oct 09 08:51:17 crc kubenswrapper[4715]: I1009 08:51:17.359290 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-646675d848-bgwc8_666e7073-bf77-46f3-99da-5ad2013835a9/manager/0.log" Oct 09 08:51:27 crc kubenswrapper[4715]: I1009 08:51:27.136926 4715 scope.go:117] "RemoveContainer" containerID="3cecc05ede7162882b376373801e9d44deb7deec541f11babf1e9925da0c5cf7" Oct 09 08:51:27 crc kubenswrapper[4715]: E1009 08:51:27.137730 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:51:32 crc kubenswrapper[4715]: I1009 08:51:32.358222 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-prhvm_ad01aea7-211a-4ff5-b15b-fb696917dc52/control-plane-machine-set-operator/0.log" Oct 09 08:51:32 crc kubenswrapper[4715]: I1009 08:51:32.497037 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-5tfh2_93259cc2-6847-41dc-a61d-83e7b9e67f3a/kube-rbac-proxy/0.log" Oct 09 08:51:32 crc kubenswrapper[4715]: I1009 08:51:32.523644 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-5tfh2_93259cc2-6847-41dc-a61d-83e7b9e67f3a/machine-api-operator/0.log" Oct 09 08:51:42 crc kubenswrapper[4715]: I1009 08:51:42.137208 4715 scope.go:117] "RemoveContainer" containerID="3cecc05ede7162882b376373801e9d44deb7deec541f11babf1e9925da0c5cf7" Oct 09 08:51:42 crc kubenswrapper[4715]: E1009 08:51:42.138071 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:51:43 crc kubenswrapper[4715]: I1009 08:51:43.915236 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-vxwbp_b46c8c22-ef65-4617-90a9-bcef0954a010/cert-manager-controller/0.log" Oct 09 08:51:44 crc kubenswrapper[4715]: I1009 08:51:44.169706 4715 log.go:25] "Finished 
parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-4sgtl_9c7c75d0-8444-4edb-b653-5bc079b11d51/cert-manager-cainjector/0.log" Oct 09 08:51:44 crc kubenswrapper[4715]: I1009 08:51:44.305308 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-27vx6_acf4388b-06e6-4576-ae4d-b67ccda0c1ac/cert-manager-webhook/0.log" Oct 09 08:51:53 crc kubenswrapper[4715]: I1009 08:51:53.137521 4715 scope.go:117] "RemoveContainer" containerID="3cecc05ede7162882b376373801e9d44deb7deec541f11babf1e9925da0c5cf7" Oct 09 08:51:53 crc kubenswrapper[4715]: E1009 08:51:53.138269 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:51:55 crc kubenswrapper[4715]: I1009 08:51:55.670769 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-4pcfg_02f7ed0f-b6c8-412c-b89a-a0a42d82a72d/nmstate-console-plugin/0.log" Oct 09 08:51:55 crc kubenswrapper[4715]: I1009 08:51:55.842473 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-x6h8j_4d966a8c-9962-4542-9e72-fbd4959508e6/nmstate-handler/0.log" Oct 09 08:51:55 crc kubenswrapper[4715]: I1009 08:51:55.872327 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-cvrwm_264aef9d-e55d-41e9-b2e4-055db900d371/kube-rbac-proxy/0.log" Oct 09 08:51:55 crc kubenswrapper[4715]: I1009 08:51:55.916562 4715 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-cvrwm_264aef9d-e55d-41e9-b2e4-055db900d371/nmstate-metrics/0.log" Oct 09 08:51:56 crc kubenswrapper[4715]: I1009 08:51:56.051173 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-g94hr_c66a28bc-b6c9-426c-bc4d-b8748836b175/nmstate-operator/0.log" Oct 09 08:51:56 crc kubenswrapper[4715]: I1009 08:51:56.093500 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-bk7lw_f8f2e4a6-72f9-4eb0-9654-e07d0e3b3bf6/nmstate-webhook/0.log" Oct 09 08:52:04 crc kubenswrapper[4715]: I1009 08:52:04.136961 4715 scope.go:117] "RemoveContainer" containerID="3cecc05ede7162882b376373801e9d44deb7deec541f11babf1e9925da0c5cf7" Oct 09 08:52:04 crc kubenswrapper[4715]: E1009 08:52:04.137766 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:52:08 crc kubenswrapper[4715]: I1009 08:52:08.908651 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-plvwr_dc22841d-c047-4e47-a235-9025efe5d30e/kube-rbac-proxy/0.log" Oct 09 08:52:09 crc kubenswrapper[4715]: I1009 08:52:09.036183 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-plvwr_dc22841d-c047-4e47-a235-9025efe5d30e/controller/0.log" Oct 09 08:52:09 crc kubenswrapper[4715]: I1009 08:52:09.132079 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vznzc_4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7/cp-frr-files/0.log" Oct 09 08:52:09 crc kubenswrapper[4715]: I1009 
08:52:09.256856 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vznzc_4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7/cp-reloader/0.log" Oct 09 08:52:09 crc kubenswrapper[4715]: I1009 08:52:09.264958 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vznzc_4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7/cp-metrics/0.log" Oct 09 08:52:09 crc kubenswrapper[4715]: I1009 08:52:09.299667 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vznzc_4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7/cp-frr-files/0.log" Oct 09 08:52:09 crc kubenswrapper[4715]: I1009 08:52:09.309073 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vznzc_4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7/cp-reloader/0.log" Oct 09 08:52:09 crc kubenswrapper[4715]: I1009 08:52:09.497984 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vznzc_4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7/cp-reloader/0.log" Oct 09 08:52:09 crc kubenswrapper[4715]: I1009 08:52:09.549972 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vznzc_4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7/cp-frr-files/0.log" Oct 09 08:52:09 crc kubenswrapper[4715]: I1009 08:52:09.550016 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vznzc_4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7/cp-metrics/0.log" Oct 09 08:52:09 crc kubenswrapper[4715]: I1009 08:52:09.646251 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vznzc_4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7/cp-metrics/0.log" Oct 09 08:52:09 crc kubenswrapper[4715]: I1009 08:52:09.863255 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vznzc_4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7/cp-metrics/0.log" Oct 09 08:52:09 crc kubenswrapper[4715]: I1009 08:52:09.907193 4715 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-vznzc_4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7/cp-frr-files/0.log" Oct 09 08:52:09 crc kubenswrapper[4715]: I1009 08:52:09.941011 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vznzc_4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7/cp-reloader/0.log" Oct 09 08:52:09 crc kubenswrapper[4715]: I1009 08:52:09.945811 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vznzc_4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7/controller/0.log" Oct 09 08:52:10 crc kubenswrapper[4715]: I1009 08:52:10.105041 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vznzc_4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7/frr-metrics/0.log" Oct 09 08:52:10 crc kubenswrapper[4715]: I1009 08:52:10.139922 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vznzc_4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7/kube-rbac-proxy-frr/0.log" Oct 09 08:52:10 crc kubenswrapper[4715]: I1009 08:52:10.171064 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vznzc_4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7/kube-rbac-proxy/0.log" Oct 09 08:52:10 crc kubenswrapper[4715]: I1009 08:52:10.710585 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vznzc_4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7/reloader/0.log" Oct 09 08:52:10 crc kubenswrapper[4715]: I1009 08:52:10.758889 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-fs56x_01a59281-8feb-446a-b861-fba9e4e8df7d/frr-k8s-webhook-server/0.log" Oct 09 08:52:10 crc kubenswrapper[4715]: I1009 08:52:10.980521 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-798678874c-cjtmr_6401cdb7-5b3a-4ae5-8944-fc923060aa09/manager/0.log" Oct 09 08:52:11 crc kubenswrapper[4715]: I1009 08:52:11.201706 4715 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-webhook-server-865475978c-t8h2f_2b80db73-440c-49ef-8b24-187a67aab5db/webhook-server/0.log" Oct 09 08:52:11 crc kubenswrapper[4715]: I1009 08:52:11.298234 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-js45j_a5b391f0-e6c7-412a-8333-530a9ad5bab3/kube-rbac-proxy/0.log" Oct 09 08:52:11 crc kubenswrapper[4715]: I1009 08:52:11.320190 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vznzc_4e5fb36f-654e-4ab2-a2f5-0f293bd9c0d7/frr/0.log" Oct 09 08:52:11 crc kubenswrapper[4715]: I1009 08:52:11.659966 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-js45j_a5b391f0-e6c7-412a-8333-530a9ad5bab3/speaker/0.log" Oct 09 08:52:18 crc kubenswrapper[4715]: I1009 08:52:18.137142 4715 scope.go:117] "RemoveContainer" containerID="3cecc05ede7162882b376373801e9d44deb7deec541f11babf1e9925da0c5cf7" Oct 09 08:52:18 crc kubenswrapper[4715]: E1009 08:52:18.138093 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:52:23 crc kubenswrapper[4715]: I1009 08:52:23.596029 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr_f0cfb14d-8aa5-4841-9f91-dd632d372e18/util/0.log" Oct 09 08:52:23 crc kubenswrapper[4715]: I1009 08:52:23.735760 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr_f0cfb14d-8aa5-4841-9f91-dd632d372e18/util/0.log" Oct 09 08:52:23 crc 
kubenswrapper[4715]: I1009 08:52:23.760092 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr_f0cfb14d-8aa5-4841-9f91-dd632d372e18/pull/0.log" Oct 09 08:52:23 crc kubenswrapper[4715]: I1009 08:52:23.769605 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr_f0cfb14d-8aa5-4841-9f91-dd632d372e18/pull/0.log" Oct 09 08:52:23 crc kubenswrapper[4715]: I1009 08:52:23.938206 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr_f0cfb14d-8aa5-4841-9f91-dd632d372e18/extract/0.log" Oct 09 08:52:23 crc kubenswrapper[4715]: I1009 08:52:23.972823 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr_f0cfb14d-8aa5-4841-9f91-dd632d372e18/pull/0.log" Oct 09 08:52:23 crc kubenswrapper[4715]: I1009 08:52:23.982732 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bkcr_f0cfb14d-8aa5-4841-9f91-dd632d372e18/util/0.log" Oct 09 08:52:24 crc kubenswrapper[4715]: I1009 08:52:24.107820 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-chtln_af6becdc-9c8d-49f2-ae9c-a58edc359d1f/extract-utilities/0.log" Oct 09 08:52:24 crc kubenswrapper[4715]: I1009 08:52:24.262686 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-chtln_af6becdc-9c8d-49f2-ae9c-a58edc359d1f/extract-content/0.log" Oct 09 08:52:24 crc kubenswrapper[4715]: I1009 08:52:24.274313 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-chtln_af6becdc-9c8d-49f2-ae9c-a58edc359d1f/extract-content/0.log" 
Oct 09 08:52:24 crc kubenswrapper[4715]: I1009 08:52:24.278792 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-chtln_af6becdc-9c8d-49f2-ae9c-a58edc359d1f/extract-utilities/0.log" Oct 09 08:52:24 crc kubenswrapper[4715]: I1009 08:52:24.454819 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-chtln_af6becdc-9c8d-49f2-ae9c-a58edc359d1f/extract-utilities/0.log" Oct 09 08:52:24 crc kubenswrapper[4715]: I1009 08:52:24.490049 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-chtln_af6becdc-9c8d-49f2-ae9c-a58edc359d1f/extract-content/0.log" Oct 09 08:52:24 crc kubenswrapper[4715]: I1009 08:52:24.701606 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6j97m_1c8e4f51-54cf-4545-8a89-8ccaf52c55fc/extract-utilities/0.log" Oct 09 08:52:24 crc kubenswrapper[4715]: I1009 08:52:24.710065 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-chtln_af6becdc-9c8d-49f2-ae9c-a58edc359d1f/registry-server/0.log" Oct 09 08:52:24 crc kubenswrapper[4715]: I1009 08:52:24.823825 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6j97m_1c8e4f51-54cf-4545-8a89-8ccaf52c55fc/extract-utilities/0.log" Oct 09 08:52:24 crc kubenswrapper[4715]: I1009 08:52:24.870380 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6j97m_1c8e4f51-54cf-4545-8a89-8ccaf52c55fc/extract-content/0.log" Oct 09 08:52:24 crc kubenswrapper[4715]: I1009 08:52:24.875461 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6j97m_1c8e4f51-54cf-4545-8a89-8ccaf52c55fc/extract-content/0.log" Oct 09 08:52:25 crc kubenswrapper[4715]: I1009 08:52:25.028394 4715 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-6j97m_1c8e4f51-54cf-4545-8a89-8ccaf52c55fc/extract-utilities/0.log" Oct 09 08:52:25 crc kubenswrapper[4715]: I1009 08:52:25.120440 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6j97m_1c8e4f51-54cf-4545-8a89-8ccaf52c55fc/extract-content/0.log" Oct 09 08:52:25 crc kubenswrapper[4715]: I1009 08:52:25.269459 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf_44e9a431-3bec-4439-9df7-a7f12d65dad2/util/0.log" Oct 09 08:52:25 crc kubenswrapper[4715]: I1009 08:52:25.474282 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf_44e9a431-3bec-4439-9df7-a7f12d65dad2/util/0.log" Oct 09 08:52:25 crc kubenswrapper[4715]: I1009 08:52:25.486056 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf_44e9a431-3bec-4439-9df7-a7f12d65dad2/pull/0.log" Oct 09 08:52:25 crc kubenswrapper[4715]: I1009 08:52:25.524716 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf_44e9a431-3bec-4439-9df7-a7f12d65dad2/pull/0.log" Oct 09 08:52:25 crc kubenswrapper[4715]: I1009 08:52:25.682129 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf_44e9a431-3bec-4439-9df7-a7f12d65dad2/util/0.log" Oct 09 08:52:25 crc kubenswrapper[4715]: I1009 08:52:25.716169 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6j97m_1c8e4f51-54cf-4545-8a89-8ccaf52c55fc/registry-server/0.log" Oct 09 08:52:25 crc kubenswrapper[4715]: I1009 08:52:25.732939 4715 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf_44e9a431-3bec-4439-9df7-a7f12d65dad2/extract/0.log" Oct 09 08:52:25 crc kubenswrapper[4715]: I1009 08:52:25.733892 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cgmkqf_44e9a431-3bec-4439-9df7-a7f12d65dad2/pull/0.log" Oct 09 08:52:25 crc kubenswrapper[4715]: I1009 08:52:25.964025 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9vpkg_26a05949-7b09-4412-a6ae-004009c0c4bf/extract-utilities/0.log" Oct 09 08:52:25 crc kubenswrapper[4715]: I1009 08:52:25.976371 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-rpvsg_06c9829f-1dca-4ef6-a34f-a5380dfd729c/marketplace-operator/0.log" Oct 09 08:52:26 crc kubenswrapper[4715]: I1009 08:52:26.124380 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9vpkg_26a05949-7b09-4412-a6ae-004009c0c4bf/extract-utilities/0.log" Oct 09 08:52:26 crc kubenswrapper[4715]: I1009 08:52:26.168700 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9vpkg_26a05949-7b09-4412-a6ae-004009c0c4bf/extract-content/0.log" Oct 09 08:52:26 crc kubenswrapper[4715]: I1009 08:52:26.168913 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9vpkg_26a05949-7b09-4412-a6ae-004009c0c4bf/extract-content/0.log" Oct 09 08:52:26 crc kubenswrapper[4715]: I1009 08:52:26.855703 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9vpkg_26a05949-7b09-4412-a6ae-004009c0c4bf/extract-content/0.log" Oct 09 08:52:26 crc kubenswrapper[4715]: I1009 08:52:26.862534 4715 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-9vpkg_26a05949-7b09-4412-a6ae-004009c0c4bf/extract-utilities/0.log" Oct 09 08:52:27 crc kubenswrapper[4715]: I1009 08:52:27.025861 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9vpkg_26a05949-7b09-4412-a6ae-004009c0c4bf/registry-server/0.log" Oct 09 08:52:27 crc kubenswrapper[4715]: I1009 08:52:27.081678 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g5kq2_efd04f5e-635d-422b-ae2a-38096e0ecc44/extract-utilities/0.log" Oct 09 08:52:27 crc kubenswrapper[4715]: I1009 08:52:27.239094 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g5kq2_efd04f5e-635d-422b-ae2a-38096e0ecc44/extract-content/0.log" Oct 09 08:52:27 crc kubenswrapper[4715]: I1009 08:52:27.257144 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g5kq2_efd04f5e-635d-422b-ae2a-38096e0ecc44/extract-content/0.log" Oct 09 08:52:27 crc kubenswrapper[4715]: I1009 08:52:27.258968 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g5kq2_efd04f5e-635d-422b-ae2a-38096e0ecc44/extract-utilities/0.log" Oct 09 08:52:27 crc kubenswrapper[4715]: I1009 08:52:27.427471 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g5kq2_efd04f5e-635d-422b-ae2a-38096e0ecc44/extract-content/0.log" Oct 09 08:52:27 crc kubenswrapper[4715]: I1009 08:52:27.436661 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g5kq2_efd04f5e-635d-422b-ae2a-38096e0ecc44/extract-utilities/0.log" Oct 09 08:52:27 crc kubenswrapper[4715]: I1009 08:52:27.963622 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-g5kq2_efd04f5e-635d-422b-ae2a-38096e0ecc44/registry-server/0.log" Oct 09 
08:52:30 crc kubenswrapper[4715]: I1009 08:52:30.145641 4715 scope.go:117] "RemoveContainer" containerID="3cecc05ede7162882b376373801e9d44deb7deec541f11babf1e9925da0c5cf7" Oct 09 08:52:30 crc kubenswrapper[4715]: E1009 08:52:30.146278 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:52:42 crc kubenswrapper[4715]: I1009 08:52:42.137193 4715 scope.go:117] "RemoveContainer" containerID="3cecc05ede7162882b376373801e9d44deb7deec541f11babf1e9925da0c5cf7" Oct 09 08:52:42 crc kubenswrapper[4715]: E1009 08:52:42.137841 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:52:52 crc kubenswrapper[4715]: I1009 08:52:52.614572 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xbgbl"] Oct 09 08:52:52 crc kubenswrapper[4715]: E1009 08:52:52.615347 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2ff4e22-982b-45e8-944d-87f907cfcd35" containerName="container-00" Oct 09 08:52:52 crc kubenswrapper[4715]: I1009 08:52:52.615360 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2ff4e22-982b-45e8-944d-87f907cfcd35" containerName="container-00" Oct 09 08:52:52 crc kubenswrapper[4715]: I1009 08:52:52.615579 4715 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="a2ff4e22-982b-45e8-944d-87f907cfcd35" containerName="container-00" Oct 09 08:52:52 crc kubenswrapper[4715]: I1009 08:52:52.616852 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xbgbl" Oct 09 08:52:52 crc kubenswrapper[4715]: I1009 08:52:52.639957 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xbgbl"] Oct 09 08:52:52 crc kubenswrapper[4715]: I1009 08:52:52.743331 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b923f7f1-04bf-4b00-83cb-7adfcd8dae54-catalog-content\") pod \"certified-operators-xbgbl\" (UID: \"b923f7f1-04bf-4b00-83cb-7adfcd8dae54\") " pod="openshift-marketplace/certified-operators-xbgbl" Oct 09 08:52:52 crc kubenswrapper[4715]: I1009 08:52:52.743392 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnh4w\" (UniqueName: \"kubernetes.io/projected/b923f7f1-04bf-4b00-83cb-7adfcd8dae54-kube-api-access-lnh4w\") pod \"certified-operators-xbgbl\" (UID: \"b923f7f1-04bf-4b00-83cb-7adfcd8dae54\") " pod="openshift-marketplace/certified-operators-xbgbl" Oct 09 08:52:52 crc kubenswrapper[4715]: I1009 08:52:52.743478 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b923f7f1-04bf-4b00-83cb-7adfcd8dae54-utilities\") pod \"certified-operators-xbgbl\" (UID: \"b923f7f1-04bf-4b00-83cb-7adfcd8dae54\") " pod="openshift-marketplace/certified-operators-xbgbl" Oct 09 08:52:52 crc kubenswrapper[4715]: I1009 08:52:52.845627 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b923f7f1-04bf-4b00-83cb-7adfcd8dae54-catalog-content\") pod 
\"certified-operators-xbgbl\" (UID: \"b923f7f1-04bf-4b00-83cb-7adfcd8dae54\") " pod="openshift-marketplace/certified-operators-xbgbl" Oct 09 08:52:52 crc kubenswrapper[4715]: I1009 08:52:52.845711 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnh4w\" (UniqueName: \"kubernetes.io/projected/b923f7f1-04bf-4b00-83cb-7adfcd8dae54-kube-api-access-lnh4w\") pod \"certified-operators-xbgbl\" (UID: \"b923f7f1-04bf-4b00-83cb-7adfcd8dae54\") " pod="openshift-marketplace/certified-operators-xbgbl" Oct 09 08:52:52 crc kubenswrapper[4715]: I1009 08:52:52.845806 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b923f7f1-04bf-4b00-83cb-7adfcd8dae54-utilities\") pod \"certified-operators-xbgbl\" (UID: \"b923f7f1-04bf-4b00-83cb-7adfcd8dae54\") " pod="openshift-marketplace/certified-operators-xbgbl" Oct 09 08:52:52 crc kubenswrapper[4715]: I1009 08:52:52.846397 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b923f7f1-04bf-4b00-83cb-7adfcd8dae54-utilities\") pod \"certified-operators-xbgbl\" (UID: \"b923f7f1-04bf-4b00-83cb-7adfcd8dae54\") " pod="openshift-marketplace/certified-operators-xbgbl" Oct 09 08:52:52 crc kubenswrapper[4715]: I1009 08:52:52.846685 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b923f7f1-04bf-4b00-83cb-7adfcd8dae54-catalog-content\") pod \"certified-operators-xbgbl\" (UID: \"b923f7f1-04bf-4b00-83cb-7adfcd8dae54\") " pod="openshift-marketplace/certified-operators-xbgbl" Oct 09 08:52:52 crc kubenswrapper[4715]: I1009 08:52:52.870512 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnh4w\" (UniqueName: \"kubernetes.io/projected/b923f7f1-04bf-4b00-83cb-7adfcd8dae54-kube-api-access-lnh4w\") pod \"certified-operators-xbgbl\" (UID: 
\"b923f7f1-04bf-4b00-83cb-7adfcd8dae54\") " pod="openshift-marketplace/certified-operators-xbgbl" Oct 09 08:52:52 crc kubenswrapper[4715]: I1009 08:52:52.945912 4715 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xbgbl" Oct 09 08:52:53 crc kubenswrapper[4715]: I1009 08:52:53.145472 4715 scope.go:117] "RemoveContainer" containerID="3cecc05ede7162882b376373801e9d44deb7deec541f11babf1e9925da0c5cf7" Oct 09 08:52:53 crc kubenswrapper[4715]: E1009 08:52:53.145845 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:52:54 crc kubenswrapper[4715]: I1009 08:52:54.087794 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xbgbl"] Oct 09 08:52:54 crc kubenswrapper[4715]: I1009 08:52:54.732344 4715 generic.go:334] "Generic (PLEG): container finished" podID="b923f7f1-04bf-4b00-83cb-7adfcd8dae54" containerID="564b58b6673413fd9c8c2a6597cbf5b78f658c2e0544bf379f46f0092d9bbf53" exitCode=0 Oct 09 08:52:54 crc kubenswrapper[4715]: I1009 08:52:54.732385 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xbgbl" event={"ID":"b923f7f1-04bf-4b00-83cb-7adfcd8dae54","Type":"ContainerDied","Data":"564b58b6673413fd9c8c2a6597cbf5b78f658c2e0544bf379f46f0092d9bbf53"} Oct 09 08:52:54 crc kubenswrapper[4715]: I1009 08:52:54.732619 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xbgbl" 
event={"ID":"b923f7f1-04bf-4b00-83cb-7adfcd8dae54","Type":"ContainerStarted","Data":"8e5f8f51592c0e5725389ea114ef882bc91f78102c5119ca24ee7314081d70a4"} Oct 09 08:52:54 crc kubenswrapper[4715]: I1009 08:52:54.734609 4715 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 09 08:52:55 crc kubenswrapper[4715]: I1009 08:52:55.744301 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xbgbl" event={"ID":"b923f7f1-04bf-4b00-83cb-7adfcd8dae54","Type":"ContainerStarted","Data":"bbf607309fe1378bc01aba29b23fcaa2bc6d2f5dff18e6df9cea15117f82595c"} Oct 09 08:52:57 crc kubenswrapper[4715]: I1009 08:52:57.763057 4715 generic.go:334] "Generic (PLEG): container finished" podID="b923f7f1-04bf-4b00-83cb-7adfcd8dae54" containerID="bbf607309fe1378bc01aba29b23fcaa2bc6d2f5dff18e6df9cea15117f82595c" exitCode=0 Oct 09 08:52:57 crc kubenswrapper[4715]: I1009 08:52:57.763157 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xbgbl" event={"ID":"b923f7f1-04bf-4b00-83cb-7adfcd8dae54","Type":"ContainerDied","Data":"bbf607309fe1378bc01aba29b23fcaa2bc6d2f5dff18e6df9cea15117f82595c"} Oct 09 08:52:58 crc kubenswrapper[4715]: I1009 08:52:58.775027 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xbgbl" event={"ID":"b923f7f1-04bf-4b00-83cb-7adfcd8dae54","Type":"ContainerStarted","Data":"07d68321c52e55b7013664fc2cda33b6eb093f7fee126bf5c3df37710392ed37"} Oct 09 08:52:58 crc kubenswrapper[4715]: I1009 08:52:58.800112 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xbgbl" podStartSLOduration=3.139208193 podStartE2EDuration="6.800087024s" podCreationTimestamp="2025-10-09 08:52:52 +0000 UTC" firstStartedPulling="2025-10-09 08:52:54.734278011 +0000 UTC m=+4005.427082019" lastFinishedPulling="2025-10-09 08:52:58.395156842 +0000 UTC 
m=+4009.087960850" observedRunningTime="2025-10-09 08:52:58.793504145 +0000 UTC m=+4009.486308164" watchObservedRunningTime="2025-10-09 08:52:58.800087024 +0000 UTC m=+4009.492891042" Oct 09 08:53:02 crc kubenswrapper[4715]: I1009 08:53:02.946517 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xbgbl" Oct 09 08:53:02 crc kubenswrapper[4715]: I1009 08:53:02.947117 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xbgbl" Oct 09 08:53:02 crc kubenswrapper[4715]: I1009 08:53:02.992874 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xbgbl" Oct 09 08:53:03 crc kubenswrapper[4715]: I1009 08:53:03.881792 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xbgbl" Oct 09 08:53:03 crc kubenswrapper[4715]: I1009 08:53:03.935255 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xbgbl"] Oct 09 08:53:04 crc kubenswrapper[4715]: I1009 08:53:04.137260 4715 scope.go:117] "RemoveContainer" containerID="3cecc05ede7162882b376373801e9d44deb7deec541f11babf1e9925da0c5cf7" Oct 09 08:53:04 crc kubenswrapper[4715]: E1009 08:53:04.137543 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:53:05 crc kubenswrapper[4715]: I1009 08:53:05.845005 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xbgbl" 
podUID="b923f7f1-04bf-4b00-83cb-7adfcd8dae54" containerName="registry-server" containerID="cri-o://07d68321c52e55b7013664fc2cda33b6eb093f7fee126bf5c3df37710392ed37" gracePeriod=2 Oct 09 08:53:06 crc kubenswrapper[4715]: I1009 08:53:06.858406 4715 generic.go:334] "Generic (PLEG): container finished" podID="b923f7f1-04bf-4b00-83cb-7adfcd8dae54" containerID="07d68321c52e55b7013664fc2cda33b6eb093f7fee126bf5c3df37710392ed37" exitCode=0 Oct 09 08:53:06 crc kubenswrapper[4715]: I1009 08:53:06.858453 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xbgbl" event={"ID":"b923f7f1-04bf-4b00-83cb-7adfcd8dae54","Type":"ContainerDied","Data":"07d68321c52e55b7013664fc2cda33b6eb093f7fee126bf5c3df37710392ed37"} Oct 09 08:53:06 crc kubenswrapper[4715]: I1009 08:53:06.858862 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xbgbl" event={"ID":"b923f7f1-04bf-4b00-83cb-7adfcd8dae54","Type":"ContainerDied","Data":"8e5f8f51592c0e5725389ea114ef882bc91f78102c5119ca24ee7314081d70a4"} Oct 09 08:53:06 crc kubenswrapper[4715]: I1009 08:53:06.858883 4715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e5f8f51592c0e5725389ea114ef882bc91f78102c5119ca24ee7314081d70a4" Oct 09 08:53:07 crc kubenswrapper[4715]: I1009 08:53:07.159692 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xbgbl" Oct 09 08:53:07 crc kubenswrapper[4715]: I1009 08:53:07.221481 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnh4w\" (UniqueName: \"kubernetes.io/projected/b923f7f1-04bf-4b00-83cb-7adfcd8dae54-kube-api-access-lnh4w\") pod \"b923f7f1-04bf-4b00-83cb-7adfcd8dae54\" (UID: \"b923f7f1-04bf-4b00-83cb-7adfcd8dae54\") " Oct 09 08:53:07 crc kubenswrapper[4715]: I1009 08:53:07.221731 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b923f7f1-04bf-4b00-83cb-7adfcd8dae54-utilities\") pod \"b923f7f1-04bf-4b00-83cb-7adfcd8dae54\" (UID: \"b923f7f1-04bf-4b00-83cb-7adfcd8dae54\") " Oct 09 08:53:07 crc kubenswrapper[4715]: I1009 08:53:07.221972 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b923f7f1-04bf-4b00-83cb-7adfcd8dae54-catalog-content\") pod \"b923f7f1-04bf-4b00-83cb-7adfcd8dae54\" (UID: \"b923f7f1-04bf-4b00-83cb-7adfcd8dae54\") " Oct 09 08:53:07 crc kubenswrapper[4715]: I1009 08:53:07.223084 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b923f7f1-04bf-4b00-83cb-7adfcd8dae54-utilities" (OuterVolumeSpecName: "utilities") pod "b923f7f1-04bf-4b00-83cb-7adfcd8dae54" (UID: "b923f7f1-04bf-4b00-83cb-7adfcd8dae54"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:53:07 crc kubenswrapper[4715]: I1009 08:53:07.227989 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b923f7f1-04bf-4b00-83cb-7adfcd8dae54-kube-api-access-lnh4w" (OuterVolumeSpecName: "kube-api-access-lnh4w") pod "b923f7f1-04bf-4b00-83cb-7adfcd8dae54" (UID: "b923f7f1-04bf-4b00-83cb-7adfcd8dae54"). InnerVolumeSpecName "kube-api-access-lnh4w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:53:07 crc kubenswrapper[4715]: I1009 08:53:07.297339 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b923f7f1-04bf-4b00-83cb-7adfcd8dae54-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b923f7f1-04bf-4b00-83cb-7adfcd8dae54" (UID: "b923f7f1-04bf-4b00-83cb-7adfcd8dae54"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:53:07 crc kubenswrapper[4715]: I1009 08:53:07.325919 4715 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b923f7f1-04bf-4b00-83cb-7adfcd8dae54-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 08:53:07 crc kubenswrapper[4715]: I1009 08:53:07.325980 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnh4w\" (UniqueName: \"kubernetes.io/projected/b923f7f1-04bf-4b00-83cb-7adfcd8dae54-kube-api-access-lnh4w\") on node \"crc\" DevicePath \"\"" Oct 09 08:53:07 crc kubenswrapper[4715]: I1009 08:53:07.326003 4715 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b923f7f1-04bf-4b00-83cb-7adfcd8dae54-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 08:53:07 crc kubenswrapper[4715]: I1009 08:53:07.868930 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xbgbl" Oct 09 08:53:07 crc kubenswrapper[4715]: I1009 08:53:07.905980 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xbgbl"] Oct 09 08:53:07 crc kubenswrapper[4715]: I1009 08:53:07.913595 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xbgbl"] Oct 09 08:53:08 crc kubenswrapper[4715]: I1009 08:53:08.156837 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b923f7f1-04bf-4b00-83cb-7adfcd8dae54" path="/var/lib/kubelet/pods/b923f7f1-04bf-4b00-83cb-7adfcd8dae54/volumes" Oct 09 08:53:17 crc kubenswrapper[4715]: I1009 08:53:17.137726 4715 scope.go:117] "RemoveContainer" containerID="3cecc05ede7162882b376373801e9d44deb7deec541f11babf1e9925da0c5cf7" Oct 09 08:53:17 crc kubenswrapper[4715]: E1009 08:53:17.138895 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:53:20 crc kubenswrapper[4715]: I1009 08:53:20.750379 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t7wpx"] Oct 09 08:53:20 crc kubenswrapper[4715]: E1009 08:53:20.751593 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b923f7f1-04bf-4b00-83cb-7adfcd8dae54" containerName="extract-content" Oct 09 08:53:20 crc kubenswrapper[4715]: I1009 08:53:20.751633 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="b923f7f1-04bf-4b00-83cb-7adfcd8dae54" containerName="extract-content" Oct 09 08:53:20 crc kubenswrapper[4715]: E1009 08:53:20.751699 4715 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="b923f7f1-04bf-4b00-83cb-7adfcd8dae54" containerName="registry-server" Oct 09 08:53:20 crc kubenswrapper[4715]: I1009 08:53:20.751707 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="b923f7f1-04bf-4b00-83cb-7adfcd8dae54" containerName="registry-server" Oct 09 08:53:20 crc kubenswrapper[4715]: E1009 08:53:20.751729 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b923f7f1-04bf-4b00-83cb-7adfcd8dae54" containerName="extract-utilities" Oct 09 08:53:20 crc kubenswrapper[4715]: I1009 08:53:20.751738 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="b923f7f1-04bf-4b00-83cb-7adfcd8dae54" containerName="extract-utilities" Oct 09 08:53:20 crc kubenswrapper[4715]: I1009 08:53:20.752116 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="b923f7f1-04bf-4b00-83cb-7adfcd8dae54" containerName="registry-server" Oct 09 08:53:20 crc kubenswrapper[4715]: I1009 08:53:20.753627 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t7wpx" Oct 09 08:53:20 crc kubenswrapper[4715]: I1009 08:53:20.778035 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t7wpx"] Oct 09 08:53:20 crc kubenswrapper[4715]: I1009 08:53:20.894545 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b24eff42-2f16-4313-af4d-87f95e1f8741-catalog-content\") pod \"redhat-operators-t7wpx\" (UID: \"b24eff42-2f16-4313-af4d-87f95e1f8741\") " pod="openshift-marketplace/redhat-operators-t7wpx" Oct 09 08:53:20 crc kubenswrapper[4715]: I1009 08:53:20.894585 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b24eff42-2f16-4313-af4d-87f95e1f8741-utilities\") pod \"redhat-operators-t7wpx\" (UID: \"b24eff42-2f16-4313-af4d-87f95e1f8741\") " pod="openshift-marketplace/redhat-operators-t7wpx" Oct 09 08:53:20 crc kubenswrapper[4715]: I1009 08:53:20.894640 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb8vh\" (UniqueName: \"kubernetes.io/projected/b24eff42-2f16-4313-af4d-87f95e1f8741-kube-api-access-pb8vh\") pod \"redhat-operators-t7wpx\" (UID: \"b24eff42-2f16-4313-af4d-87f95e1f8741\") " pod="openshift-marketplace/redhat-operators-t7wpx" Oct 09 08:53:20 crc kubenswrapper[4715]: I1009 08:53:20.997484 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b24eff42-2f16-4313-af4d-87f95e1f8741-catalog-content\") pod \"redhat-operators-t7wpx\" (UID: \"b24eff42-2f16-4313-af4d-87f95e1f8741\") " pod="openshift-marketplace/redhat-operators-t7wpx" Oct 09 08:53:20 crc kubenswrapper[4715]: I1009 08:53:20.997534 4715 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b24eff42-2f16-4313-af4d-87f95e1f8741-utilities\") pod \"redhat-operators-t7wpx\" (UID: \"b24eff42-2f16-4313-af4d-87f95e1f8741\") " pod="openshift-marketplace/redhat-operators-t7wpx" Oct 09 08:53:20 crc kubenswrapper[4715]: I1009 08:53:20.997599 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pb8vh\" (UniqueName: \"kubernetes.io/projected/b24eff42-2f16-4313-af4d-87f95e1f8741-kube-api-access-pb8vh\") pod \"redhat-operators-t7wpx\" (UID: \"b24eff42-2f16-4313-af4d-87f95e1f8741\") " pod="openshift-marketplace/redhat-operators-t7wpx" Oct 09 08:53:20 crc kubenswrapper[4715]: I1009 08:53:20.998410 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b24eff42-2f16-4313-af4d-87f95e1f8741-utilities\") pod \"redhat-operators-t7wpx\" (UID: \"b24eff42-2f16-4313-af4d-87f95e1f8741\") " pod="openshift-marketplace/redhat-operators-t7wpx" Oct 09 08:53:20 crc kubenswrapper[4715]: I1009 08:53:20.998724 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b24eff42-2f16-4313-af4d-87f95e1f8741-catalog-content\") pod \"redhat-operators-t7wpx\" (UID: \"b24eff42-2f16-4313-af4d-87f95e1f8741\") " pod="openshift-marketplace/redhat-operators-t7wpx" Oct 09 08:53:21 crc kubenswrapper[4715]: I1009 08:53:21.016749 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb8vh\" (UniqueName: \"kubernetes.io/projected/b24eff42-2f16-4313-af4d-87f95e1f8741-kube-api-access-pb8vh\") pod \"redhat-operators-t7wpx\" (UID: \"b24eff42-2f16-4313-af4d-87f95e1f8741\") " pod="openshift-marketplace/redhat-operators-t7wpx" Oct 09 08:53:21 crc kubenswrapper[4715]: I1009 08:53:21.078511 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t7wpx" Oct 09 08:53:21 crc kubenswrapper[4715]: I1009 08:53:21.530212 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t7wpx"] Oct 09 08:53:22 crc kubenswrapper[4715]: I1009 08:53:22.002103 4715 generic.go:334] "Generic (PLEG): container finished" podID="b24eff42-2f16-4313-af4d-87f95e1f8741" containerID="e93b09cf7d23012da28f992ff268193397be8f4147e7a8304d6da69bfa174f02" exitCode=0 Oct 09 08:53:22 crc kubenswrapper[4715]: I1009 08:53:22.002189 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t7wpx" event={"ID":"b24eff42-2f16-4313-af4d-87f95e1f8741","Type":"ContainerDied","Data":"e93b09cf7d23012da28f992ff268193397be8f4147e7a8304d6da69bfa174f02"} Oct 09 08:53:22 crc kubenswrapper[4715]: I1009 08:53:22.002373 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t7wpx" event={"ID":"b24eff42-2f16-4313-af4d-87f95e1f8741","Type":"ContainerStarted","Data":"6dd49592ee1583311234f413e633daa844ed2bd1aa0dbcccc77a2b927095a2ad"} Oct 09 08:53:24 crc kubenswrapper[4715]: I1009 08:53:24.021236 4715 generic.go:334] "Generic (PLEG): container finished" podID="b24eff42-2f16-4313-af4d-87f95e1f8741" containerID="577f4a12c365b3d8d4923dc8c2b29d326e941ac80ac080dec98f2f06c4d9d982" exitCode=0 Oct 09 08:53:24 crc kubenswrapper[4715]: I1009 08:53:24.021287 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t7wpx" event={"ID":"b24eff42-2f16-4313-af4d-87f95e1f8741","Type":"ContainerDied","Data":"577f4a12c365b3d8d4923dc8c2b29d326e941ac80ac080dec98f2f06c4d9d982"} Oct 09 08:53:25 crc kubenswrapper[4715]: I1009 08:53:25.042202 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t7wpx" 
event={"ID":"b24eff42-2f16-4313-af4d-87f95e1f8741","Type":"ContainerStarted","Data":"6a4a6a660b91d274a1993ff6143b75f10675288741668c1b1a2096b55057f924"} Oct 09 08:53:25 crc kubenswrapper[4715]: I1009 08:53:25.072406 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-t7wpx" podStartSLOduration=2.597150325 podStartE2EDuration="5.07238333s" podCreationTimestamp="2025-10-09 08:53:20 +0000 UTC" firstStartedPulling="2025-10-09 08:53:22.004564699 +0000 UTC m=+4032.697368707" lastFinishedPulling="2025-10-09 08:53:24.479797704 +0000 UTC m=+4035.172601712" observedRunningTime="2025-10-09 08:53:25.067546712 +0000 UTC m=+4035.760350720" watchObservedRunningTime="2025-10-09 08:53:25.07238333 +0000 UTC m=+4035.765187348" Oct 09 08:53:31 crc kubenswrapper[4715]: I1009 08:53:31.080162 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-t7wpx" Oct 09 08:53:31 crc kubenswrapper[4715]: I1009 08:53:31.080754 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-t7wpx" Oct 09 08:53:31 crc kubenswrapper[4715]: I1009 08:53:31.122253 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t7wpx" Oct 09 08:53:31 crc kubenswrapper[4715]: I1009 08:53:31.137509 4715 scope.go:117] "RemoveContainer" containerID="3cecc05ede7162882b376373801e9d44deb7deec541f11babf1e9925da0c5cf7" Oct 09 08:53:31 crc kubenswrapper[4715]: E1009 08:53:31.137848 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" 
podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:53:31 crc kubenswrapper[4715]: I1009 08:53:31.189241 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-t7wpx" Oct 09 08:53:31 crc kubenswrapper[4715]: I1009 08:53:31.371820 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t7wpx"] Oct 09 08:53:33 crc kubenswrapper[4715]: I1009 08:53:33.127709 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-t7wpx" podUID="b24eff42-2f16-4313-af4d-87f95e1f8741" containerName="registry-server" containerID="cri-o://6a4a6a660b91d274a1993ff6143b75f10675288741668c1b1a2096b55057f924" gracePeriod=2 Oct 09 08:53:33 crc kubenswrapper[4715]: I1009 08:53:33.570832 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t7wpx" Oct 09 08:53:33 crc kubenswrapper[4715]: I1009 08:53:33.749961 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pb8vh\" (UniqueName: \"kubernetes.io/projected/b24eff42-2f16-4313-af4d-87f95e1f8741-kube-api-access-pb8vh\") pod \"b24eff42-2f16-4313-af4d-87f95e1f8741\" (UID: \"b24eff42-2f16-4313-af4d-87f95e1f8741\") " Oct 09 08:53:33 crc kubenswrapper[4715]: I1009 08:53:33.750079 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b24eff42-2f16-4313-af4d-87f95e1f8741-utilities\") pod \"b24eff42-2f16-4313-af4d-87f95e1f8741\" (UID: \"b24eff42-2f16-4313-af4d-87f95e1f8741\") " Oct 09 08:53:33 crc kubenswrapper[4715]: I1009 08:53:33.750114 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b24eff42-2f16-4313-af4d-87f95e1f8741-catalog-content\") pod \"b24eff42-2f16-4313-af4d-87f95e1f8741\" (UID: 
\"b24eff42-2f16-4313-af4d-87f95e1f8741\") " Oct 09 08:53:33 crc kubenswrapper[4715]: I1009 08:53:33.757721 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b24eff42-2f16-4313-af4d-87f95e1f8741-utilities" (OuterVolumeSpecName: "utilities") pod "b24eff42-2f16-4313-af4d-87f95e1f8741" (UID: "b24eff42-2f16-4313-af4d-87f95e1f8741"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:53:33 crc kubenswrapper[4715]: I1009 08:53:33.763397 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b24eff42-2f16-4313-af4d-87f95e1f8741-kube-api-access-pb8vh" (OuterVolumeSpecName: "kube-api-access-pb8vh") pod "b24eff42-2f16-4313-af4d-87f95e1f8741" (UID: "b24eff42-2f16-4313-af4d-87f95e1f8741"). InnerVolumeSpecName "kube-api-access-pb8vh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:53:33 crc kubenswrapper[4715]: I1009 08:53:33.853618 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pb8vh\" (UniqueName: \"kubernetes.io/projected/b24eff42-2f16-4313-af4d-87f95e1f8741-kube-api-access-pb8vh\") on node \"crc\" DevicePath \"\"" Oct 09 08:53:33 crc kubenswrapper[4715]: I1009 08:53:33.853676 4715 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b24eff42-2f16-4313-af4d-87f95e1f8741-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 08:53:34 crc kubenswrapper[4715]: I1009 08:53:34.143168 4715 generic.go:334] "Generic (PLEG): container finished" podID="b24eff42-2f16-4313-af4d-87f95e1f8741" containerID="6a4a6a660b91d274a1993ff6143b75f10675288741668c1b1a2096b55057f924" exitCode=0 Oct 09 08:53:34 crc kubenswrapper[4715]: I1009 08:53:34.143265 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t7wpx" Oct 09 08:53:34 crc kubenswrapper[4715]: I1009 08:53:34.156386 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t7wpx" event={"ID":"b24eff42-2f16-4313-af4d-87f95e1f8741","Type":"ContainerDied","Data":"6a4a6a660b91d274a1993ff6143b75f10675288741668c1b1a2096b55057f924"} Oct 09 08:53:34 crc kubenswrapper[4715]: I1009 08:53:34.156502 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t7wpx" event={"ID":"b24eff42-2f16-4313-af4d-87f95e1f8741","Type":"ContainerDied","Data":"6dd49592ee1583311234f413e633daa844ed2bd1aa0dbcccc77a2b927095a2ad"} Oct 09 08:53:34 crc kubenswrapper[4715]: I1009 08:53:34.156534 4715 scope.go:117] "RemoveContainer" containerID="6a4a6a660b91d274a1993ff6143b75f10675288741668c1b1a2096b55057f924" Oct 09 08:53:34 crc kubenswrapper[4715]: I1009 08:53:34.181045 4715 scope.go:117] "RemoveContainer" containerID="577f4a12c365b3d8d4923dc8c2b29d326e941ac80ac080dec98f2f06c4d9d982" Oct 09 08:53:34 crc kubenswrapper[4715]: I1009 08:53:34.223253 4715 scope.go:117] "RemoveContainer" containerID="e93b09cf7d23012da28f992ff268193397be8f4147e7a8304d6da69bfa174f02" Oct 09 08:53:34 crc kubenswrapper[4715]: I1009 08:53:34.280659 4715 scope.go:117] "RemoveContainer" containerID="6a4a6a660b91d274a1993ff6143b75f10675288741668c1b1a2096b55057f924" Oct 09 08:53:34 crc kubenswrapper[4715]: E1009 08:53:34.281264 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a4a6a660b91d274a1993ff6143b75f10675288741668c1b1a2096b55057f924\": container with ID starting with 6a4a6a660b91d274a1993ff6143b75f10675288741668c1b1a2096b55057f924 not found: ID does not exist" containerID="6a4a6a660b91d274a1993ff6143b75f10675288741668c1b1a2096b55057f924" Oct 09 08:53:34 crc kubenswrapper[4715]: I1009 08:53:34.281315 4715 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a4a6a660b91d274a1993ff6143b75f10675288741668c1b1a2096b55057f924"} err="failed to get container status \"6a4a6a660b91d274a1993ff6143b75f10675288741668c1b1a2096b55057f924\": rpc error: code = NotFound desc = could not find container \"6a4a6a660b91d274a1993ff6143b75f10675288741668c1b1a2096b55057f924\": container with ID starting with 6a4a6a660b91d274a1993ff6143b75f10675288741668c1b1a2096b55057f924 not found: ID does not exist" Oct 09 08:53:34 crc kubenswrapper[4715]: I1009 08:53:34.281348 4715 scope.go:117] "RemoveContainer" containerID="577f4a12c365b3d8d4923dc8c2b29d326e941ac80ac080dec98f2f06c4d9d982" Oct 09 08:53:34 crc kubenswrapper[4715]: E1009 08:53:34.281647 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"577f4a12c365b3d8d4923dc8c2b29d326e941ac80ac080dec98f2f06c4d9d982\": container with ID starting with 577f4a12c365b3d8d4923dc8c2b29d326e941ac80ac080dec98f2f06c4d9d982 not found: ID does not exist" containerID="577f4a12c365b3d8d4923dc8c2b29d326e941ac80ac080dec98f2f06c4d9d982" Oct 09 08:53:34 crc kubenswrapper[4715]: I1009 08:53:34.281675 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"577f4a12c365b3d8d4923dc8c2b29d326e941ac80ac080dec98f2f06c4d9d982"} err="failed to get container status \"577f4a12c365b3d8d4923dc8c2b29d326e941ac80ac080dec98f2f06c4d9d982\": rpc error: code = NotFound desc = could not find container \"577f4a12c365b3d8d4923dc8c2b29d326e941ac80ac080dec98f2f06c4d9d982\": container with ID starting with 577f4a12c365b3d8d4923dc8c2b29d326e941ac80ac080dec98f2f06c4d9d982 not found: ID does not exist" Oct 09 08:53:34 crc kubenswrapper[4715]: I1009 08:53:34.281703 4715 scope.go:117] "RemoveContainer" containerID="e93b09cf7d23012da28f992ff268193397be8f4147e7a8304d6da69bfa174f02" Oct 09 08:53:34 crc kubenswrapper[4715]: E1009 08:53:34.281921 4715 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e93b09cf7d23012da28f992ff268193397be8f4147e7a8304d6da69bfa174f02\": container with ID starting with e93b09cf7d23012da28f992ff268193397be8f4147e7a8304d6da69bfa174f02 not found: ID does not exist" containerID="e93b09cf7d23012da28f992ff268193397be8f4147e7a8304d6da69bfa174f02" Oct 09 08:53:34 crc kubenswrapper[4715]: I1009 08:53:34.281944 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e93b09cf7d23012da28f992ff268193397be8f4147e7a8304d6da69bfa174f02"} err="failed to get container status \"e93b09cf7d23012da28f992ff268193397be8f4147e7a8304d6da69bfa174f02\": rpc error: code = NotFound desc = could not find container \"e93b09cf7d23012da28f992ff268193397be8f4147e7a8304d6da69bfa174f02\": container with ID starting with e93b09cf7d23012da28f992ff268193397be8f4147e7a8304d6da69bfa174f02 not found: ID does not exist" Oct 09 08:53:34 crc kubenswrapper[4715]: I1009 08:53:34.621449 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b24eff42-2f16-4313-af4d-87f95e1f8741-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b24eff42-2f16-4313-af4d-87f95e1f8741" (UID: "b24eff42-2f16-4313-af4d-87f95e1f8741"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:53:34 crc kubenswrapper[4715]: I1009 08:53:34.672402 4715 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b24eff42-2f16-4313-af4d-87f95e1f8741-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 08:53:34 crc kubenswrapper[4715]: I1009 08:53:34.777972 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t7wpx"] Oct 09 08:53:34 crc kubenswrapper[4715]: I1009 08:53:34.788704 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-t7wpx"] Oct 09 08:53:36 crc kubenswrapper[4715]: I1009 08:53:36.146979 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b24eff42-2f16-4313-af4d-87f95e1f8741" path="/var/lib/kubelet/pods/b24eff42-2f16-4313-af4d-87f95e1f8741/volumes" Oct 09 08:53:43 crc kubenswrapper[4715]: I1009 08:53:43.137237 4715 scope.go:117] "RemoveContainer" containerID="3cecc05ede7162882b376373801e9d44deb7deec541f11babf1e9925da0c5cf7" Oct 09 08:53:43 crc kubenswrapper[4715]: E1009 08:53:43.138000 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:53:49 crc kubenswrapper[4715]: I1009 08:53:49.337523 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-w2rgf"] Oct 09 08:53:49 crc kubenswrapper[4715]: E1009 08:53:49.338609 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b24eff42-2f16-4313-af4d-87f95e1f8741" containerName="extract-utilities" Oct 09 08:53:49 crc 
kubenswrapper[4715]: I1009 08:53:49.338626 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="b24eff42-2f16-4313-af4d-87f95e1f8741" containerName="extract-utilities" Oct 09 08:53:49 crc kubenswrapper[4715]: E1009 08:53:49.338665 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b24eff42-2f16-4313-af4d-87f95e1f8741" containerName="registry-server" Oct 09 08:53:49 crc kubenswrapper[4715]: I1009 08:53:49.338674 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="b24eff42-2f16-4313-af4d-87f95e1f8741" containerName="registry-server" Oct 09 08:53:49 crc kubenswrapper[4715]: E1009 08:53:49.338694 4715 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b24eff42-2f16-4313-af4d-87f95e1f8741" containerName="extract-content" Oct 09 08:53:49 crc kubenswrapper[4715]: I1009 08:53:49.338702 4715 state_mem.go:107] "Deleted CPUSet assignment" podUID="b24eff42-2f16-4313-af4d-87f95e1f8741" containerName="extract-content" Oct 09 08:53:49 crc kubenswrapper[4715]: I1009 08:53:49.338938 4715 memory_manager.go:354] "RemoveStaleState removing state" podUID="b24eff42-2f16-4313-af4d-87f95e1f8741" containerName="registry-server" Oct 09 08:53:49 crc kubenswrapper[4715]: I1009 08:53:49.340697 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w2rgf" Oct 09 08:53:49 crc kubenswrapper[4715]: I1009 08:53:49.348183 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w2rgf"] Oct 09 08:53:49 crc kubenswrapper[4715]: I1009 08:53:49.479131 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfcc63ca-ac65-47a6-b0a8-c080ef0a779f-utilities\") pod \"redhat-marketplace-w2rgf\" (UID: \"cfcc63ca-ac65-47a6-b0a8-c080ef0a779f\") " pod="openshift-marketplace/redhat-marketplace-w2rgf" Oct 09 08:53:49 crc kubenswrapper[4715]: I1009 08:53:49.479462 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt825\" (UniqueName: \"kubernetes.io/projected/cfcc63ca-ac65-47a6-b0a8-c080ef0a779f-kube-api-access-zt825\") pod \"redhat-marketplace-w2rgf\" (UID: \"cfcc63ca-ac65-47a6-b0a8-c080ef0a779f\") " pod="openshift-marketplace/redhat-marketplace-w2rgf" Oct 09 08:53:49 crc kubenswrapper[4715]: I1009 08:53:49.479679 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfcc63ca-ac65-47a6-b0a8-c080ef0a779f-catalog-content\") pod \"redhat-marketplace-w2rgf\" (UID: \"cfcc63ca-ac65-47a6-b0a8-c080ef0a779f\") " pod="openshift-marketplace/redhat-marketplace-w2rgf" Oct 09 08:53:49 crc kubenswrapper[4715]: I1009 08:53:49.582021 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfcc63ca-ac65-47a6-b0a8-c080ef0a779f-catalog-content\") pod \"redhat-marketplace-w2rgf\" (UID: \"cfcc63ca-ac65-47a6-b0a8-c080ef0a779f\") " pod="openshift-marketplace/redhat-marketplace-w2rgf" Oct 09 08:53:49 crc kubenswrapper[4715]: I1009 08:53:49.582099 4715 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfcc63ca-ac65-47a6-b0a8-c080ef0a779f-utilities\") pod \"redhat-marketplace-w2rgf\" (UID: \"cfcc63ca-ac65-47a6-b0a8-c080ef0a779f\") " pod="openshift-marketplace/redhat-marketplace-w2rgf" Oct 09 08:53:49 crc kubenswrapper[4715]: I1009 08:53:49.582168 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt825\" (UniqueName: \"kubernetes.io/projected/cfcc63ca-ac65-47a6-b0a8-c080ef0a779f-kube-api-access-zt825\") pod \"redhat-marketplace-w2rgf\" (UID: \"cfcc63ca-ac65-47a6-b0a8-c080ef0a779f\") " pod="openshift-marketplace/redhat-marketplace-w2rgf" Oct 09 08:53:49 crc kubenswrapper[4715]: I1009 08:53:49.582985 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfcc63ca-ac65-47a6-b0a8-c080ef0a779f-catalog-content\") pod \"redhat-marketplace-w2rgf\" (UID: \"cfcc63ca-ac65-47a6-b0a8-c080ef0a779f\") " pod="openshift-marketplace/redhat-marketplace-w2rgf" Oct 09 08:53:49 crc kubenswrapper[4715]: I1009 08:53:49.583021 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfcc63ca-ac65-47a6-b0a8-c080ef0a779f-utilities\") pod \"redhat-marketplace-w2rgf\" (UID: \"cfcc63ca-ac65-47a6-b0a8-c080ef0a779f\") " pod="openshift-marketplace/redhat-marketplace-w2rgf" Oct 09 08:53:49 crc kubenswrapper[4715]: I1009 08:53:49.605837 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt825\" (UniqueName: \"kubernetes.io/projected/cfcc63ca-ac65-47a6-b0a8-c080ef0a779f-kube-api-access-zt825\") pod \"redhat-marketplace-w2rgf\" (UID: \"cfcc63ca-ac65-47a6-b0a8-c080ef0a779f\") " pod="openshift-marketplace/redhat-marketplace-w2rgf" Oct 09 08:53:49 crc kubenswrapper[4715]: I1009 08:53:49.666771 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w2rgf" Oct 09 08:53:50 crc kubenswrapper[4715]: I1009 08:53:50.083015 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w2rgf"] Oct 09 08:53:50 crc kubenswrapper[4715]: I1009 08:53:50.281318 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w2rgf" event={"ID":"cfcc63ca-ac65-47a6-b0a8-c080ef0a779f","Type":"ContainerStarted","Data":"1a10c8bd4592f659d8fcc8ea75deeb7110a5ddedca86983e25160136076d48a4"} Oct 09 08:53:51 crc kubenswrapper[4715]: I1009 08:53:51.307205 4715 generic.go:334] "Generic (PLEG): container finished" podID="cfcc63ca-ac65-47a6-b0a8-c080ef0a779f" containerID="83a851968a05949880899f196638a3b1440195d40a51703621616651b923eb8a" exitCode=0 Oct 09 08:53:51 crc kubenswrapper[4715]: I1009 08:53:51.308104 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w2rgf" event={"ID":"cfcc63ca-ac65-47a6-b0a8-c080ef0a779f","Type":"ContainerDied","Data":"83a851968a05949880899f196638a3b1440195d40a51703621616651b923eb8a"} Oct 09 08:53:52 crc kubenswrapper[4715]: I1009 08:53:52.905700 4715 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lzm4h"] Oct 09 08:53:52 crc kubenswrapper[4715]: I1009 08:53:52.910625 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lzm4h" Oct 09 08:53:52 crc kubenswrapper[4715]: I1009 08:53:52.924259 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lzm4h"] Oct 09 08:53:53 crc kubenswrapper[4715]: I1009 08:53:53.053378 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/418433fa-2054-43e5-940e-bf3be92f1dad-utilities\") pod \"community-operators-lzm4h\" (UID: \"418433fa-2054-43e5-940e-bf3be92f1dad\") " pod="openshift-marketplace/community-operators-lzm4h" Oct 09 08:53:53 crc kubenswrapper[4715]: I1009 08:53:53.054125 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/418433fa-2054-43e5-940e-bf3be92f1dad-catalog-content\") pod \"community-operators-lzm4h\" (UID: \"418433fa-2054-43e5-940e-bf3be92f1dad\") " pod="openshift-marketplace/community-operators-lzm4h" Oct 09 08:53:53 crc kubenswrapper[4715]: I1009 08:53:53.054189 4715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2crq\" (UniqueName: \"kubernetes.io/projected/418433fa-2054-43e5-940e-bf3be92f1dad-kube-api-access-j2crq\") pod \"community-operators-lzm4h\" (UID: \"418433fa-2054-43e5-940e-bf3be92f1dad\") " pod="openshift-marketplace/community-operators-lzm4h" Oct 09 08:53:53 crc kubenswrapper[4715]: I1009 08:53:53.156455 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2crq\" (UniqueName: \"kubernetes.io/projected/418433fa-2054-43e5-940e-bf3be92f1dad-kube-api-access-j2crq\") pod \"community-operators-lzm4h\" (UID: \"418433fa-2054-43e5-940e-bf3be92f1dad\") " pod="openshift-marketplace/community-operators-lzm4h" Oct 09 08:53:53 crc kubenswrapper[4715]: I1009 08:53:53.156634 4715 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/418433fa-2054-43e5-940e-bf3be92f1dad-utilities\") pod \"community-operators-lzm4h\" (UID: \"418433fa-2054-43e5-940e-bf3be92f1dad\") " pod="openshift-marketplace/community-operators-lzm4h" Oct 09 08:53:53 crc kubenswrapper[4715]: I1009 08:53:53.156825 4715 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/418433fa-2054-43e5-940e-bf3be92f1dad-catalog-content\") pod \"community-operators-lzm4h\" (UID: \"418433fa-2054-43e5-940e-bf3be92f1dad\") " pod="openshift-marketplace/community-operators-lzm4h" Oct 09 08:53:53 crc kubenswrapper[4715]: I1009 08:53:53.157862 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/418433fa-2054-43e5-940e-bf3be92f1dad-catalog-content\") pod \"community-operators-lzm4h\" (UID: \"418433fa-2054-43e5-940e-bf3be92f1dad\") " pod="openshift-marketplace/community-operators-lzm4h" Oct 09 08:53:53 crc kubenswrapper[4715]: I1009 08:53:53.160108 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/418433fa-2054-43e5-940e-bf3be92f1dad-utilities\") pod \"community-operators-lzm4h\" (UID: \"418433fa-2054-43e5-940e-bf3be92f1dad\") " pod="openshift-marketplace/community-operators-lzm4h" Oct 09 08:53:53 crc kubenswrapper[4715]: I1009 08:53:53.197218 4715 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2crq\" (UniqueName: \"kubernetes.io/projected/418433fa-2054-43e5-940e-bf3be92f1dad-kube-api-access-j2crq\") pod \"community-operators-lzm4h\" (UID: \"418433fa-2054-43e5-940e-bf3be92f1dad\") " pod="openshift-marketplace/community-operators-lzm4h" Oct 09 08:53:53 crc kubenswrapper[4715]: I1009 08:53:53.239948 4715 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lzm4h" Oct 09 08:53:53 crc kubenswrapper[4715]: I1009 08:53:53.346091 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w2rgf" event={"ID":"cfcc63ca-ac65-47a6-b0a8-c080ef0a779f","Type":"ContainerStarted","Data":"619f4e852609c58cf0c12845a545a1831503529a7985778eef8469958805adf1"} Oct 09 08:53:53 crc kubenswrapper[4715]: I1009 08:53:53.793271 4715 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lzm4h"] Oct 09 08:53:53 crc kubenswrapper[4715]: W1009 08:53:53.805350 4715 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod418433fa_2054_43e5_940e_bf3be92f1dad.slice/crio-846a7aff3836a0024faacf5dd086eb875c4fba5ecff78baf829a17d4594f2081 WatchSource:0}: Error finding container 846a7aff3836a0024faacf5dd086eb875c4fba5ecff78baf829a17d4594f2081: Status 404 returned error can't find the container with id 846a7aff3836a0024faacf5dd086eb875c4fba5ecff78baf829a17d4594f2081 Oct 09 08:53:54 crc kubenswrapper[4715]: I1009 08:53:54.372008 4715 generic.go:334] "Generic (PLEG): container finished" podID="cfcc63ca-ac65-47a6-b0a8-c080ef0a779f" containerID="619f4e852609c58cf0c12845a545a1831503529a7985778eef8469958805adf1" exitCode=0 Oct 09 08:53:54 crc kubenswrapper[4715]: I1009 08:53:54.372058 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w2rgf" event={"ID":"cfcc63ca-ac65-47a6-b0a8-c080ef0a779f","Type":"ContainerDied","Data":"619f4e852609c58cf0c12845a545a1831503529a7985778eef8469958805adf1"} Oct 09 08:53:54 crc kubenswrapper[4715]: I1009 08:53:54.374754 4715 generic.go:334] "Generic (PLEG): container finished" podID="418433fa-2054-43e5-940e-bf3be92f1dad" containerID="7294eae0ec8271d5dd899ab0d5dec2670d647626d15c2f96ec9ac36353e9dc10" exitCode=0 Oct 09 08:53:54 crc kubenswrapper[4715]: I1009 
08:53:54.374795 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzm4h" event={"ID":"418433fa-2054-43e5-940e-bf3be92f1dad","Type":"ContainerDied","Data":"7294eae0ec8271d5dd899ab0d5dec2670d647626d15c2f96ec9ac36353e9dc10"} Oct 09 08:53:54 crc kubenswrapper[4715]: I1009 08:53:54.374821 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzm4h" event={"ID":"418433fa-2054-43e5-940e-bf3be92f1dad","Type":"ContainerStarted","Data":"846a7aff3836a0024faacf5dd086eb875c4fba5ecff78baf829a17d4594f2081"} Oct 09 08:53:55 crc kubenswrapper[4715]: I1009 08:53:55.385935 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w2rgf" event={"ID":"cfcc63ca-ac65-47a6-b0a8-c080ef0a779f","Type":"ContainerStarted","Data":"252316ca8bb1939a3c0fb6c64dda04837f7c3c144ba147c4033e118a3f90e02d"} Oct 09 08:53:55 crc kubenswrapper[4715]: I1009 08:53:55.412137 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-w2rgf" podStartSLOduration=2.738159801 podStartE2EDuration="6.412111756s" podCreationTimestamp="2025-10-09 08:53:49 +0000 UTC" firstStartedPulling="2025-10-09 08:53:51.312942557 +0000 UTC m=+4062.005746565" lastFinishedPulling="2025-10-09 08:53:54.986894512 +0000 UTC m=+4065.679698520" observedRunningTime="2025-10-09 08:53:55.403908931 +0000 UTC m=+4066.096712939" watchObservedRunningTime="2025-10-09 08:53:55.412111756 +0000 UTC m=+4066.104915764" Oct 09 08:53:56 crc kubenswrapper[4715]: I1009 08:53:56.137480 4715 scope.go:117] "RemoveContainer" containerID="3cecc05ede7162882b376373801e9d44deb7deec541f11babf1e9925da0c5cf7" Oct 09 08:53:56 crc kubenswrapper[4715]: E1009 08:53:56.137909 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:53:56 crc kubenswrapper[4715]: I1009 08:53:56.395911 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzm4h" event={"ID":"418433fa-2054-43e5-940e-bf3be92f1dad","Type":"ContainerStarted","Data":"1b7679de45994885b0b3ee3e7deb70a46aaceef228dc5706d67208184893482a"} Oct 09 08:53:57 crc kubenswrapper[4715]: I1009 08:53:57.406744 4715 generic.go:334] "Generic (PLEG): container finished" podID="418433fa-2054-43e5-940e-bf3be92f1dad" containerID="1b7679de45994885b0b3ee3e7deb70a46aaceef228dc5706d67208184893482a" exitCode=0 Oct 09 08:53:57 crc kubenswrapper[4715]: I1009 08:53:57.407061 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzm4h" event={"ID":"418433fa-2054-43e5-940e-bf3be92f1dad","Type":"ContainerDied","Data":"1b7679de45994885b0b3ee3e7deb70a46aaceef228dc5706d67208184893482a"} Oct 09 08:53:59 crc kubenswrapper[4715]: I1009 08:53:59.432253 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzm4h" event={"ID":"418433fa-2054-43e5-940e-bf3be92f1dad","Type":"ContainerStarted","Data":"17dd169ea427b9e81dbbf5b52871223ce1fc944f4e319c592bdb7b13eab7159e"} Oct 09 08:53:59 crc kubenswrapper[4715]: I1009 08:53:59.667474 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-w2rgf" Oct 09 08:53:59 crc kubenswrapper[4715]: I1009 08:53:59.667994 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-w2rgf" Oct 09 08:53:59 crc kubenswrapper[4715]: I1009 08:53:59.721979 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-w2rgf" Oct 09 08:53:59 crc kubenswrapper[4715]: I1009 08:53:59.746222 4715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lzm4h" podStartSLOduration=3.5043882870000003 podStartE2EDuration="7.74620541s" podCreationTimestamp="2025-10-09 08:53:52 +0000 UTC" firstStartedPulling="2025-10-09 08:53:54.377020201 +0000 UTC m=+4065.069824209" lastFinishedPulling="2025-10-09 08:53:58.618837284 +0000 UTC m=+4069.311641332" observedRunningTime="2025-10-09 08:53:59.456782975 +0000 UTC m=+4070.149586993" watchObservedRunningTime="2025-10-09 08:53:59.74620541 +0000 UTC m=+4070.439009408" Oct 09 08:54:00 crc kubenswrapper[4715]: I1009 08:54:00.498430 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-w2rgf" Oct 09 08:54:01 crc kubenswrapper[4715]: I1009 08:54:01.895673 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w2rgf"] Oct 09 08:54:03 crc kubenswrapper[4715]: I1009 08:54:03.241295 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lzm4h" Oct 09 08:54:03 crc kubenswrapper[4715]: I1009 08:54:03.241581 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lzm4h" Oct 09 08:54:03 crc kubenswrapper[4715]: I1009 08:54:03.468271 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-w2rgf" podUID="cfcc63ca-ac65-47a6-b0a8-c080ef0a779f" containerName="registry-server" containerID="cri-o://252316ca8bb1939a3c0fb6c64dda04837f7c3c144ba147c4033e118a3f90e02d" gracePeriod=2 Oct 09 08:54:03 crc kubenswrapper[4715]: I1009 08:54:03.841206 4715 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lzm4h" Oct 
09 08:54:04 crc kubenswrapper[4715]: I1009 08:54:04.478893 4715 generic.go:334] "Generic (PLEG): container finished" podID="cfcc63ca-ac65-47a6-b0a8-c080ef0a779f" containerID="252316ca8bb1939a3c0fb6c64dda04837f7c3c144ba147c4033e118a3f90e02d" exitCode=0 Oct 09 08:54:04 crc kubenswrapper[4715]: I1009 08:54:04.478940 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w2rgf" event={"ID":"cfcc63ca-ac65-47a6-b0a8-c080ef0a779f","Type":"ContainerDied","Data":"252316ca8bb1939a3c0fb6c64dda04837f7c3c144ba147c4033e118a3f90e02d"} Oct 09 08:54:04 crc kubenswrapper[4715]: I1009 08:54:04.833206 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w2rgf" Oct 09 08:54:04 crc kubenswrapper[4715]: I1009 08:54:04.981019 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfcc63ca-ac65-47a6-b0a8-c080ef0a779f-utilities\") pod \"cfcc63ca-ac65-47a6-b0a8-c080ef0a779f\" (UID: \"cfcc63ca-ac65-47a6-b0a8-c080ef0a779f\") " Oct 09 08:54:04 crc kubenswrapper[4715]: I1009 08:54:04.981131 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfcc63ca-ac65-47a6-b0a8-c080ef0a779f-catalog-content\") pod \"cfcc63ca-ac65-47a6-b0a8-c080ef0a779f\" (UID: \"cfcc63ca-ac65-47a6-b0a8-c080ef0a779f\") " Oct 09 08:54:04 crc kubenswrapper[4715]: I1009 08:54:04.981266 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zt825\" (UniqueName: \"kubernetes.io/projected/cfcc63ca-ac65-47a6-b0a8-c080ef0a779f-kube-api-access-zt825\") pod \"cfcc63ca-ac65-47a6-b0a8-c080ef0a779f\" (UID: \"cfcc63ca-ac65-47a6-b0a8-c080ef0a779f\") " Oct 09 08:54:04 crc kubenswrapper[4715]: I1009 08:54:04.982106 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/cfcc63ca-ac65-47a6-b0a8-c080ef0a779f-utilities" (OuterVolumeSpecName: "utilities") pod "cfcc63ca-ac65-47a6-b0a8-c080ef0a779f" (UID: "cfcc63ca-ac65-47a6-b0a8-c080ef0a779f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:54:04 crc kubenswrapper[4715]: I1009 08:54:04.990694 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfcc63ca-ac65-47a6-b0a8-c080ef0a779f-kube-api-access-zt825" (OuterVolumeSpecName: "kube-api-access-zt825") pod "cfcc63ca-ac65-47a6-b0a8-c080ef0a779f" (UID: "cfcc63ca-ac65-47a6-b0a8-c080ef0a779f"). InnerVolumeSpecName "kube-api-access-zt825". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:54:04 crc kubenswrapper[4715]: I1009 08:54:04.996124 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfcc63ca-ac65-47a6-b0a8-c080ef0a779f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cfcc63ca-ac65-47a6-b0a8-c080ef0a779f" (UID: "cfcc63ca-ac65-47a6-b0a8-c080ef0a779f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:54:05 crc kubenswrapper[4715]: I1009 08:54:05.084213 4715 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfcc63ca-ac65-47a6-b0a8-c080ef0a779f-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 08:54:05 crc kubenswrapper[4715]: I1009 08:54:05.084464 4715 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfcc63ca-ac65-47a6-b0a8-c080ef0a779f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 08:54:05 crc kubenswrapper[4715]: I1009 08:54:05.084557 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zt825\" (UniqueName: \"kubernetes.io/projected/cfcc63ca-ac65-47a6-b0a8-c080ef0a779f-kube-api-access-zt825\") on node \"crc\" DevicePath \"\"" Oct 09 08:54:05 crc kubenswrapper[4715]: I1009 08:54:05.490110 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w2rgf" event={"ID":"cfcc63ca-ac65-47a6-b0a8-c080ef0a779f","Type":"ContainerDied","Data":"1a10c8bd4592f659d8fcc8ea75deeb7110a5ddedca86983e25160136076d48a4"} Oct 09 08:54:05 crc kubenswrapper[4715]: I1009 08:54:05.490162 4715 scope.go:117] "RemoveContainer" containerID="252316ca8bb1939a3c0fb6c64dda04837f7c3c144ba147c4033e118a3f90e02d" Oct 09 08:54:05 crc kubenswrapper[4715]: I1009 08:54:05.490310 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w2rgf" Oct 09 08:54:05 crc kubenswrapper[4715]: I1009 08:54:05.520789 4715 scope.go:117] "RemoveContainer" containerID="619f4e852609c58cf0c12845a545a1831503529a7985778eef8469958805adf1" Oct 09 08:54:05 crc kubenswrapper[4715]: I1009 08:54:05.529691 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w2rgf"] Oct 09 08:54:05 crc kubenswrapper[4715]: I1009 08:54:05.539001 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-w2rgf"] Oct 09 08:54:05 crc kubenswrapper[4715]: I1009 08:54:05.552830 4715 scope.go:117] "RemoveContainer" containerID="83a851968a05949880899f196638a3b1440195d40a51703621616651b923eb8a" Oct 09 08:54:06 crc kubenswrapper[4715]: I1009 08:54:06.148477 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfcc63ca-ac65-47a6-b0a8-c080ef0a779f" path="/var/lib/kubelet/pods/cfcc63ca-ac65-47a6-b0a8-c080ef0a779f/volumes" Oct 09 08:54:06 crc kubenswrapper[4715]: I1009 08:54:06.503129 4715 generic.go:334] "Generic (PLEG): container finished" podID="6cd0920b-496d-4531-aeaa-ea492e0cdbb4" containerID="67b7fb61ac0127a891f825744cc58d7904a39efc631950b640c773eb30b1cf6c" exitCode=0 Oct 09 08:54:06 crc kubenswrapper[4715]: I1009 08:54:06.503216 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-49qll/must-gather-9bc7r" event={"ID":"6cd0920b-496d-4531-aeaa-ea492e0cdbb4","Type":"ContainerDied","Data":"67b7fb61ac0127a891f825744cc58d7904a39efc631950b640c773eb30b1cf6c"} Oct 09 08:54:06 crc kubenswrapper[4715]: I1009 08:54:06.504205 4715 scope.go:117] "RemoveContainer" containerID="67b7fb61ac0127a891f825744cc58d7904a39efc631950b640c773eb30b1cf6c" Oct 09 08:54:06 crc kubenswrapper[4715]: I1009 08:54:06.771975 4715 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-49qll_must-gather-9bc7r_6cd0920b-496d-4531-aeaa-ea492e0cdbb4/gather/0.log" Oct 09 08:54:10 crc kubenswrapper[4715]: I1009 08:54:10.147138 4715 scope.go:117] "RemoveContainer" containerID="3cecc05ede7162882b376373801e9d44deb7deec541f11babf1e9925da0c5cf7" Oct 09 08:54:10 crc kubenswrapper[4715]: E1009 08:54:10.147583 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:54:13 crc kubenswrapper[4715]: I1009 08:54:13.291165 4715 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lzm4h" Oct 09 08:54:13 crc kubenswrapper[4715]: I1009 08:54:13.346352 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lzm4h"] Oct 09 08:54:13 crc kubenswrapper[4715]: I1009 08:54:13.573358 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lzm4h" podUID="418433fa-2054-43e5-940e-bf3be92f1dad" containerName="registry-server" containerID="cri-o://17dd169ea427b9e81dbbf5b52871223ce1fc944f4e319c592bdb7b13eab7159e" gracePeriod=2 Oct 09 08:54:14 crc kubenswrapper[4715]: I1009 08:54:14.073852 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lzm4h" Oct 09 08:54:14 crc kubenswrapper[4715]: I1009 08:54:14.169276 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/418433fa-2054-43e5-940e-bf3be92f1dad-utilities\") pod \"418433fa-2054-43e5-940e-bf3be92f1dad\" (UID: \"418433fa-2054-43e5-940e-bf3be92f1dad\") " Oct 09 08:54:14 crc kubenswrapper[4715]: I1009 08:54:14.169345 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2crq\" (UniqueName: \"kubernetes.io/projected/418433fa-2054-43e5-940e-bf3be92f1dad-kube-api-access-j2crq\") pod \"418433fa-2054-43e5-940e-bf3be92f1dad\" (UID: \"418433fa-2054-43e5-940e-bf3be92f1dad\") " Oct 09 08:54:14 crc kubenswrapper[4715]: I1009 08:54:14.169459 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/418433fa-2054-43e5-940e-bf3be92f1dad-catalog-content\") pod \"418433fa-2054-43e5-940e-bf3be92f1dad\" (UID: \"418433fa-2054-43e5-940e-bf3be92f1dad\") " Oct 09 08:54:14 crc kubenswrapper[4715]: I1009 08:54:14.170290 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/418433fa-2054-43e5-940e-bf3be92f1dad-utilities" (OuterVolumeSpecName: "utilities") pod "418433fa-2054-43e5-940e-bf3be92f1dad" (UID: "418433fa-2054-43e5-940e-bf3be92f1dad"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:54:14 crc kubenswrapper[4715]: I1009 08:54:14.174982 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/418433fa-2054-43e5-940e-bf3be92f1dad-kube-api-access-j2crq" (OuterVolumeSpecName: "kube-api-access-j2crq") pod "418433fa-2054-43e5-940e-bf3be92f1dad" (UID: "418433fa-2054-43e5-940e-bf3be92f1dad"). InnerVolumeSpecName "kube-api-access-j2crq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:54:14 crc kubenswrapper[4715]: I1009 08:54:14.213213 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/418433fa-2054-43e5-940e-bf3be92f1dad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "418433fa-2054-43e5-940e-bf3be92f1dad" (UID: "418433fa-2054-43e5-940e-bf3be92f1dad"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:54:14 crc kubenswrapper[4715]: I1009 08:54:14.272161 4715 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/418433fa-2054-43e5-940e-bf3be92f1dad-utilities\") on node \"crc\" DevicePath \"\"" Oct 09 08:54:14 crc kubenswrapper[4715]: I1009 08:54:14.272225 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2crq\" (UniqueName: \"kubernetes.io/projected/418433fa-2054-43e5-940e-bf3be92f1dad-kube-api-access-j2crq\") on node \"crc\" DevicePath \"\"" Oct 09 08:54:14 crc kubenswrapper[4715]: I1009 08:54:14.272248 4715 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/418433fa-2054-43e5-940e-bf3be92f1dad-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 09 08:54:14 crc kubenswrapper[4715]: I1009 08:54:14.586974 4715 generic.go:334] "Generic (PLEG): container finished" podID="418433fa-2054-43e5-940e-bf3be92f1dad" containerID="17dd169ea427b9e81dbbf5b52871223ce1fc944f4e319c592bdb7b13eab7159e" exitCode=0 Oct 09 08:54:14 crc kubenswrapper[4715]: I1009 08:54:14.587049 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lzm4h" Oct 09 08:54:14 crc kubenswrapper[4715]: I1009 08:54:14.587097 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzm4h" event={"ID":"418433fa-2054-43e5-940e-bf3be92f1dad","Type":"ContainerDied","Data":"17dd169ea427b9e81dbbf5b52871223ce1fc944f4e319c592bdb7b13eab7159e"} Oct 09 08:54:14 crc kubenswrapper[4715]: I1009 08:54:14.587549 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzm4h" event={"ID":"418433fa-2054-43e5-940e-bf3be92f1dad","Type":"ContainerDied","Data":"846a7aff3836a0024faacf5dd086eb875c4fba5ecff78baf829a17d4594f2081"} Oct 09 08:54:14 crc kubenswrapper[4715]: I1009 08:54:14.587575 4715 scope.go:117] "RemoveContainer" containerID="17dd169ea427b9e81dbbf5b52871223ce1fc944f4e319c592bdb7b13eab7159e" Oct 09 08:54:14 crc kubenswrapper[4715]: I1009 08:54:14.615337 4715 scope.go:117] "RemoveContainer" containerID="1b7679de45994885b0b3ee3e7deb70a46aaceef228dc5706d67208184893482a" Oct 09 08:54:14 crc kubenswrapper[4715]: I1009 08:54:14.627119 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lzm4h"] Oct 09 08:54:14 crc kubenswrapper[4715]: I1009 08:54:14.634173 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lzm4h"] Oct 09 08:54:14 crc kubenswrapper[4715]: I1009 08:54:14.917045 4715 scope.go:117] "RemoveContainer" containerID="7294eae0ec8271d5dd899ab0d5dec2670d647626d15c2f96ec9ac36353e9dc10" Oct 09 08:54:15 crc kubenswrapper[4715]: I1009 08:54:15.052721 4715 scope.go:117] "RemoveContainer" containerID="17dd169ea427b9e81dbbf5b52871223ce1fc944f4e319c592bdb7b13eab7159e" Oct 09 08:54:15 crc kubenswrapper[4715]: E1009 08:54:15.053161 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"17dd169ea427b9e81dbbf5b52871223ce1fc944f4e319c592bdb7b13eab7159e\": container with ID starting with 17dd169ea427b9e81dbbf5b52871223ce1fc944f4e319c592bdb7b13eab7159e not found: ID does not exist" containerID="17dd169ea427b9e81dbbf5b52871223ce1fc944f4e319c592bdb7b13eab7159e" Oct 09 08:54:15 crc kubenswrapper[4715]: I1009 08:54:15.053204 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17dd169ea427b9e81dbbf5b52871223ce1fc944f4e319c592bdb7b13eab7159e"} err="failed to get container status \"17dd169ea427b9e81dbbf5b52871223ce1fc944f4e319c592bdb7b13eab7159e\": rpc error: code = NotFound desc = could not find container \"17dd169ea427b9e81dbbf5b52871223ce1fc944f4e319c592bdb7b13eab7159e\": container with ID starting with 17dd169ea427b9e81dbbf5b52871223ce1fc944f4e319c592bdb7b13eab7159e not found: ID does not exist" Oct 09 08:54:15 crc kubenswrapper[4715]: I1009 08:54:15.053230 4715 scope.go:117] "RemoveContainer" containerID="1b7679de45994885b0b3ee3e7deb70a46aaceef228dc5706d67208184893482a" Oct 09 08:54:15 crc kubenswrapper[4715]: E1009 08:54:15.053887 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b7679de45994885b0b3ee3e7deb70a46aaceef228dc5706d67208184893482a\": container with ID starting with 1b7679de45994885b0b3ee3e7deb70a46aaceef228dc5706d67208184893482a not found: ID does not exist" containerID="1b7679de45994885b0b3ee3e7deb70a46aaceef228dc5706d67208184893482a" Oct 09 08:54:15 crc kubenswrapper[4715]: I1009 08:54:15.053943 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b7679de45994885b0b3ee3e7deb70a46aaceef228dc5706d67208184893482a"} err="failed to get container status \"1b7679de45994885b0b3ee3e7deb70a46aaceef228dc5706d67208184893482a\": rpc error: code = NotFound desc = could not find container \"1b7679de45994885b0b3ee3e7deb70a46aaceef228dc5706d67208184893482a\": container with ID 
starting with 1b7679de45994885b0b3ee3e7deb70a46aaceef228dc5706d67208184893482a not found: ID does not exist" Oct 09 08:54:15 crc kubenswrapper[4715]: I1009 08:54:15.053965 4715 scope.go:117] "RemoveContainer" containerID="7294eae0ec8271d5dd899ab0d5dec2670d647626d15c2f96ec9ac36353e9dc10" Oct 09 08:54:15 crc kubenswrapper[4715]: E1009 08:54:15.054292 4715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7294eae0ec8271d5dd899ab0d5dec2670d647626d15c2f96ec9ac36353e9dc10\": container with ID starting with 7294eae0ec8271d5dd899ab0d5dec2670d647626d15c2f96ec9ac36353e9dc10 not found: ID does not exist" containerID="7294eae0ec8271d5dd899ab0d5dec2670d647626d15c2f96ec9ac36353e9dc10" Oct 09 08:54:15 crc kubenswrapper[4715]: I1009 08:54:15.054315 4715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7294eae0ec8271d5dd899ab0d5dec2670d647626d15c2f96ec9ac36353e9dc10"} err="failed to get container status \"7294eae0ec8271d5dd899ab0d5dec2670d647626d15c2f96ec9ac36353e9dc10\": rpc error: code = NotFound desc = could not find container \"7294eae0ec8271d5dd899ab0d5dec2670d647626d15c2f96ec9ac36353e9dc10\": container with ID starting with 7294eae0ec8271d5dd899ab0d5dec2670d647626d15c2f96ec9ac36353e9dc10 not found: ID does not exist" Oct 09 08:54:16 crc kubenswrapper[4715]: I1009 08:54:16.152659 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="418433fa-2054-43e5-940e-bf3be92f1dad" path="/var/lib/kubelet/pods/418433fa-2054-43e5-940e-bf3be92f1dad/volumes" Oct 09 08:54:16 crc kubenswrapper[4715]: I1009 08:54:16.442696 4715 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-49qll/must-gather-9bc7r"] Oct 09 08:54:16 crc kubenswrapper[4715]: I1009 08:54:16.442987 4715 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-49qll/must-gather-9bc7r" 
podUID="6cd0920b-496d-4531-aeaa-ea492e0cdbb4" containerName="copy" containerID="cri-o://aac398f92dd8f424b0c12d756d6385a67ee8f3c054e25091272b4d83f5058375" gracePeriod=2 Oct 09 08:54:16 crc kubenswrapper[4715]: I1009 08:54:16.450866 4715 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-49qll/must-gather-9bc7r"] Oct 09 08:54:16 crc kubenswrapper[4715]: I1009 08:54:16.620284 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-49qll_must-gather-9bc7r_6cd0920b-496d-4531-aeaa-ea492e0cdbb4/copy/0.log" Oct 09 08:54:16 crc kubenswrapper[4715]: I1009 08:54:16.621020 4715 generic.go:334] "Generic (PLEG): container finished" podID="6cd0920b-496d-4531-aeaa-ea492e0cdbb4" containerID="aac398f92dd8f424b0c12d756d6385a67ee8f3c054e25091272b4d83f5058375" exitCode=143 Oct 09 08:54:16 crc kubenswrapper[4715]: I1009 08:54:16.896990 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-49qll_must-gather-9bc7r_6cd0920b-496d-4531-aeaa-ea492e0cdbb4/copy/0.log" Oct 09 08:54:16 crc kubenswrapper[4715]: I1009 08:54:16.897949 4715 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-49qll/must-gather-9bc7r" Oct 09 08:54:17 crc kubenswrapper[4715]: I1009 08:54:17.020860 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dc754\" (UniqueName: \"kubernetes.io/projected/6cd0920b-496d-4531-aeaa-ea492e0cdbb4-kube-api-access-dc754\") pod \"6cd0920b-496d-4531-aeaa-ea492e0cdbb4\" (UID: \"6cd0920b-496d-4531-aeaa-ea492e0cdbb4\") " Oct 09 08:54:17 crc kubenswrapper[4715]: I1009 08:54:17.020918 4715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6cd0920b-496d-4531-aeaa-ea492e0cdbb4-must-gather-output\") pod \"6cd0920b-496d-4531-aeaa-ea492e0cdbb4\" (UID: \"6cd0920b-496d-4531-aeaa-ea492e0cdbb4\") " Oct 09 08:54:17 crc kubenswrapper[4715]: I1009 08:54:17.026085 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cd0920b-496d-4531-aeaa-ea492e0cdbb4-kube-api-access-dc754" (OuterVolumeSpecName: "kube-api-access-dc754") pod "6cd0920b-496d-4531-aeaa-ea492e0cdbb4" (UID: "6cd0920b-496d-4531-aeaa-ea492e0cdbb4"). InnerVolumeSpecName "kube-api-access-dc754". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 09 08:54:17 crc kubenswrapper[4715]: I1009 08:54:17.122759 4715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dc754\" (UniqueName: \"kubernetes.io/projected/6cd0920b-496d-4531-aeaa-ea492e0cdbb4-kube-api-access-dc754\") on node \"crc\" DevicePath \"\"" Oct 09 08:54:17 crc kubenswrapper[4715]: I1009 08:54:17.171197 4715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cd0920b-496d-4531-aeaa-ea492e0cdbb4-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "6cd0920b-496d-4531-aeaa-ea492e0cdbb4" (UID: "6cd0920b-496d-4531-aeaa-ea492e0cdbb4"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 09 08:54:17 crc kubenswrapper[4715]: I1009 08:54:17.224748 4715 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6cd0920b-496d-4531-aeaa-ea492e0cdbb4-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 09 08:54:17 crc kubenswrapper[4715]: I1009 08:54:17.633718 4715 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-49qll_must-gather-9bc7r_6cd0920b-496d-4531-aeaa-ea492e0cdbb4/copy/0.log" Oct 09 08:54:17 crc kubenswrapper[4715]: I1009 08:54:17.634195 4715 scope.go:117] "RemoveContainer" containerID="aac398f92dd8f424b0c12d756d6385a67ee8f3c054e25091272b4d83f5058375" Oct 09 08:54:17 crc kubenswrapper[4715]: I1009 08:54:17.634431 4715 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-49qll/must-gather-9bc7r" Oct 09 08:54:17 crc kubenswrapper[4715]: I1009 08:54:17.655207 4715 scope.go:117] "RemoveContainer" containerID="67b7fb61ac0127a891f825744cc58d7904a39efc631950b640c773eb30b1cf6c" Oct 09 08:54:18 crc kubenswrapper[4715]: I1009 08:54:18.149642 4715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cd0920b-496d-4531-aeaa-ea492e0cdbb4" path="/var/lib/kubelet/pods/6cd0920b-496d-4531-aeaa-ea492e0cdbb4/volumes" Oct 09 08:54:25 crc kubenswrapper[4715]: I1009 08:54:25.137409 4715 scope.go:117] "RemoveContainer" containerID="3cecc05ede7162882b376373801e9d44deb7deec541f11babf1e9925da0c5cf7" Oct 09 08:54:25 crc kubenswrapper[4715]: E1009 08:54:25.138296 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" 
podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:54:38 crc kubenswrapper[4715]: I1009 08:54:38.136751 4715 scope.go:117] "RemoveContainer" containerID="3cecc05ede7162882b376373801e9d44deb7deec541f11babf1e9925da0c5cf7" Oct 09 08:54:38 crc kubenswrapper[4715]: E1009 08:54:38.137769 4715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7vwx_openshift-machine-config-operator(acafd807-8875-4b4f-aba9-4f807ca336e7)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" podUID="acafd807-8875-4b4f-aba9-4f807ca336e7" Oct 09 08:54:53 crc kubenswrapper[4715]: I1009 08:54:53.136567 4715 scope.go:117] "RemoveContainer" containerID="3cecc05ede7162882b376373801e9d44deb7deec541f11babf1e9925da0c5cf7" Oct 09 08:54:53 crc kubenswrapper[4715]: I1009 08:54:53.955851 4715 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7vwx" event={"ID":"acafd807-8875-4b4f-aba9-4f807ca336e7","Type":"ContainerStarted","Data":"925146068c07f7c2a2fce766f3646cca1a7434a272afc836e7fe9adc92b8e573"}